Our project teams work in sprints. At the end of each sprint they release the code after a show-and-tell session, set the stories to the status "UAT released", and pass the released build to the client. The client then completes UAT testing and sets the stories to Closed once they are satisfied that the acceptance criteria have been met. We work like this so that we can see whether or not our clients are doing regular and timely UAT, and so that we don't get a bow wave of bugs at the end of the project.
Using the new Rapid Boards on GreenHopper 6.2.3 you can only have one active sprint, but with our working model we essentially need overlapping sprints, so how could we achieve this? I have seen posts about creating separate boards for two teams running in parallel, but that does not really suit us. I also see that we could enable the GreenHopper Labs parallel sprints capability, but I don't think that would be acceptable to the IT department.
The obvious solution would be to revert to Classic Boards, but the project managers like the planning view of the Rapid Boards.
Any thoughts gratefully received.
Unfortunately, those are the only ways to run multiple sprints. Is there a reason multiple boards wouldn't work? I would recommend both turning on parallel sprints and using different boards for different teams where possible, as this gives you much more flexibility.
The parallel sprints option has been around for a while and is a very stable feature; the reason it is in Labs isn't stability, but that specifics around estimation and reporting are still in development and could change.
My team ran into a similar issue. What we've done is adjust our workflow so that at the end of the sprint we set the status of the issues to "Ready for Test" with a resolution of "Fixed". We have another status called "Confirmed".
We then made a Kanban board for our "client". It has columns for "Ready for Test" and "Confirmed", plus a column named "Attention Required", which we use to house issues that have been rejected or require more information. This way, the client can monitor the Kanban board to get a list of issues to test and, once tested, drag them into the "Confirmed" column. Once confirmed, and our code is published to a staging site, we set the status of the confirmed issues to "Closed".
This allows our dev team to work in sprints where we monitor velocity, and provides the client with a list of to-dos on the Kanban board. I'm not sure if anyone else does this, but so far it's working for us. Though, I have to admit, we've only just started doing this, so it's a work in progress at the moment.
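To make the client-facing board concrete, here is a minimal sketch of how its columns could map to issue statuses. The column and status names are taken from this thread; the status names behind "Attention Required" (shown here as "Rejected" and "Needs Info") and the mapping itself are assumptions for illustration, not a documented board configuration.

```python
# Hypothetical column-to-status mapping for the client Kanban board
# described above. "Rejected" and "Needs Info" are assumed status names.
COLUMNS = {
    "Ready for Test": ["Ready for Test"],              # awaiting client UAT
    "Attention Required": ["Rejected", "Needs Info"],  # needs dev follow-up
    "Confirmed": ["Confirmed"],                        # client has accepted
}

def bucket_issues(issues):
    """Group (issue key, status) pairs into the board's columns."""
    board = {column: [] for column in COLUMNS}
    for key, status in issues:
        for column, statuses in COLUMNS.items():
            if status in statuses:
                board[column].append(key)
    return board

issues = [
    ("PROJ-1", "Ready for Test"),
    ("PROJ-2", "Confirmed"),
    ("PROJ-3", "Rejected"),
]
print(bucket_issues(issues))
# {'Ready for Test': ['PROJ-1'], 'Attention Required': ['PROJ-3'],
#  'Confirmed': ['PROJ-2']}
```

In JIRA itself this mapping lives in the board's column configuration rather than in code; the sketch just shows the bucketing logic the board performs.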
Mark, can you clarify the way you work? As far as I can interpret it, you have one workflow and one project but two boards, one for team use and one for client use, so the issues transition from one board to the other to complete the issue lifecycle. Is that correct?
So do you close the sprint for the issues that move out of the internal board to the client board, given that the issues are not yet closed from a JIRA perspective, in order to calculate velocity? Or do you just calculate velocity outside of JIRA and not worry about closing the issues as they leave the team board?
Thanks Dave. There is no technical reason why we can't use multiple boards, other than that the teams don't want them; they feel it would introduce confusion and chaos to the projects.
I have now convinced the IT team to install the parallel sprints feature, and early indication is that it suits their needs and is indeed a stable feature with no adverse effects.
Your understanding is correct. Let me add this: our dev team measures velocity based on our sprints. We have a status called "Completed" (which is not the same as Closed) that we use to signal that a story/defect has been finished by the development team. It's the rightmost column on the task board.
Once the sprint is completed, we publish our code to an area where the client can test it. At that point, we transition the issues to "Ready for Test". The client then tests them and transitions them to "Confirmed". It isn't until we publish these issues live that we mark them as "Closed". The client can use standard JIRA displays for this, but we also created a Kanban board so they can easily visualize the status of the issues.
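The lifecycle described above can be sketched as a small transition table. The status names come from this thread; treating them as a strict state machine (including "In Progress" as the starting status and a rejection path back to the dev team) is my assumption, not a documented JIRA workflow.

```python
# Assumed issue lifecycle, based on the statuses described in this thread.
TRANSITIONS = {
    "In Progress": {"Completed"},      # dev finishes the story in the sprint
    "Completed": {"Ready for Test"},   # code published to the client test area
    "Ready for Test": {"Confirmed", "In Progress"},  # client accepts or rejects
    "Confirmed": {"Closed"},           # issue is published live
    "Closed": set(),                   # terminal status
}

def transition(status, target):
    """Return the new status, or raise if the move is not allowed."""
    if target not in TRANSITIONS.get(status, set()):
        raise ValueError(f"cannot move from {status!r} to {target!r}")
    return target

status = "Completed"
status = transition(status, "Ready for Test")
status = transition(status, "Confirmed")
status = transition(status, "Closed")
print(status)  # Closed
```

In practice these rules would be enforced by the JIRA workflow editor rather than by code; the sketch just makes the allowed moves explicit.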
This is probably not a standard way of doing things, but it has worked for us in the past, though that was prior to using JIRA with GreenHopper. We are just starting to use this workflow in JIRA, so it remains to be seen how well it will work out.