For us, an issue isn't Done until it's passed QA, and QA likes to validate an issue in production before they mark it as Done. This leads to sprint burndown charts that plateau until the end of the sprint, then resolve all at once. And it means the development team doesn't have a simple visual sense of how they're progressing towards the sprint goal.
I think Burndown keys off the last column in the agile board, right? Only when an item is Resolved does it affect the burndown. And philosophically at least, I think this is correct for our culture: we want everyone thinking that an item isn't really done until it passes QA.
But at the same time, I'd love to have a burndown that actually burns. Our current system could be hiding information from us: I don't think there's an easy way to track how regularly the development team is moving items from InProgress to InQA, and we'd all like to know that.
Is there a way to track status changes? If we start tracking Time Spent will the Burndown Chart show progress towards a sprint goal? Is there a better/different report I should be using?
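For what it's worth, JIRA does record every status transition in each issue's changelog, which the REST API exposes via `GET /rest/api/2/issue/{key}?expand=changelog`. A minimal sketch of pulling the InProgress → InQA moves out of a parsed changelog (the sample payload below is illustrative, but matches the shape the API returns):

```python
from datetime import datetime

def status_transitions(changelog):
    """Extract (timestamp, from_status, to_status) tuples from a JIRA
    issue changelog (as returned with expand=changelog)."""
    transitions = []
    for history in changelog.get("histories", []):
        # Timestamps look like "2016-05-02T09:15:00.000+0000"; parse the date/time part.
        when = datetime.strptime(history["created"][:19], "%Y-%m-%dT%H:%M:%S")
        for item in history.get("items", []):
            if item.get("field") == "status":
                transitions.append((when, item["fromString"], item["toString"]))
    return sorted(transitions)

# Illustrative changelog fragment:
changelog = {
    "histories": [
        {"created": "2016-05-02T09:15:00.000+0000",
         "items": [{"field": "status", "fromString": "To Do", "toString": "InProgress"}]},
        {"created": "2016-05-04T16:40:00.000+0000",
         "items": [{"field": "status", "fromString": "InProgress", "toString": "InQA"}]},
    ],
}

for when, src, dst in status_transitions(changelog):
    print(when.date(), src, "->", dst)
```

So even without a plugin, a small script could report how regularly items are moving from InProgress to InQA during the sprint.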
Thanks so much! I've spent 2 hours reading in the forum here and learned a ton, but figured I'd ask my question straight out.
First of all, I've added the Agile tag, as this is certainly a discussion about process and not just about the tools you use.
So from what I've read, you would like a burndown that burns down once an issue has been deployed to production (before QA has verified it). But on the other hand, you need to keep your "official" burndown, where it only burns down once an issue has been verified by QA. :-)
On the process side, as a Scrum Master, I'd suggest you book a room and brainstorm with your team about your definition of done. When is your issue really done? Once a user can start using it in production? Or once QA has verified it?
Usually the answer is the latter, as QA typically verifies issues on a Test or Staging environment. The way it works in my team is pretty similar to your process: we have a bunch of automated tests that cover the most important features before we deploy to production. After the deploy, our QA does a brief check on production and reports any bugs found. Those bugs don't reopen the features, as the features were delivered correctly according to our Definition of Done.
Important to note: we have a Bugs First approach, in which bugs automatically get added to our sprint / the top of the board, and we pick them up ASAP. That way, we don't have the burndown problem. :-)
On the tooling side, JIRA (Software) hasn't been built to support your use case, but you could use it in a way that kind of supports it.
You could create a second scrum board which is almost identical to your current board but just leave out the QA column(s). Make sure the "Deployed to Production" state is the final state in the new board. This way, the new board will give you a burn down that actually burns down once things are deployed to production while keeping the "official" burndown untouched.
Hope this helps, feel free to respond with any feedback or extra questions you might have.
Man, that was a quick response. You're right on both points. We do all agree that Done = passed QA, and I think our teams are similar: QA does a brief validation after a production deploy, but the heavy lifting is pre-deploy.
And secondly, I think you're correct that JIRA's not structured to give me both things I want: an acceptable definition of done which includes QA, and a way to track issues' status changes throughout a sprint.
In a quest to have my cake and eat it too, I searched the Atlassian Marketplace and came across this gadget: https://marketplace.atlassian.com/plugins/com.artezio.artjiraboard.jira-burndown/server/installation. Going to try and implement it tomorrow. If I understand the pitch correctly, it lets me create a separate burndown based on Status (as opposed to Resolution, on which the JIRA burndown depends). That would mean I could create a burndown for items moving from InProgress to InQA, which would show the development team how they were doing throughout the sprint.
I guess I could go crazy as well and create a separate burndown showing items moving from InQA to a new status called ReadyForRelease, or something, which would show the QA team their own burndown.
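If the gadget doesn't pan out, a Status-based burndown like this is also easy to compute by hand from the changelog transition timestamps. A rough sketch (the data and the `InQA` target are hypothetical stand-ins for whichever status you burn against):

```python
from datetime import date

def status_burndown(total_items, transitions, target="InQA"):
    """Given (day, to_status) transition events, return per-event
    remaining counts: an item 'burns down' once it reaches target."""
    remaining = total_items
    burndown = []
    for day, to_status in sorted(transitions):
        if to_status == target:
            remaining -= 1
        burndown.append((day, remaining))
    return burndown

# Hypothetical sprint of 4 items:
events = [
    (date(2016, 5, 2), "InProgress"),
    (date(2016, 5, 4), "InQA"),
    (date(2016, 5, 5), "InQA"),
    (date(2016, 5, 9), "InQA"),
]
print(status_burndown(4, events))
```

The same function with `target="ReadyForRelease"` would produce the QA team's own burndown.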
Anyway. I'm full of hope. Hoping also that I don't have to create and manage a second scrum board, but I'm not opposed to that. That'll be plan B.
Thanks again for responding!