I have been trying to find a way to do multi-team, mostly (or entirely) automated Sprint metrics for a long time now. I had a process that mostly worked for Jira Server/DC:
(sprint = "[SPRINT NAME]" or issueFunction in removedAfterSprintStart("[BOARD NAME]","[SPRINT NAME]")) and issueFunction not in addedAfterSprintStart("[BOARD NAME]","[SPRINT NAME]") and issueFunction in aggregateExpression("Total Story Points", "StorySize.sum()")
I recently joined an organization which uses Cloud, and as it turns out, some of the key functions I relied on (namely "aggregateExpression", "removedAfterSprintStart", "incompleteInSprint", and "completeInSprint") don't exist because they're not possible via the API.
I could probably muddle through without "aggregateExpression", but retrieving the initial sprint commitment is impossible without "removedAfterSprintStart". This is kind of a key metric, so not collecting it is not really an option.
Has anyone found an add-on which provides these metrics in a queryable/exportable manner? It feels like the only solution at this point would be to pull an "initial sprint commit" and perform some kind of diff analysis.
I'd like to avoid having my scrum leads manually read the sprint report. That report has bugs, and beyond that, having humans read one report just so they can input numbers into another report is a tremendous waste of their potential.
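For what it's worth, the diff-analysis idea can be sketched fairly simply once you have two snapshots of the sprint's issues, e.g. one exported at sprint start and one at close (the Jira Cloud Agile REST API's sprint issue endpoint can supply these). The snapshot format, issue keys, and point values below are all illustrative assumptions, not a real export:

```python
# Minimal sketch of the diff analysis, assuming each snapshot is a dict
# of {issue_key: story_points} captured at sprint start and sprint close.
# Issue keys and values below are made up for illustration.

def sprint_diff(committed, closed):
    """Compare a sprint-start snapshot against a sprint-close snapshot."""
    committed_keys = set(committed)
    closed_keys = set(closed)
    return {
        # Initial commitment: total points at sprint start
        "committed_points": sum(committed.values()),
        # Scope added after sprint start
        "added": sorted(closed_keys - committed_keys),
        # Scope removed after sprint start
        "removed": sorted(committed_keys - closed_keys),
        # Issues present at both start and close
        "carried": sorted(committed_keys & closed_keys),
    }

start = {"PROJ-1": 3, "PROJ-2": 5, "PROJ-3": 8}
end = {"PROJ-1": 3, "PROJ-3": 8, "PROJ-4": 2}

print(sprint_diff(start, end))
```

This only reconstructs what `removedAfterSprintStart`/`addedAfterSprintStart` gave you for free on Server/DC; the hard part is scheduling the snapshot capture reliably at sprint start.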
Hi @Haddon Fisher ,
(Disclaimer: Vendor also!)
Further to Dan's response, another 3rd party alternative for consideration is Umano's Team Performance Summary.
Drawing from Jira, Confluence, Bitbucket, GitHub, GitLab, Slack (and others), Umano provides plug-and-play, end-to-end optics and guidance into how teams Design, Build, Review and Collaborate, through a per-sprint or over-time lens.
Each team's project space visualises daily performance insights from up to 25 agile metrics (depending on how broadly you choose to view) for your Sprint or Kanban iterations. You can also zoom out to view performance charts for each metric over time, as well as receive context from Umano's rolling performance benchmarks and call-outs on key performance drivers.
Happy to help if you have any further questions. Good luck with automating this process and getting time back to doing what you love!
Chris
Disclaimer: marketplace partner!
Hey @Haddon Fisher -
If you are open to 3rd party tools, this is something we are solving for with our Sprint Insights report in minware. We try to make things straightforward so that anyone on your team can see what happened during a sprint and hold a data-driven sprint retrospective.
There is too much for me to show here, but the 4 cornerstones of the report are:
Summary: High-level overview of whether or not the team met its sprint commitment:
Notifications: Anti-patterns that are likely to have underlying issues:
Ticket Flow: Understand work-in-progress / impact of tickets added/removed from sprint:
Value Stream: See where time was spent on the way to completing the work:
After those layers are peeled away, we get down to the time spent coding. minware is uniquely able to isolate merged vs. unmerged code to understand which tasks actually made it across the finish line:
We can further expand each task to drill down into the pull requests and commits that went into the work or click out to version control to review.
Let me know if we can help!
Interesting - I'll take a look. Thanks!!!
@Haddon Fisher cheers. Ping me if you end up launching a free trial!