How Do You Do Sprint Metrics?

Haddon Fisher
Rising Star
August 9, 2022

I have been trying to find a way to do multi-team, mostly (or entirely) automated sprint metrics for a long time now. I had a process that mostly worked on Jira Server/DC:

  1. I built a Google Sheet with a set of pre-defined queries, one for each metric I wanted to collect. For example, to collect the initial commitment of a sprint I had:
    (sprint = "[SPRINT NAME]" or issuefunction in removedAfterSprintStart("[BOARD NAME]","[SPRINT NAME]")) and issuefunction not in addedAfterSprintStart("[BOARD NAME]","[SPRINT NAME]") and issueFunction in aggregateExpression("Total Story Points", "StorySize.sum()")
  2. Each team had their own tab of the query sheet, at the top of which they would define their board and sprint. Some logic would replace "[SPRINT NAME]" and "[BOARD NAME]" with the correct values and output the customized list of queries for their team (a minimal sketch of this substitution is shown after this list).
  3. At the end of each sprint, each team would update their "sprint name" field and run the set of queries, putting each result into a Google Form.
  4. The output of the Google Form was another Google Sheet, which had all sorts of analytics and aggregation logic built into it.
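
For what it's worth, the "logic" in step 2 was just string substitution over the query templates. Here is a minimal sketch of the same idea in Python (the real version lived in Google Sheets formulas, and the metric, board, and sprint names below are placeholders, not real data):

    # Minimal sketch of the step-2 substitution; values are placeholders.
    QUERY_TEMPLATES = {
        "initial_commitment": (
            '(sprint = "[SPRINT NAME]" or issuefunction in '
            'removedAfterSprintStart("[BOARD NAME]","[SPRINT NAME]")) and '
            'issuefunction not in addedAfterSprintStart("[BOARD NAME]","[SPRINT NAME]") and '
            'issueFunction in aggregateExpression("Total Story Points", "StorySize.sum()")'
        ),
        # ...one entry per metric the sheet collected
    }

    def queries_for_team(board_name: str, sprint_name: str) -> dict:
        """Swap a team's board and sprint into every query template."""
        return {
            metric: template.replace("[BOARD NAME]", board_name)
                            .replace("[SPRINT NAME]", sprint_name)
            for metric, template in QUERY_TEMPLATES.items()
        }

    # e.g. queries_for_team("Platform board", "Platform Sprint 42")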

I recently joined an organization which uses Jira Cloud, and as it turns out, some of the key functions I relied on (namely "aggregateExpression", "removedAfterSprintStart", "incompleteInSprint", and "completeInSprint") don't exist there because they're not possible via the Cloud API.

I could probably muddle through without "aggregateExpression", but retrieving the initial sprint commitment is impossible without "removedAfterSprintStart". This is kind of a key metric, so not collecting it is not really an option.

Has anyone found an add-on which provides these metrics in a queryable/exportable manner? It feels like the only solution at this point would be to pull an "initial sprint commit" snapshot when the sprint starts and perform some kind of diff analysis against it at the end (roughly what the sketch below does).
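
In case it helps anyone hitting the same wall, here is a minimal sketch of that snapshot-and-diff idea against the Jira Cloud Agile REST API: capture the sprint's contents right after it starts, then diff against the contents at close and sum the story points client-side (which also covers the part "aggregateExpression" used to do for me). The GET /rest/agile/1.0/sprint/{sprintId}/issue endpoint is standard; the site URL, credentials, story-points field ID, and file handling are assumptions you would need to adapt:

    # Sketch only: assumes basic auth with an API token and that story points
    # live in customfield_10016 (the field ID varies from site to site).
    import json
    import requests

    BASE = "https://your-site.atlassian.net"    # assumption: your Cloud site URL
    AUTH = ("you@example.com", "API_TOKEN")     # assumption: email + API token
    STORY_POINTS = "customfield_10016"          # assumption: check your field ID

    def sprint_issues(sprint_id: int) -> dict:
        """Return {issue key: story points} for everything currently in the sprint."""
        issues, start = {}, 0
        while True:
            resp = requests.get(
                f"{BASE}/rest/agile/1.0/sprint/{sprint_id}/issue",
                params={"startAt": start, "maxResults": 50, "fields": STORY_POINTS},
                auth=AUTH,
            )
            resp.raise_for_status()
            page = resp.json()
            for issue in page["issues"]:
                issues[issue["key"]] = issue["fields"].get(STORY_POINTS) or 0
            start += len(page["issues"])
            if start >= page["total"]:
                return issues

    def snapshot_commitment(sprint_id: int, path: str) -> None:
        """Run once right after sprint start to capture the initial commitment."""
        with open(path, "w") as f:
            json.dump(sprint_issues(sprint_id), f)

    def sprint_diff(sprint_id: int, path: str) -> None:
        """Run at sprint close to compare against the initial snapshot."""
        with open(path) as f:
            initial = json.load(f)
        final = sprint_issues(sprint_id)
        print("initial commitment (points):", sum(initial.values()))
        print("added after start:", sorted(set(final) - set(initial)))
        print("removed after start:", sorted(set(initial) - set(final)))

The obvious downside is that this only works going forward, since you need the snapshot taken at sprint start, but I don't see a way around that without the ScriptRunner functions.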

I'd like to avoid having my scrum leads manually read the sprint report; that report has bugs in it, and besides, having humans read one report just so they can type numbers into another is a tremendous waste of potential.

2 answers

0 votes
Chris Boys February 2, 2023

Hi @Haddon Fisher ,

(Disclaimer: Vendor also!)

Further to Dan's response, another 3rd party alternative for consideration is Umano's Team Performance Summary.

Drawing from Jira, Confluence, Bitbucket, GitHub, GitLab, Slack (and others), Umano provides plug-and-play, end-to-end optics and guidance into how teams Design, Build, Review and Collaborate, through a per-sprint or over-time lens.

Each team's project space visualises daily performance insights from up to 25 agile metrics (depending on how broad a view you choose) for your Sprint or Kanban iterations. You can also zoom out to view performance charts for each metric over time, as well as receive context from Umano's rolling performance benchmarks and call-outs on key performance drivers.

Happy to help if you have any further questions.  Good luck with automating this process and getting time back to doing what you love!

Chris

0 votes
Dan -minware-
Rising Star
October 31, 2022

Disclaimer: marketplace partner!

Hey @Haddon Fisher -

If you are open to 3rd party tools, this is something we are solving for with our Sprint Insights report in minware. We try to make things straightforward so that anyone on your team can see what happened during a sprint and hold a data-driven sprint retrospective.

There is too much to show here, but the four cornerstones of the report are:

Summary: A high-level overview of whether or not the team met its sprint commitment:

[screenshot: miwnare_overview.png]

Notifications: Anti-patterns that are likely to have underlying issues:

[screenshot: minware_notifications_new.png]

Ticket Flow: Understand work in progress and the impact of tickets added to or removed from the sprint:

[screenshot: miwnare_ticket_flow_added.png]

Value Stream: See where time was spent on the way to completing the work:

[screenshot: minware_value_stream_expanded.png]

After those layers are peeled away, we get down to the time spent coding. minware is uniquely able to isolate merged and unmerged code to understand which tasks actually made it across the finish line:

[screenshot: minware_coding_eventypes.png]

We can further expand each task to drill down into the pull requests and commits that went into the work or click out to version control to review. 

Let me know if we can help!

Haddon Fisher
Rising Star
February 2, 2023

Interesting - I'll take a look. Thanks!!!

Dan -minware-
Rising Star
February 3, 2023

@Haddon Fisher cheers. Ping me if you end up launching a free trial!
