Picture the Sprint Retro.
Someone asks: “So… why didn’t we finish what we committed to?”
Someone else replies: “Because things happened.”
And then the room goes quiet while everyone collectively tries to remember which issues were added mid-sprint, which were quietly removed, and which are carrying over yet again.
In Jira, this is where the ritual begins: exporting, filtering, squinting at history tabs, and politely arguing over what “scope change” really means.
Sprint metrics are only useful when you can answer the follow-up question:
“Cool number. What issues created that number?”
Because “Scope change: +18%” is interesting… but it’s not actionable until you can point to the specific issues behind that +18%.
And yes—this matters even more when stakeholders show up with Questions.
Jira’s native Sprint Report does give you one very helpful hint: issues added after the sprint started are marked with an asterisk (*).
That asterisk is basically Jira’s way of saying: “This wasn’t in the original plan. Don’t blame the burndown.”
But here’s the catch:
✅ Visible in Jira Sprint Report: added items (kind of). You can spot them via the asterisk.
❌ Not easily visible as a list: removed items and carryover items.
Teams constantly ask the community how to report “what was added mid-sprint” or track scope changes cleanly—because it’s not straightforward to extract a reliable list without workarounds (labels, manual tracking, copying keys into JQL, etc.).
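To make “workarounds” concrete, here is a minimal, illustrative sketch of the DIY route (not how Time in Status works internally), assuming a Jira Cloud site, basic auth with an API token, and a placeholder sprint id. It pulls the sprint’s issues from the documented Agile REST API and checks each issue’s changelog for when the Sprint field was set, treating anything set after the sprint start date as “added mid-sprint.”

```python
# Illustrative sketch only: list issues added to a sprint after it started,
# using Jira Cloud's documented REST APIs. Site URL, credentials, and sprint
# id below are placeholders.
from datetime import datetime

import requests

BASE_URL = "https://your-domain.atlassian.net"  # placeholder Jira Cloud site
AUTH = ("you@example.com", "your-api-token")    # email + API token (basic auth)
SPRINT_ID = 123                                 # placeholder sprint id


def parse_jira_date(value: str) -> datetime:
    # Jira timestamps look like "2024-05-01T10:00:00.000+0300"
    return datetime.strptime(value, "%Y-%m-%dT%H:%M:%S.%f%z")


def sprint_start(sprint_id: int) -> datetime:
    resp = requests.get(f"{BASE_URL}/rest/agile/1.0/sprint/{sprint_id}", auth=AUTH)
    resp.raise_for_status()
    return parse_jira_date(resp.json()["startDate"])


def sprint_issue_keys(sprint_id: int) -> list[str]:
    # Page through every issue currently in the sprint.
    keys, start_at = [], 0
    while True:
        resp = requests.get(
            f"{BASE_URL}/rest/agile/1.0/sprint/{sprint_id}/issue",
            params={"fields": "summary", "startAt": start_at, "maxResults": 50},
            auth=AUTH,
        )
        resp.raise_for_status()
        page = resp.json()
        keys.extend(issue["key"] for issue in page["issues"])
        start_at += len(page["issues"])
        if not page["issues"] or start_at >= page["total"]:
            return keys


def added_after_start(key: str, sprint_id: int, started: datetime) -> bool:
    # Walk the issue changelog and find when the Sprint field was set to this
    # sprint; if that happened after the sprint started, it was added mid-sprint.
    resp = requests.get(
        f"{BASE_URL}/rest/api/2/issue/{key}",
        params={"expand": "changelog", "fields": "summary"},
        auth=AUTH,
    )
    resp.raise_for_status()
    for history in resp.json()["changelog"]["histories"]:
        for item in history["items"]:
            if item["field"] != "Sprint":
                continue
            target_ids = [s.strip() for s in (item.get("to") or "").split(",")]
            if str(sprint_id) in target_ids:
                return parse_jira_date(history["created"]) > started
    return False  # no Sprint change found (e.g. created directly in the sprint)


if __name__ == "__main__":
    started = sprint_start(SPRINT_ID)
    added = [k for k in sprint_issue_keys(SPRINT_ID)
             if added_after_start(k, SPRINT_ID, started)]
    print("Added mid-sprint:", ", ".join(added) or "none")
```

Even this sketch misses edge cases (issues created straight into a running sprint, items removed and re-added, long changelogs that get truncated), which is exactly why most teams would rather have the list handed to them in a report.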
So Jira gives you a signal… but not the receipts. And in retros, the receipts are everything.
When you can’t clearly trace added, removed, and carryover items, sprint reviews turn into debates over whose memory of the sprint is correct.
Without a clear work-item-level breakdown, you end up optimizing for storytelling, not truth.
And the sprint becomes less “inspect & adapt” and more “guess & defend.”
Time in Status includes a Sprint Performance Report built specifically to make sprint analysis more complete than Jira’s native view.
It breaks your sprint down across the core dimensions teams actually need in retros and planning: committed, completed, added, removed, carryover, and workload per assignee.
It also respects how your board estimates work—Story Points, Original Time, or Work Item Count—so teams aren’t forced into a reporting model they don’t use.
So far, so good. But here’s what changed the game:
We’ve released Sprint metrics details: a “View data table” option that opens a detailed table behind each metric card.
In human terms, it means: You’re no longer stuck with metrics as headlines. You can open the full article.
When you open a metric’s table, you can quickly see every work item behind that number, along with its estimation value.
For Committed and Completed, you also get a Total row that sums the estimation column—so you can reconcile the headline number with the underlying issues in seconds.
Now, instead of reporting “Scope change: +18%” and stopping there, you can name the exact issues that produced it.
That enables the kind of retro conversations that actually improve planning:
1) Added: name the interruptions
Stop guessing whether the sprint derailed because of “interruptions.”
Open the table and name them.
2) Removed: capture why work left the sprint
Removed issues often disappear from the narrative, and later come back as “why is this still not done?” With the list visible, you can capture the real reason each item was pulled.
3) Carryover: make it a pattern you can fix
Carryover is one of those numbers that causes immediate emotions. But the real value is being able to answer whether it’s a one-off or a recurring pattern, and which items keep slipping.
Time in Status defines carryover as incomplete work moved to the next sprint (for completed sprints), and now you can trace it item-by-item.
4) Workload: see who absorbed the changes
Workload is where scope change becomes personal.
A sprint can look “fine” at the team level and still be chaotic for one person who got handed every mid-sprint fire.
The Workload section is explicitly built around committed/added/removed per assignee.
With metric details, you can see the actual issues that created that workload—so you can redistribute smarter next sprint.
Sprint metrics aren’t supposed to be decorative.
They’re supposed to help you plan more realistically, catch scope changes while they’re happening, and fix the patterns behind carryover.
And that only happens when sprint reporting includes traceability, not just totals.
If sprint reporting in your team currently involves phrases like “because things happened” or “let me export it and check,” then it’s probably time to switch from asterisk archaeology to actual visibility.
You can try Time in Status by SaaSJet (trial via Atlassian Marketplace), or book a demo call to see the Sprint Report + metric tables in your own workflow.
Iryna Komarnitska_SaaSJet_