If you use Jira for Scrum, you already know the drill before every sprint review: open the burndown tab, screenshot the velocity chart from another page, manually check who completed what by filtering through issues one by one, and hope someone remembered to export the time logs. That’s not a reporting workflow. That’s archaeology.
Here’s what Jira’s native sprint reporting is missing. These aren’t edge cases — they come up in every sprint retrospective: “Wait, how much scope did we add mid-sprint?” “Who had the most carryover?” “Are we getting faster or slower?” Answering them shouldn’t take 15 minutes of tab-switching.
The Sprint Report in Snapmetrics - Real-Time Analytics brings everything together into a single page. You select a board and a sprint — that’s it. The report loads instantly and shows you:
| What You Get | What It Answers |
|---|---|
| Sprint Info Card | What was this sprint? Status, dates, logged & in-progress time |
| Team Velocity Chart | Are we getting faster sprint over sprint? |
| Issue Type Breakdown | Did bugs eat into our capacity this sprint? |
| Scope Change Chart | How much did the sprint grow or shrink after kickoff? |
| Countdown Card | How much time is left? (active sprints) |
| Completion Metrics + Sprint Health | Did we hit our goal? What’s the carryover? |
| Workload by Assignee | Was work distributed fairly across the team? |
| Burndown Chart | Did we burn down on schedule? |
| Control Chart (Cycle Time) | How long did individual issues take to complete? |
| Individual Metrics | Who completed what, and how efficiently? |
No tab-switching. No manual exports. No spreadsheet. Everything above is generated automatically from your live Jira data, every time you open the report.
To make this concrete, we’ll walk through each section using a real example sprint. The data below comes from a demo board called the LOTR Board (yes, the team named their sprints after Middle-earth — “The Fellowship Sprint” ran February 9–22, 2026). The names are fictional; the metrics are real output from Snapmetrics - Real-Time Analytics. Your report will look exactly the same, just with your team’s names and numbers.
## 1. Sprint Info Card
The top of the report shows a summary card with the sprint name, its goal, status (Active / Closed / Future), start and end dates, and two time metrics that Jira doesn’t surface natively: total logged time and total in-progress time. This is your sprint’s identity card — everything you need before diving into the charts.
*Sprint Info Card — The Fellowship Sprint (Closed, Feb 9–22, 2026; Logged: 616h, In-Progress: 619h)*
| Field | Value | Why It Matters |
|---|---|---|
| Sprint Name | The Fellowship Sprint | Named by the team — your board, your names |
| Goal | But my lord, there is no such force | The sprint goal set in Jira, displayed for context |
| Status | Closed | Active / Closed / Future |
| Start → End | Feb 9 → Feb 22, 2026 | Two-week sprint |
| Logged Time | 616h | Total Jira worklog hours across all sprint issues |
| In-Progress Time | 619h | Total time issues spent in “In Progress” statuses |
## 2. Team Velocity Chart
The velocity chart shows committed vs. completed story points across your last five sprints. It’s the fastest way to answer: “Are we getting better at estimating and delivering?”
*Team Velocity Chart — Current Sprint: 35 pts | 5-Sprint Average: 32.0 pts | Committed vs. Completed bars across Sprints 1–5*
In this example, The Fellowship Sprint committed 35 points and completed 33, a 94.3% completion rate, above the team’s 5-sprint average of 32.0 points. The trend is clearly upward.
> 💡 **Key detail:** The “completed” bars use your configured Definition of Done, not just Jira’s default Done status. If your team uses custom workflow statuses to mark work complete, those are reflected here accurately.
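The arithmetic behind these figures is straightforward. Here is a minimal sketch: the committed/completed values are from the example sprint, but the earlier sprint totals and the helper function are our own illustration, not Snapmetrics’ implementation.

```python
def completion_rate(committed: float, completed: float) -> float:
    """Completed story points as a percentage of committed points."""
    return round(completed / committed * 100, 1)

# Completed points for the last five sprints; only the final value (33)
# is from the example sprint, the rest are hypothetical.
last_five = [30, 31, 32, 34, 33]

print(completion_rate(35, 33))                    # 94.3
print(round(sum(last_five) / len(last_five), 1))  # 32.0
```

The 5-sprint average deliberately uses completed (not committed) points, since velocity is about what actually shipped.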
## 3. Issue Types & Scope Change
Two charts sit side by side: one showing how story points break down by issue type (Story / Task / Bug), and one showing how much scope was added or removed after the sprint started.
*Issue Types donut chart — Story: 20 pts | Task: 12 pts | Bug: 3 pts*

*Scope Change donut chart — Added: +5 pts | Removed: −2 pts | Net Change: +8.6%*
In The Fellowship Sprint: bugs took only 3 points (9% of capacity) — healthy. Scope grew by a net +8.6% after the sprint started (5 pts added, 2 pts removed). The team absorbed that growth and still hit 94.3% completion. The Scope Change chart is the one you wish you had during every retrospective where someone asks “why didn’t we finish everything?”
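As a quick sketch, and assuming both percentages are taken against the originally committed 35 points (our reading of the chart, not a documented formula):

```python
committed = 35
added, removed = 5, 2   # points added/removed after sprint start
bug_points = 3

net_scope_change = round((added - removed) / committed * 100, 1)
bug_share = round(bug_points / committed * 100)

print(f"net scope change: +{net_scope_change}%")  # +8.6%
print(f"bug share: {bug_share}%")                 # 9%
```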
## 4. Completion Metrics & Sprint Health
Three cards and a health panel give you the clearest picture of the sprint’s outcome. These are all driven by your configured Definition of Done, so the numbers reflect your team’s actual workflow, not Jira’s defaults.
*Completion cards: Completed 94.3% (33 pts) · Incomplete 5.7% (2 pts) · Carryover 8.6% (3 pts) + Sprint Health panel*
| Metric | Value | Reading It |
|---|---|---|
| Completed | 94.3% (33.0 pts) | Story points that reached a configured Done status during the sprint |
| Incomplete | 5.7% (2.0 pts) | Points that were in the sprint but didn’t reach Done |
| Carryover | 8.6% (3.0 pts) | Points rolling into the next sprint; flag for retrospective |
| Work Done (Health) | 94.3% | Composite sprint health signal |
| Time Elapsed | 100% | Sprint is closed |
| Scope Change (Health) | +8.6% | Acceptable; worth noting if it becomes a recurring pattern |
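All three cards share the same percentage base. A hedged sketch, assuming everything is measured against the 35 committed points (carryover exceeding incomplete suggests it also counts points added mid-sprint, but that is our inference, not documented behavior):

```python
committed = 35.0

def pct(points: float) -> float:
    """Points as a percentage of committed scope, one decimal place."""
    return round(points / committed * 100, 1)

print(pct(33.0))  # Completed:  94.3
print(pct(2.0))   # Incomplete:  5.7
print(pct(3.0))   # Carryover:   8.6
```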
## 5. Countdown Card (Active Sprints Only)
For sprints that are still running, a real-time countdown shows days, hours, and minutes remaining. Simple — but during a standup when someone asks “how much time do we have?” it’s instantly useful. The Fellowship Sprint is closed so the counter shows zero, but here’s what it looks like mid-sprint:
*Countdown Card — 1 Day · 2 Hours · 12 Minutes remaining (with progress bar)*
## 6. Workload by Assignee
This chart answers the question every Scrum Master asks before sprint planning: “Is work distributed fairly, or is one person carrying everything?” Each bar shows a team member’s share of committed, added, and removed story points — colour-coded so you can see how their load changed mid-sprint.
*Workload by Assignee — Gandalf: 23% (committed + added + removed), Aragorn: 20%, Frodo: 18%, Legolas: 10%, Sam/Gimli: 8% each, others: 5%*
In this sprint, Gandalf carried 23% of the work and also had scope added to his plate mid-sprint. Aragorn had significant scope added as well. Frodo’s load was steady at 18%. This chart often surfaces conversations that wouldn’t happen otherwise — especially for remote or async teams.
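Conceptually, the chart is each assignee’s share of the sprint’s total points. A sketch with made-up point values — the demo names are real, the numbers below are not:

```python
# Hypothetical point totals per assignee; only the names come from the demo board.
points = {"Gandalf": 9.0, "Aragorn": 8.0, "Frodo": 7.0, "Legolas": 4.0}

total = sum(points.values())
shares = {name: round(pts / total * 100) for name, pts in points.items()}

print(shares)  # {'Gandalf': 32, 'Aragorn': 29, 'Frodo': 25, 'Legolas': 14}
```

The real chart splits each bar further into committed, added, and removed segments, but the percentage math is the same.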
## 7. Burndown Chart
The Burndown Chart plots remaining scope against an ideal burn line. It uses your configured Definition of Done — not just any status transition. When issues move to your Done statuses, the remaining line drops. When scope is added mid-sprint, it jumps. Scope creep becomes visually obvious.
*Burndown Chart — Feb 9–22, 2026 | Ideal line (dashed orange) vs. Remaining (solid blue) | Team tracked near-ideal, closing at ~2 pts remaining*
The Fellowship Sprint tracked slightly above ideal for the first three days (the small scope bump around Feb 12–13 is visible — that’s the added scope showing up). The team then accelerated, crossing below the ideal line by Feb 17 and closing at just 2 points remaining. A near-perfect burn.
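Under the hood, a burndown line is just a running total folded over dated events: completions subtract points, mid-sprint additions add them. A sketch with hypothetical dates and deltas chosen to mimic the example sprint’s shape:

```python
from datetime import date

committed = 35
# (day, point delta): negative = issues reached Done, positive = scope added.
# Dates and deltas are illustrative, not the actual sprint's event log.
events = [
    (date(2026, 2, 10), -3),
    (date(2026, 2, 12), +5),   # the mid-sprint scope bump: the line jumps up
    (date(2026, 2, 13), -2),
    (date(2026, 2, 17), -20),
    (date(2026, 2, 21), -13),
]

remaining = committed
for day, delta in sorted(events):
    remaining += delta
    print(day, remaining)
# The series ends at 2 points remaining, matching the example sprint.
```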
## 8. Control Chart (Cycle Time per Sprint)
This is the most underused agile metric, and Snapmetrics - Real-Time Analytics makes it sprint-scoped. The Control Chart shows how long individual issues spent in your configured “In Progress” statuses — during this specific sprint only, not across the entire project lifetime.
*Control Chart — Average: 4.3d | Median: 4.0d | Min: 1.8d | Max: 1w 1d | Total Issues: 11 | Line chart Feb 9–22*
| Stat | Value | What to Do With It |
|---|---|---|
| Average | 4.3 days | Your team’s typical throughput time this sprint |
| Median | 4.0 days | Less affected by outliers than the average — more reliable |
| Minimum | 1.8 days | Your floor: fast-track issues move this quickly |
| Maximum | 1 week 1 day | An outlier — what blocked that issue? Discuss in retro |
| Total Issues | 11 | Issues that fully exited In Progress during the sprint |
> ⚠️ **Note:** The chart only shows issues that completed their cycle time during the sprint (i.e., they left In Progress statuses before the sprint ended). Issues still in progress are excluded. “Total Issues” is not the same as total sprint issues.
Issues are clustered by calendar day — if two issues left In Progress on the same day, they appear as a single dot. Hover to see the count. The chart also shows rolling average and standard deviation, so you can spot process instability at a glance.
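The summary stats are plain descriptive statistics over per-issue cycle times, and the day clustering is a simple group-by. The eleven values below are hypothetical, chosen to reproduce the example sprint’s figures:

```python
from collections import Counter
from datetime import date
from statistics import mean, median

# Hypothetical per-issue cycle times (days in "In Progress" statuses).
cycle_days = [1.8, 2.5, 3.0, 3.9, 4.0, 4.0, 4.4, 4.6, 5.2, 5.8, 8.0]

print(round(mean(cycle_days), 1))        # 4.3
print(median(cycle_days))                # 4.0
print(min(cycle_days), max(cycle_days))  # 1.8 8.0  (8 days reads as "1w 1d")

# Issues leaving In Progress on the same calendar day collapse into one dot:
exits = [date(2026, 2, 16), date(2026, 2, 16), date(2026, 2, 18)]
dots = Counter(exits)
print(dots[date(2026, 2, 16)])           # 2 issues behind the Feb 16 dot
```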
## 9. Individual Metrics
This is the section Scrum Masters and team leads come back to after every sprint. It breaks down contribution at the person level so you can have fair, data-backed conversations in retrospectives and 1:1s.
*Individual Metrics table — Gandalf (8.0 pts, 0 reopens, 78h in-progress, 76h logged), Aragorn (6.0, 0, 72h, 70h), Frodo (5.0, 1 reopen, 68h, 72h) ... all 9 team members*
| Column | What It Shows | Configuration / Notes |
|---|---|---|
| Completed | Story points delivered by this person this sprint | Respects your Definition of Done + Who Resolved setting |
| Reopen Count | How many of their issues were moved backward | High count = quality signal or unclear acceptance criteria |
| In-Progress Time | Time their issues spent in In Progress statuses | Uses the selected calendar, bounded by sprint dates |
| Logged Time | Total Jira worklog hours on their sprint issues | Direct from Jira worklogs; not affected by calendar |
For The Fellowship Sprint: Gandalf led with 8.0 completed points and 76h logged. Frodo had 1 reopen — the only quality flag in the sprint, worth a quick discussion about acceptance criteria. Every number has a drill-down arrow (›) that opens the exact list of issues behind it. Nothing is a black box.
> 📊 **Export to Excel:** Click the Export button in the top-right corner of the Individual Metrics section to download the full table as a spreadsheet. Useful for sprint review presentations or stakeholder reports.
## Report Settings

Click the ⚙️ gear icon at the top of the report to open settings. Three options control how everything is calculated:
| Setting | What It Does | Default |
|---|---|---|
| Definition of Done | Which statuses count as “completed.” Drives: completed pts, completion %, carryover, burndown, velocity | Jira’s Done category |
| Control Chart Columns | Which statuses count as “In Progress” for cycle time. Drives: Control Chart, in-progress time in Individual Metrics | Jira’s In Progress category |
| Who Resolved the Issue | For Individual Metrics: who gets credit — the assignee at resolution, or the user who transitioned the issue to Done | Assignee at resolution time |
Settings are stored per user and persist across sessions. You configure once, and every sprint report you open uses the same settings automatically.
## Opening the Sprint Report

Two ways to open it in Snapmetrics - Real-Time Analytics:
| Path | Steps | Best For |
|---|---|---|
| Global App View | Apps → Snapmetrics → Reports → Sprint Report — then select any board | Scrum Masters managing multiple teams or boards |
| Project Tab | Open the “Snapmetrics Sprint Report” tab inside any Jira project — board auto-selected | Teams that always work within one board |
Saved selections: In the global view, your Board, Sprint, and Calendar choices are remembered between sessions. In the project tab, only the Calendar is saved (board is auto-selected, sprint defaults to the active one).
**Why doesn’t the Control Chart include every sprint issue?** It only includes issues that completed their cycle time during the sprint, meaning they left your configured In Progress statuses before the sprint ended. Issues still in progress are excluded until they move out.

**Why do some dots on the Control Chart represent more than one issue?** Issues exiting In Progress on the same calendar day are clustered into one dot. Hover over any filled circle to see how many are grouped inside it.

**The numbers look off. What should I check?** Check two settings in the gear menu: your Definition of Done and your Control Chart Columns. They drive every completion and cycle-time figure in the report.

**I don’t see the Sprint Report for my project.** Sprint Report requires a Scrum (sprint-capable) board. Create or link one in Jira project settings, then return to Snapmetrics - Real-Time Analytics — it will appear automatically.
Install Snapmetrics - Real-Time Analytics from the Atlassian Marketplace, open any Scrum board, and run your first Sprint Report. No configuration required to get started — the defaults work for most teams out of the box. Customize the Definition of Done and Control Chart columns when you’re ready for precision.
> 🚀 **Install Snapmetrics - Real-Time Analytics** → Apps → Snapmetrics → Reports → Sprint Report — select your board and sprint. Done.
Tuncay Senturk, _Snapbytes_