My organization is pulling data from Jira into Lighthouse for reporting purposes. Currently they are counting Stories with PO Approval at the completion of the Sprint in the predictability calculations.
However, we have teams where SIT/UAT happens outside the Sprint, so statuses are used to move Stories from the completed Sprint onto a Kanban board where SIT/UAT takes place.
These Stories will not have PO Approval until they pass this type of testing, so the Lighthouse metrics show 0% predictability versus what it really is.
After showing the Velocity Report to the Lighthouse team, the question came up of how Jira calculates this on the backend: what data it looks at and how the report is generated.
They understand that it compares the sum of Story Points committed at the start of the Sprint against the sum of Story Points on Stories in the last column of the Sprint Board at completion, but they would like more detail on the programmatic side so they can pull the same data the same way Jira does for this report.
Hi @Matt Risch
It is a simple average of the previous sprints' completed effort.
https://confluence.atlassian.com/jirasoftwareserver/velocity-chart-938845700.html
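For illustration, a minimal sketch of that average. The sprint names and point totals here are made up; "completed" stands for the sum of Story Points on issues in the last column of the Sprint Board when the sprint finished, and "committed" for the sum at sprint start:

```python
# Hypothetical per-sprint totals (committed at start, completed at close).
sprints = [
    {"name": "Sprint 21", "committed": 40, "completed": 34},
    {"name": "Sprint 22", "committed": 38, "completed": 36},
    {"name": "Sprint 23", "committed": 42, "completed": 30},
]

# Velocity is the simple average of completed effort across previous sprints.
velocity = sum(s["completed"] for s in sprints) / len(sprints)
print(f"Velocity: {velocity:.1f} points per sprint")  # -> Velocity: 33.3
```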
Story Points are also stored on the relevant issues and can be retrieved via the REST API if you want to feed them somewhere else.
https://developer.atlassian.com/server/jira/platform/rest-apis/
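As a sketch of what that could look like, the snippet below walks the closed sprints on a board and sums Story Points on issues whose status category is "done". The server URL, credentials, board id, and the Story Points custom field id (customfield_10016) are all assumptions; the field id in particular varies per instance, so check your own field configuration. Note the "done" status-category check is an approximation of the board's last column:

```python
import requests

JIRA = "https://jira.example.com"    # hypothetical server URL
AUTH = ("user", "api-token")         # replace with your credentials
BOARD_ID = 123                       # hypothetical board id
STORY_POINTS = "customfield_10016"   # Story Points field id; varies per instance

def sprint_completed_points(sprint_id):
    """Sum Story Points on issues that ended the sprint in a done-category status."""
    points = 0.0
    url = f"{JIRA}/rest/agile/1.0/sprint/{sprint_id}/issue"
    params = {"fields": f"{STORY_POINTS},status", "maxResults": 100, "startAt": 0}
    while True:
        resp = requests.get(url, params=params, auth=AUTH)
        resp.raise_for_status()
        data = resp.json()
        for issue in data["issues"]:
            fields = issue["fields"]
            # Approximation: Jira's chart counts the board's last column;
            # here we use the "done" status category instead.
            if fields["status"]["statusCategory"]["key"] == "done":
                points += fields.get(STORY_POINTS) or 0
        params["startAt"] += len(data["issues"])
        if params["startAt"] >= data["total"]:
            return points

# Closed sprints on the board (first page only, for brevity).
resp = requests.get(f"{JIRA}/rest/agile/1.0/board/{BOARD_ID}/sprint",
                    params={"state": "closed"}, auth=AUTH)
resp.raise_for_status()
for sprint in resp.json()["values"]:
    print(sprint["name"], sprint_completed_points(sprint["id"]))
```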
Ravi