I work in a team setup where Development and Quality engineers are from different organizations. However, the QEs are fully integrated into development teams.
Usually, a user story is assigned to a developer, and once qualification is completed, we call the story done. Sometimes qualification activities are tracked in a separate story, which is given 0 points.
How can we get a concise measure of Dev effort and QE effort in the sprint report, so that we can refer back to it at a later point in time, say when management is interested in that report?
What is your suggestion here?
Danut is right: it's better to decide how you'd prefer to measure the effort first, and then look for a proper solution. In my experience there are a few options people tend to pick, depending on how granular you need to be and how disciplined the team will actually be about it:
1. Sub-tasks split by role
Keep one parent story for the user-facing requirement, then create two sub-tasks under it: one for the Dev work and one for the QE work.
The story still rolls up for sprint planning, but you can report on Dev vs QE work by sub-task assignee, issue type, or a custom "Work Type" field. This avoids the "0-point QE stories" while making both efforts first-class and visible. You may still need to add custom fields to the sub-tasks to measure the effort quantitatively if you go with something other than the standard hour estimates.
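To illustrate the reporting side of this option, here is a minimal sketch of the roll-up, assuming you export sub-tasks with their parent key, Work Type, and hours (the field names and data shape are hypothetical, not the actual Jira export schema):

```python
from collections import defaultdict

# Hypothetical sub-task export; keys are illustrative assumptions.
subtasks = [
    {"parent": "PROJ-101", "work_type": "Dev", "hours": 6},
    {"parent": "PROJ-101", "work_type": "QE",  "hours": 3},
    {"parent": "PROJ-102", "work_type": "Dev", "hours": 4},
    {"parent": "PROJ-102", "work_type": "QE",  "hours": 5},
]

def effort_by_work_type(items):
    """Sum hours per Work Type across all exported sub-tasks."""
    totals = defaultdict(int)
    for item in items:
        totals[item["work_type"]] += item["hours"]
    return dict(totals)

print(effort_by_work_type(subtasks))  # {'Dev': 10, 'QE': 8}
```

The same grouping works per sprint or per parent story; the point is that once the split lives on the sub-tasks, the Dev vs QE totals are a simple group-and-sum.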
2. Custom fields on the story itself
If you don't want to split the issue, you can add two number fields on the story:
- Dev Estimate
- QE Estimate
Story Points stays as the team's combined commitment, and the two sub-fields let you slice the sprint by role afterward. This is lighter on workflow overhead, but you lose the ability to track who actually did the work; you only get the planned split.
3. Worklogs (Jira time tracking)
If both orgs log time against the story, you can group worklogs by user, team, or "Account" and get the actual effort split, not just the estimated one. This is the most accurate option, but it only works if logging is really enforced. Given your two-organization setup, I assume this is also useful if billing or contract reporting comes into play.
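A rough sketch of what the worklog grouping looks like, assuming you pull worklog entries with an author and an organization attached (the entry shape below is an assumption for illustration, not the real worklog API payload):

```python
from collections import defaultdict

# Hypothetical worklog entries; "org" stands in for whatever
# team/Account attribute distinguishes the two organizations.
worklogs = [
    {"author": "alice", "org": "DevOrg", "seconds": 7200},
    {"author": "bob",   "org": "QEOrg",  "seconds": 3600},
    {"author": "alice", "org": "DevOrg", "seconds": 1800},
]

def hours_by_org(entries):
    """Actual effort split per organization, converted to hours."""
    totals = defaultdict(int)
    for w in entries:
        totals[w["org"]] += w["seconds"]
    return {org: secs / 3600 for org, secs in totals.items()}

print(hours_by_org(worklogs))  # {'DevOrg': 2.5, 'QEOrg': 1.0}
```

The same dictionary can be keyed by `author` instead of `org` when management wants the per-person breakdown.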
4. Totally separate issue type for QE work
Some teams use a Test or QE Task issue type linked to the story. Cleaner reporting boundaries (you can filter the entire backlog by issue type), but adds workflow overhead and risks the QE work feeling "second-class" if not handled carefully.
On the reporting side itself:
The native Sprint Report won't give you a Dev vs QE split out of the box, since it's organized by story, not by role. To get the management-facing view, you'd typically need something on top of it. Standard Jira dashboards can cover basic versions of this with two-dimensional filter statistics gadgets, but for anything beyond that you'd want to export the data to Excel, use a reporting tool that supports pivoting and grouping by multiple dimensions at once, or develop your own reporting with Forge. If you are open to Marketplace apps, you may have a look at the app my team has developed, Report Builder.
I would personally try the following to answer management's question: go with sub-tasks plus a "Work Type" custom field (Dev / QE), and build one pivot report grouping by Sprint × Work Type × Assignee. That single view will answer "how much Dev vs QE effort went into this sprint?", and you can drill down by person when needed. But I would suggest you first try it in a sandbox, show the management team the result, and go further with their feedback at hand.
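The Sprint × Work Type × Assignee pivot above can be sketched as a nested grouping over a flat export; again, the row fields here are hypothetical placeholders for whatever your export actually contains:

```python
from collections import defaultdict

# Hypothetical flat export: one row per sub-task.
rows = [
    {"sprint": "Sprint 12", "work_type": "Dev", "assignee": "alice", "hours": 5},
    {"sprint": "Sprint 12", "work_type": "QE",  "assignee": "bob",   "hours": 4},
    {"sprint": "Sprint 12", "work_type": "Dev", "assignee": "carol", "hours": 3},
]

def pivot(rows):
    """Group hours as Sprint -> Work Type -> Assignee."""
    table = defaultdict(lambda: defaultdict(dict))
    for r in rows:
        table[r["sprint"]][r["work_type"]][r["assignee"]] = r["hours"]
    return table

# Print per-sprint, per-role totals with the per-person breakdown.
for sprint, types in pivot(rows).items():
    for work_type, people in types.items():
        print(sprint, work_type, dict(people), "total:", sum(people.values()))
```

This is exactly the view a spreadsheet pivot table or a reporting app would give you; the sketch just shows that the underlying data model is simple once the Work Type split exists.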
Also, the bigger thing: once you make QE effort visible in points or hours, you'll probably notice patterns that were invisible before (QE bottlenecks, sprints where dev outran qualification capacity, etc.). That's usually more valuable than the report itself.
Rustem
How would you prefer to measure the effort of the Dev and QE teams: by total logged hours, story points, time in status, or another approach?
This is important for providing an appropriate solution.
Danut