Introduction
In the previous article I mentioned that modern IT solutions consist of multiple tiers of components and/or microservices. This brings a lot of benefits from both a business and a technological point of view, but at the same time it can be a real headache for project and delivery managers to orchestrate.
One of the most frequently asked questions is how to oversee the progress of such multi-tier releases efficiently, without spending a lot of time on it.
In this article, based on a case study with a client, I will share some ideas that have worked well for me.
Context of the organization
I would like to describe a release reporting solution for a system that includes two key components. Each component has its own codebase, teams, and version/release cycle. The cycles run in parallel and, at the end, are integrated into a single release of business functionality. Business value is therefore only achieved once both components are released to production, integration testing is done, and the new features are enabled for customers to use (please read more about feature flagging).
I intentionally didn't go into more complex examples.
Timelines of the versions
Challenges
Together with the client, we defined the following metrics for the release status monitoring. They are split into 2 categories:
Delivery metrics
Quality metrics
All these metrics need to be monitored on a weekly basis in order to keep key stakeholders informed of the release progress and to raise escalations as required.
Reporting is usually considered a non-value-added activity, but it is a necessary overhead. Therefore, the main objective was to cut the time required to maintain the report. We set ourselves a goal of spending less than 15 minutes per week on updating and sharing it.
Reporting solution
In order to achieve these objectives and goals, we decided to automate reporting via a Confluence page. The page contains a set of gadgets that are imported from Jira (more information on how to import a Jira gadget into Confluence).
The reporting page is designed so that the information and metrics become progressively more detailed from top to bottom. As a result, the report can be shared with different stakeholders: anyone can stop reading once they have enough information, or keep scrolling to the bottom for more details.
Page structure
Let’s dive into details.
High-level summary
The high-level summary contains the key insights about the release status:
Highlights
A short but informative description of the release status that provides more details in addition to the high-level summary section. Usually, it includes the key insights that the Release Manager would like to share with key stakeholders. The Release Manager also directs attention to potential risks or issues which might have a significant impact on the timeline and/or quality of the release.
For the Highlights section, we usually use a single paragraph of 3-7 sentences.
Details
This section contains all the details required to fully understand the release status and cover the above-mentioned needs. The majority of the gadgets we use are generated by the Release Management for Jira (Cloud version) add-on, though the same can be done with custom JQL filters.
Delivery metrics
1. Release Status gadget (from Release Management / Cloud) - provides an aggregated release status for all versions in the release. It also shows the projected release date and release impediments. If you don't use the Release Management add-on, an alternative is to create a JQL filter for all versions in the release and visualize the data via a pie chart or a two-dimensional filter table.
2. Release Velocity. This gadget is also from Release Management. It can be built around issue count, story points or original time estimates. In our case, we based it on issue count, as the teams working on our release have different development methodologies and no single common estimation method or planning approach. The velocity chart shows the velocity trend for all versions, aggregated on a weekly basis. Unfortunately, Jira doesn't offer similar functionality out of the box.
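If you build the non-add-on variant yourself, the underlying filter can simply enumerate the release's versions explicitly (the version names below are placeholders, not the actual ones from this case study), with the pie chart gadget then grouping the results by Status:

fixVersion in ("1.0-frontend", "1.0-backend")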
3. Burnup Chart on business value. Again, a gadget from Release Management. The burnup chart is created for all Stories, Tasks and Epics in the release. It helps to monitor the scope change trend and the release development progress. Unfortunately, in the standard version, the burnup chart is not available as a gadget that can be embedded in a Confluence page.
4. Release Impediments. In order to track impediments, we created a filter for all versions in the release where the Flagged field is not empty. We used the following JQL query:
fixVersion in versionsOfReleasesByName("board_name", "release_name") AND Flagged is not EMPTY
If you don't use the Release Management add-on, you can use JQL such as:
fixVersion in ("comma-separated list of versions to be monitored") AND Flagged is not EMPTY
The filter result is presented as a standard issue table on the Confluence page.
5. For Release Risks tracking, we created a separate issue type "Risk" with all the required custom fields. We add a fixVersion to each "Risk" issue in order to attach it to a particular version or release. One advantage is that a single risk can belong to multiple versions. The risk table is also just a JQL query, pretty similar to the previous one. The only difference is the condition AND issueType = Risk at the end.
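Putting those pieces together, the risk filter looks like the following (if you want all risks regardless of whether they are flagged, drop the Flagged condition):

fixVersion in versionsOfReleasesByName("board_name", "release_name") AND Flagged is not EMPTY AND issueType = Risk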
6. Last on the dashboard, but not least important, is the Scope Change gadget. We took it from Release Management as well. The gadget shows the items added and removed during the previous 7 days for all versions in the release. It also helps to calculate the impact in hours and story points. We have tailored the scope change report to Tasks, Stories and Epics; all other items (such as Bugs, Test Cases, etc.) are not included in the report. If you don't use Release Management, you can use two JQL issue tables instead.
Added items:

fixVersion in ("comma_separated_list_of_versions_in_release") and (createdDate >= "start_of_period" and createdDate <= "end_of_period") or (fixVersion changed TO ("comma_separated_list_of_versions_in_release") after "start_of_period" and fixVersion changed TO ("comma_separated_list_of_versions_in_release") before "end_of_period")

Removed items:

fixVersion changed FROM ("comma_separated_list_of_versions_in_release") after "start_of_period" and fixVersion changed FROM ("comma_separated_list_of_versions_in_release") before "end_of_period"
Quality metrics
1. Issues Trends. For this chart, we used the Trends gadget from Release Management with an extra JQL filter, issueType in (Bug, Sub-bug), in order to tailor the content. The chart helps to understand the bug creation trend during the version development phase and shows which component is the most "buggy". This approach works for an Agile methodology where every task has to meet the DoD, so testing and stabilization are not pushed to the end of the release (which does happen in real life).
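If you assemble this filter by hand, it simply combines the release scope with the issue-type restriction; assuming the Release Management JQL function shown earlier:

fixVersion in versionsOfReleasesByName("board_name", "release_name") AND issueType in (Bug, Sub-bug)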
2. A supplementary chart, Issues Burnup (in addition to the Trends gadget), helps to understand the number of resolved/open bugs at any given moment and to project when all the issues will be fixed.
Compiling Delivery and Quality metrics together
The combined Details section of the report looks like this:
Share report
Creating the report for the first time takes a while, but let's see how long it takes to maintain and share it on a weekly basis. The flow looks like the following:
Being in this role ourselves, the above steps take us a bit less than 15 minutes every week, thanks to the gadgets being auto-updated.
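If you want to go a step further and automate the weekly numbers themselves, the same JQL filters can be queried programmatically through Jira's REST search API. The sketch below is a minimal illustration, not part of the original setup: the base URL, token and version names are placeholder assumptions.

```python
import json
import urllib.parse
import urllib.request


def impediments_jql(versions):
    """Build the impediments filter for an explicit list of versions
    (the non-Release-Management variant shown earlier)."""
    quoted = ", ".join(f'"{v}"' for v in versions)
    return f"fixVersion in ({quoted}) AND Flagged is not EMPTY"


def count_issues(base_url, token, jql):
    """Return the number of issues matching a JQL query via Jira's
    REST search endpoint; maxResults=0 fetches only the total."""
    query = urllib.parse.urlencode({"jql": jql, "maxResults": 0})
    req = urllib.request.Request(
        f"{base_url}/rest/api/2/search?{query}",
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        return json.load(resp)["total"]


if __name__ == "__main__":
    # Placeholder version names; substitute your own.
    jql = impediments_jql(["1.0-frontend", "1.0-backend"])
    print(jql)
```

A small script like this can feed a chat notification or a Confluence macro, so the "update and share" step shrinks even further.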
Conclusions
In this article, I have shared my experience of building reports for cross-project, multi-component releases. This approach has worked really well in my own practice. I will be happy if you leave comments and share your approaches and solutions for weekly version/release progress reporting. I would also appreciate any suggestions on how to improve mine :)
Thanks for reading. Stay tuned!
Yuri Kudin