Automated User Story Quality Tool

Miguel July 2, 2020

Hi All,

I'm a software developer, and as such I've worked on many projects that use Jira. I think it's an awesome tool for agile teams. Are you with me on that? If you are, I'd really appreciate your feedback, but first let me tell you about a recent experience I had.

A few months ago I joined a team with probably the worst backlog I've ever seen. It just didn't make any sense. Why? Because user stories had generic titles like "User Login", no descriptions whatsoever, and forget about prioritization, estimation, or anything else.

I tried my best to explain how to write user stories (based on experience; I don't consider myself an expert on the matter), the importance of backlog grooming sessions, and the importance of defining a proper ticket workflow (what's your definition of DONE?!), along with other practices that I believe help the whole team understand what they are building and give management good visibility into the project.

Through meetings, examples, and presentations I tried to enforce some rules, e.g.:

  • If the issue/ticket is of type Story, write it following the "persona + need + purpose" template.
  • Give your story Acceptance Criteria.
  • If the ticket is a Bug report, add a bulleted list of steps to reproduce.
  • Don't leave tickets "In Progress" if you are not working on them (move them to Paused).
  • Etc.
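
Rules like these are mechanical enough to check automatically. Here's a minimal sketch in Python of what such checks could look like (the function name, rule wording, and heuristics are hypothetical illustrations, not the actual implementation in my app):

```python
import re

# Matches the "persona + need + purpose" template:
# "As a <persona>, I want <need> so that <purpose>"
STORY_TEMPLATE = re.compile(
    r"as an? .+?,? i want .+? so that .+", re.IGNORECASE
)

def lint_ticket(issue_type: str, summary: str, description: str) -> list[str]:
    """Return a list of rule violations for a single ticket."""
    problems = []
    if issue_type == "Story":
        if not STORY_TEMPLATE.search(description):
            problems.append("Story does not follow the persona + need + purpose template")
        if "acceptance criteria" not in description.lower():
            problems.append("Story has no Acceptance Criteria section")
    elif issue_type == "Bug":
        if "steps to reproduce" not in description.lower():
            problems.append("Bug report has no steps to reproduce")
    return problems

# A story like the "User Login" ones I described fails both checks:
print(lint_ticket("Story", "User Login", "Log the user in."))
```

Running this on a well-formed story ("As a registered user, I want to log in so that I can see my dashboard." plus an Acceptance Criteria section) returns an empty list.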

It was useless; I was barking up the wrong tree. Everyone just kept writing and using Jira however they wanted. I eventually quit the project because there were too many broken windows and too much technical debt.

After that frustrating experience I started thinking: can we enforce rules automatically in Jira? When writing code I almost always use a linter or other static analysis tools that help me catch violations of coding conventions or guidelines, stylistic errors, and potential bugs.

I looked in the Marketplace for a tool that would help me enforce at least a very basic set of rules, e.g. syntax, format, and the presence of certain content on a ticket. I couldn't find anything useful: one story quality tool is only for Jira Server, and the other, which is compatible with Jira Cloud, just wouldn't install.

A good friend of mine pointed me to an interesting tool, but it's only a prototype (the theoretical background is what makes it really interesting!). I don't want to export a CSV, start up a Python backend, and make a request just to get an evaluation of my stories.

So I decided to start coding a Jira Cloud application that validates my stories. It's a work in progress; here it is: https://github.com/jmigueprieto/ticket-linter. I'm thinking about putting it on the Marketplace, but first I want to get some feedback from the community. Specifically, I'm curious about the following:

  • Which team members are responsible for writing stories in your teams?
  • Do you use any automated user story quality tool? (Maybe I just didn't google enough and there's an awesome tool for Jira out there)
  • What type of rules do you enforce when writing a User Story?
  • Do you enforce any other rules on other types of tickets?
  • When using Jira, do you write your story in the Summary/title or the Description?

Thank you!



4 comments

Kat Warner
Marketplace Partner
Marketplace Partners provide apps and integrations available on the Atlassian Marketplace that extend the power of Atlassian products.
July 2, 2020

@Miguel - check out https://codegeist.devpost.com/

Build powerful team collaboration apps on Atlassian Cloud and compete for $315,000 in prizes - closes in 11 days!

Miguel July 2, 2020

Hey @Kat Warner that's awesome! I'll check that out and see if I can submit what I've been working on. Thank you!

Craeg Strong
Marketplace Partner
July 3, 2020

Looks very interesting. I think there is a need for this. It would need to be highly configurable, though. For example, I sometimes use the "Checklist" plugin for acceptance criteria rather than listing them in the description.

In addition, I tend to make the "Summary" field something short like "User Login" but put the full AS A... I WANT... SO THAT... in the description.

Miguel July 4, 2020

I'll start with an opinionated set of rules, based on a conceptual framework which I'll explain in the project.

But I totally agree with you, it should be highly configurable.

Thanks for the feedback!

Wilna June 14, 2021

Hi Miguel

I have also been looking for a plug-in to do quality checks on user stories. Have you made any progress with this? 

To answer your questions:

  • Which team members are responsible for writing stories in your teams? - Mainly Business Analysts
  • Do you use any automated user story quality tool? (Maybe I just didn't google enough and there's an awesome tool for Jira out there) - we looked at two tools and tested one, but neither met our requirements
  • What type of rules do you enforce when writing a User Story? - I would like to enforce the "As a <<end user>>, I want <<action verb>>, so that <<value>>" syntax, but also validate the end user against a list of user roles and perhaps the action verb against a verb list aligned to the process/sub-process level. We would have to maintain the lists, but as long as the plug-in is configurable, it should be good. We would also like to flag incorrect end users such as "as a user", "as a company", etc.; we would like to maintain a list of end users that should not be used, and also of ambiguous words.
  • Do you enforce any other rules on other type of tickets? Not at the moment.
  • When using Jira, do you write your story in the Summary/title or the Description? - the story name in the summary, i.e. verb+adjective+noun and then the actual story in the description.
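
The list-driven validation described above could be sketched roughly like this in Python (the role and verb lists, function name, and messages are hypothetical examples, not anything that exists in the app today):

```python
import re

# Hypothetical team-maintained configuration lists.
ALLOWED_ROLES = {"registered customer", "warehouse manager"}
BANNED_ROLES = {"user", "company"}
ALLOWED_VERBS = {"view", "create", "update", "cancel"}

STORY = re.compile(
    r"^as an? (?P<role>.+?),? i want to (?P<verb>\w+) .+ so that .+",
    re.IGNORECASE,
)

def check_story(text: str) -> list[str]:
    """Validate story syntax, then the persona and verb against the lists."""
    m = STORY.match(text.strip())
    if not m:
        return ["Story does not follow the As a / I want / so that syntax"]
    problems = []
    role = m.group("role").lower()
    verb = m.group("verb").lower()
    if role in BANNED_ROLES:
        problems.append(f'"{role}" is a banned end user')
    elif role not in ALLOWED_ROLES:
        problems.append(f'"{role}" is not a known user role')
    if verb not in ALLOWED_VERBS:
        problems.append(f'"{verb}" is not in the approved verb list')
    return problems
```

With this shape, "As a user, I want to view reports so that I can plan." would be flagged for the banned "user" persona, while a story using an approved role and verb would pass.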

A dynamic dashboard with quality results is a "must-have" for me. I would like to see things like the number of issues per PI, iteration, project, reporter, assignee, etc., and the quality thereof; issues committed to a PI or iteration that were not completed; a graph showing improvement of quality over time, e.g. per PI; and IMs logged and linked to stories that could indicate poorly written or incomplete stories. Traceability is also important: stories must be linked to features, features to capabilities, and capabilities to epics.

I would like to do similar checks on Epics, capabilities, and features, but with different rules. For instance: does each Epic have a hypothesis statement?

That should be enough to keep you busy :)

Ravi Reddi February 23, 2024

Hi @Miguel - I am looking for a tool like this for my team to do a quality check on user story content and fields. How far along are you with your tool? I was searching for this feature online and came across your post, so I'm curious.

Thanks

Ravi
