Requirement Test Status discrepancies in Jira Xray plug-in

In using the Jira Xray plug-in for Test Management, our Product Team has observed that we often see very different Requirement Test Statuses displayed in different locations in the tool. The differences we observe cannot be explained by differences in the settings (Test Plan, Fix Version/s, Test Environments, etc.) of the filters associated with the Requirement Test Statuses in the different locations. We believe the differences are likely caused by differences in the logic that calculates the Status value in each of the locations the Requirement Test Status is displayed.

We are unable to find detailed enough explanations in the Xray Documentation to determine exactly how Xray is calculating the Requirement Test Status in each location and therefore are reaching out to the community to request help in understanding the criteria and logic/ algorithms by which each Requirement Test Status is calculated and/or why we see different Requirement Test Statuses in different locations even when the filter settings for the different locations are the same.

The Jira/ Xray tool locations providing Requirement Test Statuses that we have questions about are as follows:

  1. “Requirement Status” field in a Requirement (Story or Bug) Issue.
  2. Overall Requirement Coverage Gadget and Overall Requirement Coverage Report with Scope = Test Plan.
  3. Overall Requirement Coverage Gadget Filter (i.e., the list of Requirements displayed when an area on the Overall Requirements Coverage Gadget representing the Requirements that have the specified Requirement Test Status is clicked).
  4. Xray Traceability Report.
  5. Quick Filter Results bar displayed above the Xray Traceability Report.
  6. Xray “requirementsWithStatusByTestPlan” Filter.
  7. Summary bar at the top of the Test Coverage Panel in a Requirement (Story or Bug) Issue.

Screenshots of each of the Requirement Test Status locations listed above and our questions regarding how these Statuses are calculated are listed below. (Many of the same questions are asked in each location but, since the Status values differ, we figure the answers to the questions are probably different in the different locations.)

Note: in each of the descriptions/ questions below, the term “Test Execution” is understood to mean a Test Execution Issue Type that contains the Test(s) under discussion.

Thank you very much for any help you can provide in deciphering how Jira/ Xray arrives at the Requirement Test Status values it displays in each of these locations.

REQUIREMENT TEST STATUS 1

Location: “Requirement Status” field in a Requirement (Story or Bug) Issue

ReqTestStatus1.jpg

Xray Documentation indicates that the Requirement Status combines the results of the latest Test Execution in each Test Environment for which there is a Test Execution containing the Test to arrive at its overall value.

Assumptions:

  1. Xray first finds the Tests linked to the Requirement and then, for each Linked Test, retrieves its latest Test Execution in each Test Environment. (Because only Tests, not Test Executions, are associated with Requirements, the only way Xray can determine that a Test Execution is testing a given Requirement is via the Tests linked to the Requirement.)
  2. The Requirement Status field calculation combines results for Test Executions
    • that have the same Fix Version/s as the Requirement,
    • that are associated with different Test Environments (i.e., one Test Execution for each Test Environment specified in the Test Executions associated with the Test is included in the Requirement Test Status calculation), but
    • other differences in Test Executions (e.g., different Test Plans, different Revisions) are ignored.
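To make the assumptions above concrete, here is a minimal Python sketch of the aggregation we ASSUME the “Requirement Status” field performs. The data model, status names, and precedence order (FAIL beats TODO beats PASS) are our guesses for illustration, not Xray’s actual code:

```python
def latest_execution_status(executions):
    """Pick the status of the most recent execution (by 'started' timestamp).

    An empty list means the Test was never run in that Environment,
    which we assume counts as TODO."""
    if not executions:
        return "TODO"
    return max(executions, key=lambda e: e["started"])["status"]


def requirement_status(tests):
    """tests: one dict per linked Test, mapping Test Environment name to
    that Test's list of Test Executions in that Environment.

    Combines the latest result per Test per Environment, assuming the
    precedence FAIL > TODO > PASS (hypothetical, per Assumption 2)."""
    statuses = []
    for test in tests:
        for env_executions in test.values():
            statuses.append(latest_execution_status(env_executions))
    if not statuses:
        return "UNCOVERED"   # no linked Tests at all
    if "FAIL" in statuses:
        return "NOK"
    if "TODO" in statuses:
        return "NOTRUN"
    return "OK"
```

For example, a Test whose latest Execution in an Environment passed yields OK even if an earlier Execution failed, while any Environment with no Execution at all would drag the Requirement to NOTRUN under this assumed precedence.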

Question:

  1. Which Tests linked to the Requirement does Xray include in its calculation of the Requirement Test Status - all linked Tests or only those Tests linked to the Requirement via a “Tested by” link?

REQUIREMENT TEST STATUS 2

Location: Overall Requirement Coverage Gadget and Overall Requirement Coverage Report with Scope = Test Plan

ReqTestStatus2.jpg

Assumption:

  1. Xray first finds the Tests linked to the Requirement and then retrieves the latest Test Execution of each Test associated with the specified Test Plan in the specified Test Environments.

Questions:

  1. Which Tests linked to each Requirement does Xray include in its calculation of the Requirement Test Status - all linked Tests or only those Tests linked to the Requirement via a “Tested by” link?
  2. Does Xray only look at the Test Plan specified in the Test Execution or does Xray also check that the Test itself is associated with the specified Test Plan? (I.e., how does Xray treat a case in which a Test itself is not associated with the specified Test Plan but one or more of the Test’s Test Executions are associated with the specified Test Plan?)

REQUIREMENT TEST STATUS 3

Location: Overall Requirement Coverage Gadget Filter (i.e., the list of Requirements displayed when an area on the Overall Requirements Coverage Gadget representing the Requirements that have the specified Requirement Test Status is clicked).

Note: Jira/ Xray shows that the Filter used to pull the listed Requirements is

filter = < Requirements-specifying Filter> AND issue in requirementsWithStatusByTestPlan(<Requirements Status clicked on the Gadget>, "<Test Plan ID>", "", "true","","","<Requirements-specifying Filter>")

Example: filter = 12345 AND issue in requirementsWithStatusByTestPlan(OK, "PROJ-1111", "", "true","","","12345")

ReqTestStatus3.jpg

Notice that, though the Overall Requirements Gadget (shown in Requirement Test Status 2) indicates that there are ninety-eight (98) Requirements with Requirement Status = OK, only twenty-nine (29) Requirements are shown in the list when the green OK area in the Gadget is clicked. The other sixty-nine (69) of the ninety-eight (98) Requirements indicated in the Gadget show up when the “OK” Requirement Status in the Filter is replaced with “NOTRUN”.

Questions:

  1. Are the Tests and Test Executions used to calculate the Requirement Test Status of each Requirement in an area of the Overall Requirements Coverage Gadget the same as those used to calculate the Requirement Test Status of each Requirement listed when that Gadget area is clicked, or are different Tests and Test Executions used in the two cases?
  2. Why is the Requirement Test Status of the Requirements shown in the Overall Requirements Gadget different than the Requirement Test Status of the Requirements listed when a Gadget area is clicked? (In the example shown, the Gadget indicates that ninety-eight (98) Requirements have Requirement Status = OK, but the list shown when the green OK Gadget area is clicked indicates that only twenty-nine (29) Requirements have Requirement Status = OK; the other sixty-nine (69) Requirements have Requirement Status = NOTRUN.)

REQUIREMENT TEST STATUS 4

Location: Xray Traceability Report

ReqTestStatus4.jpg

Questions:

1. The Requirement Test Status results for the Xray Traceability Report appear to be the same as the Requirement Test Status Results displayed in the Overall Requirements Coverage Gadget (Requirement Test Status 2 above). Do the Gadget and the Report use the same criteria and algorithm to calculate the results shown in each?

If not, we have the same questions about the calculation of Requirement Test Statuses for the Xray Traceability Report as we have for the Overall Requirements Coverage Gadget:

1.1 Which Tests linked to each Requirement does Xray include in its calculation of the Requirement Test Status shown in the Xray Traceability Report - all linked Tests or only those Tests linked to the Requirement via a “Tested by” link?

1.2 Does Xray only look at the Test Plan specified in the Test Execution or does Xray also check that the Test itself is associated with the specified Test Plan? (I.e., how does Xray treat a case in which a Test itself is not associated with the specified Test Plan but one or more of the Test’s Test Executions are associated with the specified Test Plan?)

REQUIREMENT TEST STATUS 5

Location: Quick Filter Results bar displayed above the Xray Traceability Report

ReqTestStatus5.jpg

Questions:

  1. The Requirement Test Status results for the Xray Traceability Report Quick Filters bar appear to be the same as the Requirement Test Status Results displayed in the Overall Requirements Coverage Gadget Filter (Requirement Test Status 3 above). Do the Gadget Filter and the Quick Filters bar use the same criteria and algorithm to calculate the results shown in each?
  2. Why is the Requirement Test Status of the Requirements shown in the Xray Traceability Report itself different than the Requirement Test Status of the Requirements shown in the Traceability Quick Filters bar? (In the example shown, the Xray Traceability Report indicates that ninety-eight (98) Requirements have Requirement Status = OK, but the Traceability Report Quick Filters bar indicates that twenty-nine (29) Requirements have Requirement Status = OK and sixty-nine (69) Requirements have Requirement Status = NOTRUN.)

REQUIREMENT TEST STATUS 6

Location: Xray “requirementsWithStatusByTestPlan” Filter, described in the Xray Documentation Enhanced querying with JQL section

requirementsWithStatusByTestPlan Filter parameters shown in the Enhanced querying with JQL section are:

issue in requirementsWithStatusByTestPlan(<desired Requirement Status(es)>, "<Test Plan ID>", "", "true","","","<Requirements-specifying Filter>")

Note: the Xray requirementsWithStatusByTestPlan Filter appears to be the driver for the Requirement Test Status values displayed in both the Overall Requirements Coverage Gadget Filter (i.e., the list of Requirements displayed when one of the areas on the Overall Requirement Coverage Gadget is clicked, see Requirement Test Status 3 above) and the Traceability Report Quick Filters bar (see Requirement Test Status 5 above). Therefore, it seems that this is a key Xray Filter and understanding the criteria and algorithm this Filter uses to arrive at the results it displays is important to being able to fully use the analytical features Xray offers.

Questions:

  1. Which Tests linked to a Requirement does the requirementsWithStatusByTestPlan Filter include when calculating the Requirement Test Status of the Requirement – all linked Tests or only those Tests linked to the Requirement via a “Tested by” link?
  2. Also, in determining which Tests to include in its Requirement Status calculation, does Xray only look for Test Executions that are associated with the specified Test Plan or does Xray also check that the Test itself is associated with the specified Test Plan? (I.e., how does the Xray requirementsWithStatusByTestPlan Filter treat a case in which a Test itself is not associated with a specified Test Plan but one or more Test Executions that include the Test are associated with the specified Test Plan?)

REQUIREMENT TEST STATUS 7

Location: Summary bar at the top of the Test Coverage Panel in a Requirement (Story or Bug) Issue

ReqTestStatus7.jpg

We have experienced cases in which the Requirement Test Status shown in the Summary bar at the top of the Test Coverage Panel in a Requirement Issue displays a Requirement Status that is inconsistent with the actual Test results that contribute to the Requirement Test Status.

The example above is one such situation:

  • The Requirement Issue (a Story) is linked to seven (7) Tests via seven (7) “Tested by” links.
  • Four (4) of the linked Tests are in the specified Test Plan; three (3) of the linked Tests are associated with a different Test Plan.
  • All four (4) of the linked Tests in the specified Test Plan Passed in the specified Test Environment. All four (4) of these Tests are shown as Passed in the Test Coverage Panel list below the Summary bar; the remaining three (3) Tests, which are not associated with the specified Test Plan, are shown as N/A in the Test Coverage Panel list.
  • Additionally, an Overall Requirements Coverage Report that was run using the same Test Plan and Test Environment as that shown in the Test Coverage Panel above indicated that this Requirement Passed four (4) of a total of four (4) Tests. When the “4” in the Total Tests list was clicked, the result listed the same four (4) Tests shown in the Test Coverage Panel, which all have a Status = Passed. (I.e., the Overall Requirements Coverage Report correctly included only the four (4) linked Tests that are associated with the specified Test Plan and Test Environment; the three (3) linked Tests not associated with the specified Test Plan were ignored.)
  • Given the set of circumstances described above, we expect the Requirement Status shown in the Test Coverage Panel Summary bar to be “OK”, yet the Requirement Status shown in the bar is “NOTRUN”.

Question:

Why is the Requirement Test Status shown in the bar at the top of the Test Coverage Panel above “NOTRUN” instead of “OK”?


Dave Liao Community Leader Mar 11, 2021

@Karyn Knoll - hi Karyn! Have you reached out to Xpand IT to get clarification on anything not clearly described in their docs?

Thanks Dave. I clicked on the XpandIT link you provide above but I don't see any information related to Xray.

What is the relationship between Atlassian's Xray and XpandIT?

Thanks!

Karyn

Dave Liao Community Leader Mar 11, 2021

@Karyn Knoll - Xpand IT actually makes the Jira Xray add-on.

While you can ask questions about Xray on Atlassian Community (I know I do!), if you are (or your company is) paying for Xray, you're entitled to a certain amount of technical support. I find their team to naturally be knowledgeable about their product and its nuances.

Dave Liao Community Leader Mar 11, 2021

p.s. try this link to the Xray product page, then click the "Support" link in the footer: https://www.getxray.app/

Thanks Dave. I did not know XpandIT was the company that makes Xray. That's great info!

I believe my organization (I support the Department of Veterans Affairs) does have some sort of support agreement with Atlassian. I did put in a Help Ticket (right before making this Post) asking them to either address the issues I raise here or put me in touch with Atlassian support, but the Team that provides VA Jira/ Xray support is new and has a reputation for being overwhelmed and slow to respond. I'm still waiting to hear from them and attempting other avenues, such as this Atlassian Community Site, in the meantime.

I will definitely try the XpandIT route you suggest.

I am wondering whether the inconsistencies I've observed in Xray are intentional or if what I'm seeing is a Bug.

It seems to me that the areas I'm seeing differences in were developed by two different developers who used different logic in arriving at their results. 
- One decided that Tests linked to a Requirement that did not match the Filter criteria in the Panel, Report, or Gadget displaying the Requirement Test Status (e.g., Tests included in a different Test Plan than that set in the Filter) should be ignored and only Tests (and their Test Executions) that matched the Filter criteria should be included in the calculation of the Requirement Test Status - so that if at least one Test and its Test Execution meets the Filter criteria and PASSED, the Requirement Test Status is "OK".
- The other developer appears to have decided that all Tests linked to a Requirement should be counted and any Test that was not executed under the parameters set in the Filter should be counted as "NOTRUN".

The results I'm seeing all seem to match one of these conditions or the other (i.e., only count Tests/ Test Executions that match the Filter parameters or count all Tests and consider those that have not been run under the specified criteria to be "NOTRUN").

Where this really becomes a problem is that in so many places, Jira/ Xray gives both results. E.g., when you run a Traceability Report, all of the Requirements listed in the Traceability Report show a Requirement Test Status of "OK" but the Summary Bar at the top of the Traceability Report shows some are "OK" and others are "NOTRUN" or "NOK". When I dig into the "NOTRUN" Requirements, they all have Tests linked to them that are not in the specified Test Plan or that were not executed under the specified Filter parameters.

I'm surprised these inconsistencies don't bother more people. (Others in my organization who see these results can't explain them either but consider the problem to be too convoluted to pursue.) 

Thank you very much for reading my long Post and for letting me know about XpandIT. I'll try their "Talk with us" button in the next few days!

Dave Liao Community Leader Mar 11, 2021

@Karyn Knoll - since you might have an actual issue you're trying to address, I definitely recommend hitting the Support link on https://www.getxray.app/ 

I'd contact your Jira admins to ensure your Xray add-on is on the latest version possible (depending on your organization's setup, detailed testing might be needed before rolling out updates to your add-ons).

Good luck! 💪 Looking forward to hearing what you find out (if you're able to share).

Thanks Dave.

I have actually started working on submitting the content of this Post as three Issues to make the problem description more bite-sized.

(1. Requirement Test Status in Summary Bar at the top of a Requirement's Test Coverage Panel is inconsistent with the Test Results for the Tests listed in the Panel.

2. Requirement Test Status shown in the Overall Requirements Coverage Gadget does not match the results obtained when you click an area of the Gadget to get the list of Requirements associated with the clicked area of the Gadget.

3. Number of Requirements with a given Requirement Test Status in the Summary Bar at the top of a Traceability Report does not match the number of Requirements with that Status listed in the Traceability Report itself.)

I'll try the https://www.getxray.app/ link you provide. I'd seen that link before but thought it could only be used by those who had a support contract (i.e., the VA Jira/ Xray support team that I'm waiting to hear from). But now I see I can create an account.

So you have given me two new avenues to pursue.

Thank you for all your help and I will share what I find out!


@Dave Liao 

Adding what I found regarding Xray Requirement Test Status calculations for the benefit of others who may have similar questions.

I took Dave Liao's advice and submitted a version of the above as an Issue via the https://www.getxray.app/ Support site (SUPPORT-34980; URL: https://jira.xpand-it.com/servicedesk/customer/portal/2/SUPPORT-34980).

I met with XpandIT, who were very knowledgeable and generous with their support, and found out that what I suspected was basically what was happening (all of the below refer to a Requirement Test Status calculation with a Scope set to "Test Plan"):
- In some locations, Xray considers Tests linked to a Requirement that are not in the specified Test Plan to be NOT APPLICABLE (N/A) to its calculation of the Requirement Test Status. In these cases, Tests not in the specified Test Plan are treated as if they did not exist when calculating the Requirement Test Status.
- In other locations, Xray considers Tests linked to a Requirement that are not in the specified Test Plan to be TODO Tests (because they have not yet been run against the specified Test Plan). In these cases, Xray includes these as TODO Tests in its calculations, thereby producing a Requirement Test Status of NOTRUN.
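The two confirmed behaviours can be sketched as follows. This is our reconstruction for illustration (the `combine` precedence and status names are assumptions, not Xray source code); the only difference between the two functions is whether out-of-plan Tests are dropped or counted as TODO:

```python
def combine(statuses):
    """Assumed precedence for folding per-Test statuses into one
    Requirement Test Status: FAIL > TODO > PASS."""
    if not statuses:
        return "UNCOVERED"
    if "FAIL" in statuses:
        return "NOK"
    if "TODO" in statuses:
        return "NOTRUN"
    return "OK"


def status_dropping_na(in_plan, out_of_plan):
    """First behaviour: Tests outside the Test Plan are N/A and are
    simply excluded from the calculation."""
    return combine(in_plan)


def status_counting_todo(in_plan, out_of_plan):
    """Second behaviour (the XRAY-5593 bug): Tests outside the Test
    Plan are counted as TODO, dragging the Requirement to NOTRUN."""
    return combine(in_plan + ["TODO"] * len(out_of_plan))
```

Applied to the Requirement Test Status 7 example above (four passing Tests in the Plan, three linked Tests outside it), the first function yields "OK" while the second yields "NOTRUN" - exactly the discrepancy observed.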

XpandIT recognizes the second Requirement Test Status calculation (treating Tests not in the Test Plan as TODO Tests) as a Bug, defined in Story "XRAY-5593: Requirement Status is considering N/A statuses as TODO, affecting the real status of the Requirement in a Test Plan scope"; URL: https://jira.xpand-it.com/browse/XRAY-5593.

This Bug has been fixed in Xray version v4.2.6, though Xray recommends:

"update to the latest release, v4.2.11; when updating, recalculate data in the Xray configuration section → Custom Fields and do a full instance reindex at the end of the operation. As these operations can take some time and use a considerable amount of server resources, we suggest scheduling the upgrade of Xray and the recalculation of the custom fields outside of business hours."

Regarding the answers to two other questions included in the above post:

1. If a Test linked to a Requirement is not in the specified Test Plan, but the Test is included in a Test Execution that is associated with the specified Test Plan, does the Test get included in the Requirement Test Status calculation? Answer: No.

2. Which Requirement-to-Test Link-types does Xray include in its Requirement Test Status? Answer: By default, Tests linked to Requirements via "Tested by" or "Created by" (used with Bug Requirements) links are included in Requirement Test Status calculations. These Link-types can be changed by an Administrator.

My thanks to Dave Liao for recommending submitting the issue described in this post as a Xray Support Issue and to XpandIT for being so generous with their time and willingness to thoroughly discuss this issue.

Dave Liao Community Leader Mar 18, 2021

@Karyn Knoll - yay! I'm glad that you're getting this sorted.

I recently had to deal with a status inconsistency and an Xray field re-calculation resolved it. Knowing to do that following an Xray upgrade is a good tip!

Thanks Dave. After talking to XpandIT (today), I feel pretty good about understanding what I'm seeing at all the different locations in the tool.

I'm looking forward to my organization upgrading to the latest version (though we're told Xray Version 5.0 is soon to come) - not sure when they'll get to it but, in the meantime at least I understand what's going on.

Thanks for your support!

