While using the Xray plug-in for Jira Test Management, our Product Team has observed that very different Requirement Test Statuses are often displayed in different locations in the tool. The differences we observe cannot be explained by differences in the settings (Test Plan, Fix Version/s, Test Environments, etc.) of the filters associated with the Requirement Test Statuses in those locations. We believe the differences are likely caused by differences in the logic that calculates the Status value in each location where the Requirement Test Status is displayed.
We cannot find detailed enough explanations in the Xray documentation to determine exactly how Xray calculates the Requirement Test Status in each location. We are therefore reaching out to the community for help in understanding the criteria and logic/algorithms by which each Requirement Test Status is calculated, and why we see different Requirement Test Statuses in different locations even when the filter settings for those locations are the same.
The Jira/ Xray tool locations providing Requirement Test Statuses that we have questions about are as follows:
Screenshots of each of the Requirement Test Status locations listed above and our questions regarding how these Statuses are calculated are listed below. (Many of the same questions are asked in each location but, since the Status values differ, we figure the answers to the questions are probably different in the different locations.)
Note: in each of the descriptions/ questions below, the term “Test Execution” is understood to mean a Test Execution Issue Type that contains the Test(s) under discussion.
Thank you very much for any help you can provide in deciphering how Jira/ Xray arrives at the Requirement Test Status values it displays in each of these locations.
REQUIREMENT TEST STATUS 1
Location: “Requirement Status” field in a Requirement (Story or Bug) Issue
Xray Documentation indicates that the Requirement Status combines the results of the latest Test Execution in each Test Environment for which there is a Test Execution containing the Test to arrive at its overall value.
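To make the documented rule concrete, here is a rough, hypothetical model (not Xray's actual code): for each Test Environment, take the result of the latest Test Execution containing the Test, then combine the per-environment results. The status names follow Xray's terminology; the data layout and function name are invented for illustration.

```python
def combined_requirement_status(executions):
    """executions: list of (environment, finished_at, result) tuples,
    where result is 'PASS', 'FAIL', or 'TODO'.
    Hypothetical sketch, not Xray's implementation."""
    latest = {}  # environment -> (finished_at, result)
    for env, finished_at, result in executions:
        if env not in latest or finished_at > latest[env][0]:
            latest[env] = (finished_at, result)
    results = {result for _, result in latest.values()}
    if 'FAIL' in results:
        return 'NOK'     # any failing environment fails the Requirement
    if 'TODO' in results or not results:
        return 'NOTRUN'  # some environment has no executed result yet
    return 'OK'          # latest execution passed in every environment

# The Test failed in an older Chrome run but passed in the latest one,
# and also passed in Firefox, so the combined status comes out OK.
runs = [('Chrome', 1, 'FAIL'), ('Chrome', 2, 'PASS'), ('Firefox', 1, 'PASS')]
combined_requirement_status(runs)
```

Under this reading, a stale failure is superseded by a newer pass in the same environment, but a failure in the *latest* run of any environment makes the whole Requirement NOK.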
Assumptions:
Question:
REQUIREMENT TEST STATUS 2
Location: Overall Requirement Coverage Gadget and Overall Requirement Coverage Report with Scope = Test Plan
Assumption:
Questions:
REQUIREMENT TEST STATUS 3
Location: Overall Requirement Coverage Gadget Filter (i.e., the list of Requirements displayed when an area on the Overall Requirements Coverage Gadget representing the Requirements that have the specified Requirement Test Status is clicked).
Note: Jira/ Xray shows that the Filter used to pull the listed Requirements is
filter = <Requirements-specifying Filter> AND issue in requirementsWithStatusByTestPlan(<Requirements Status clicked on the Gadget>, "<Test Plan ID>", "", "true","","","<Requirements-specifying Filter>")
Example: filter = 12345 AND issue in requirementsWithStatusByTestPlan(OK, "PROJ-1111", "", "true","","","12345")
Notice that, although the Overall Requirements Gadget (shown in Requirement Test Status 2) indicates there are ninety-eight (98) Requirements with Requirement Status = OK, only twenty-nine (29) Requirements appear in the list when the green OK area of the Gadget is clicked. The other sixty-nine (69) of the ninety-eight (98) Requirements indicated in the Gadget appear only when the "OK" Requirement Status in the Filter is replaced with "NOTRUN".
Questions:
REQUIREMENT TEST STATUS 4
Location: Xray Traceability Report
Questions:
1. The Requirement Test Status results for the Xray Traceability Report appear to be the same as the Requirement Test Status Results displayed in the Overall Requirements Coverage Gadget (Requirement Test Status 2 above). Do the Gadget and the Report use the same criteria and algorithm to calculate the results shown in each?
If not, we have the same questions about the calculation of Requirement Test Statuses for the Xray Traceability Report as we have for the Overall Requirements Coverage Gadget:
1.1 Which Tests linked to each Requirement does the Xray include in its calculation of the Requirement Test Status shown in the Xray Traceability Report - all linked Tests or only those Tests linked to the Requirement via a “Tested by” link?
1.2 Does Xray only look at the Test Plan specified in the Test Execution or does Xray also check that the Test itself is associated with the specified Test Plan? (I.e., how does the Xray treat a case in which a Test itself is not associated with the specified Test Plan but one or more of the Test’s Test Executions are associated with the specified Test Plan?)
REQUIREMENT TEST STATUS 5
Location: Quick Filter Results bar displayed above the Xray Traceability Report
Questions:
REQUIREMENT TEST STATUS 6
Location: Xray "requirementsWithStatusByTestPlan" Filter, described in the "Enhanced querying with JQL" section of the Xray Documentation
requirementsWithStatusByTestPlan Filter parameters shown in the Enhanced querying with JQL section are:
issue in requirementsWithStatusByTestPlan(<desired Requirement Status(es)>, "<Test Plan ID>", "", "true","","","<Requirements-specifying Filter>")
Note: the Xray requirementsWithStatusByTestPlan Filter appears to be the driver for the Requirement Test Status values displayed in both the Overall Requirements Coverage Gadget Filter (i.e., the list of Requirements displayed when one of the areas on the Overall Requirement Coverage Gadget is clicked, see Requirement Test Status 3 above) and the Traceability Report Quick Filters bar (see Requirement Test Status 5 above). Therefore, it seems that this is a key Xray Filter and understanding the criteria and algorithm this Filter uses to arrive at the results it displays is important to being able to fully use the analytical features Xray offers.
Questions:
REQUIREMENT TEST STATUS 7
Location: in the Summary bar at the top of the Test Coverage Panel in a Requirement (Story or Test) Issue
We have experienced cases in which the Requirement Test Status shown in the Summary bar at the top of the Test Coverage Panel in a Requirement Issue displays a Requirement Status that is inconsistent with the actual Test results that contribute to the Requirement Test Status.
The example above is one such situation:
Question:
Why is the Requirement Test Status shown in the bar at the top of the Test Coverage Panel above “NOTRUN” instead of “OK”?
Adding what I found regarding Xray Requirement Test Status calculations for the benefit of others who may have similar questions.
I took Dave Liao's advice and submitted a version of the above as an Issue via the https://www.getxray.app/ Support site (SUPPORT-34980; URL: https://jira.xpand-it.com/servicedesk/customer/portal/2/SUPPORT-34980).
I met with XpandIT, who were very knowledgeable and generous with their support, and found out that what I suspected was basically what was happening (all of the below refer to a Requirement Test Status calculation with a Scope set to "Test Plan"):
- In some locations, Xray considers Tests linked to a Requirement that are not in the specified Test Plan to be NOT APPLICABLE (N/A) to its calculation of the Requirement Test Status. In these cases, Tests not in the specified Test Plan are treated as if they did not exist when calculating the Requirement Test Status.
- In other locations, Xray considers Tests linked to a Requirement that are not in the specified Test Plan to be TODO Tests (because they have not yet been run against the specified Test Plan). In these cases, Xray includes these as TODO Tests in its calculations, thereby producing a Requirement Test Status of NOTRUN.
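The two behaviors above can be contrasted with a rough sketch (hypothetical code, not Xray's implementation; the status names follow Xray's terminology, everything else is invented). It shows how the same linked Tests can yield "OK" under the N/A logic and "NOTRUN" under the TODO logic:

```python
def requirement_status(test_results, treat_out_of_plan_as_todo):
    """test_results: list of (in_plan, result) tuples for the Tests linked
    to a Requirement; result is 'PASS', 'FAIL', or None if never run in
    the specified Test Plan. Hypothetical model, not Xray's code."""
    statuses = []
    for in_plan, result in test_results:
        if not in_plan:
            if treat_out_of_plan_as_todo:
                statuses.append('TODO')  # buggy behavior per XRAY-5593
            else:
                continue                 # N/A: ignore the Test entirely
        elif result is None:
            statuses.append('TODO')      # in the plan but not yet run
        else:
            statuses.append(result)
    if 'FAIL' in statuses:
        return 'NOK'
    if 'TODO' in statuses or not statuses:
        return 'NOTRUN'
    return 'OK'

# One linked Test passed inside the plan; another linked Test is outside it.
tests = [(True, 'PASS'), (False, None)]
requirement_status(tests, treat_out_of_plan_as_todo=False)  # 'OK'
requirement_status(tests, treat_out_of_plan_as_todo=True)   # 'NOTRUN'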
XpandIT recognizes the second Requirement Test Status calculation (treating Tests not in the Test Plan as TODO Tests) as a Bug, tracked as "XRAY-5593: Requirement Status is considering N/A statuses as TODO, affecting the real status of the Requirement in a Test Plan scope"; URL: https://jira.xpand-it.com/browse/XRAY-5593.
This Bug has been fixed in Xray version v4.2.6, though Xray recommends:
"update to the latest release, v4.2.11; when updating, recalculate data in the Xray configuration section → Custom Fields and do a full instance reindex at the end of the operation. As these operations can take some time and use a considerable amount of server resources, we suggest scheduling the upgrade of Xray and the recalculation of the custom fields outside of business hours."
Regarding the answer to two other questions included in the above post:
1. If a Test linked to a Requirement is not in the specified Test Plan, but the Test is included in a Test Execution that is associated with the specified Test Plan, does the Test get included in the Requirement Test Status calculation? Answer: No.
2. Which Requirement-to-Test Link-types does Xray include in its Requirement Test Status? Answer: By default, Tests linked to Requirements via "Tested by" or "Created by" (used with Bug Requirements) links are included in Requirement Test Status calculations. These Link-types can be changed by an Administrator.
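The default link-type rule in answer 2 can be modeled with a trivial sketch (the link names come from the answer above; the helper and data layout are invented for illustration):

```python
# Hypothetical sketch: only Tests linked to the Requirement via
# "Tested by" or "Created by" links count toward the Requirement Test
# Status by default (an Administrator can change the counted types).
DEFAULT_COUNTED_LINK_TYPES = {"tested by", "created by"}

def counted_tests(links, counted_types=DEFAULT_COUNTED_LINK_TYPES):
    """links: list of (link_type, test_key) pairs on the Requirement."""
    return [key for link_type, key in links
            if link_type.lower() in counted_types]

links = [("Tested by", "TEST-1"),
         ("Relates to", "TEST-2"),   # ignored by default
         ("Created by", "TEST-3")]
counted_tests(links)  # ['TEST-1', 'TEST-3']
```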
My thanks to Dave Liao for recommending submitting the issue described in this post as a Xray Support Issue and to XpandIT for being so generous with their time and willingness to thoroughly discuss this issue.
Thanks Dave. After talking to XpandIT (today), I feel pretty good about understanding what I'm seeing at all the different locations in the tool.
I'm looking forward to my organization upgrading to the latest version (though we're told Xray Version 5.0 is soon to come) - not sure when they'll get to it but, in the meantime at least I understand what's going on.
Thanks for your support!
Thanks Dave.
I have actually started working on submitting the content of this Post as three Issues to make the problem descriptions more bite-sized.
(1. Requirement Test Status in Summary Bar at the top of a Requirement's Test Coverage Panel is inconsistent with the Test Results for the Tests listed in the Panel.
2. Requirement Test Status shown in the Overall Requirements Coverage Gadget does not match the results obtained when you click an area of the Gadget to get the list of Requirements associated with the clicked area of the Gadget.
3. Number of Requirements with a given Requirement Test Status in the Summary Bar at the top of a Traceability Report does not match the number of Requirements with that Status listed in the Traceability Report itself.)
I'll try the https://www.getxray.app/ link you provide. I'd seen that link before but thought it could only be used by those who had a support contract (i.e., the VA Jira/ Xray support team that I'm waiting to hear from). But now I see I can create an account.
So you have given me two new avenues to pursue.
Thank you for all your help and I will share what I find out!
Hey there!
Not sure if anyone can help. When running our traceability report, we have noticed the results returned are only for the tests that are uncovered, but we have a whole range of requirements at different stages. Why is our report not giving the results we need? Any advice and help would be great!
@Karyn Knoll - yay! I'm glad that you're getting this sorted.
I recently had to deal with a status inconsistency and an Xray field re-calculation resolved it. Knowing to do that following an Xray upgrade is a good tip!
@Karyn Knoll - since you might have an actual issue you're trying to address, I definitely recommend hitting the Support link on https://www.getxray.app/
I'd contact your Jira admins to ensure your Xray add-on is on the latest version possible (depending on your organization's setup, detailed testing might be needed before rolling out updates to your add-ons).
Good luck! 💪 Looking forward to hearing what you find out (if you're able to share).
Thanks Dave. I did not know XpandIT was the company that makes Xray. That's great info!
I believe my organization (I support the Department of Veterans Affairs) does have some sort of support agreement with Atlassian. I did put in a Help Ticket (right before making this Post) asking them to either address the issues I raise here or put me in touch with Atlassian support, but the Team that provides VA Jira/Xray support is new and has a reputation for being overwhelmed and slow to respond. I'm still waiting to hear from them and attempting other avenues, such as this Atlassian Community Site, in the meantime.
I will definitely try the XpandIT route you suggest.
I am wondering whether the inconsistencies I've observed in Xray are intentional or if what I'm seeing is a Bug.
It seems to me that the areas I'm seeing differences in were developed by two different developers who used different logic in arriving at their results.
- One decided that Tests linked to a Requirement that do not match the Filter criteria of the Panel, Report, or Gadget displaying the Requirement Test Status (e.g., Tests included in a different Test Plan than the one set in the Filter) should be ignored: only Tests (and their Test Executions) that match the Filter criteria are included in the calculation of the Requirement Test Status. Under this logic, if at least one Test and its Test Execution meets the Filter criteria and PASSED, the Requirement Test Status is "OK".
- The other developer appears to have decided that all Tests linked to a Requirement should be counted and any Test that was not executed under the parameters set in the Filter should be counted as "NOTRUN".
The results I'm seeing all seem to match one of these conditions or the other (i.e., only count Tests/ Test Executions that match the Filter parameters or count all Tests and consider those that have not been run under the specified criteria to be "NOTRUN").
Where this really becomes a problem is that in so many places, Jira/ Xray gives both results. E.g., when you run a Traceability Report, all of the Requirements listed in the Traceability Report show a Requirement Test Status of "OK" but the Summary Bar at the top of the Traceability Report shows some are "OK" and others are "NOTRUN" or "NOK". When I dig into the "NOTRUN" Requirements, they all have Tests linked to them that are not in the specified Test Plan or that were not executed under the specified Filter parameters.
I'm surprised these inconsistencies don't bother more people. (Others in my organization who see these results can't explain them either but consider the problem to be too convoluted to pursue.)
Thank you very much for reading my long Post and for letting me know about XpandIT. I'll try their "Talk with us" button in the next few days!
p.s. try this link to the Xray product page, then click the "Support" link in the footer: https://www.getxray.app/
@Karyn Knoll - Xpand IT actually makes the Jira Xray add-on.
While you can ask questions about Xray on Atlassian Community (I know I do!), if you are (or your company is) paying for Xray, you're entitled to a certain amount of technical support. I find their team to naturally be knowledgeable about their product and its nuances.
Thanks Dave. I clicked on the XpandIT link you provide above but I don't see any information related to Xray.
What is the relationship between Atlassian's Xray and XpandIT?
Thanks!
Karyn
@Karyn Knoll - hi Karyn! Have you reached out to Xpand IT to get clarification on anything not clearly described in their docs?