
Where do I view pipeline test reports?

According to the current docs, Maven test results are picked up automatically and test reports are generated. I can see in the teardown section of my build pipeline that it is picking up the results (output below), but I can't see the reports anywhere. Can someone please advise where to view test reports?


Finished scanning for test reports.

Found 22 test report files.

All test reports aggregated into 44 test cases

Uploading test results

Finished uploading test results.

11 answers

@Jason Sheedy I have found that you can only view the results of failed tests. When you have a pipeline with failing tests, you will then be able to select a dropdown that appears where the Logs header is and select Test Results. Alternatively, you can click on the x number of failures message on the unit testing step and that will take you to the test results as well.

Hope this helps!

Thanks Andrew. It seems like a bit of an oversight not to display passing tests. I can't think of any good reason why they wouldn't include it. Maybe that's only available in the Enterprise version. :/


I would also like to see reports where all tests passed.


We now display the successful tests too, but don't generate a full report. Is there anything specific you'd want to see in a larger report when 100% of tests pass?

Hi Philip. Could you tell me where to see passing tests? I see Maven test results output in the logs, but nothing appears in the Bitbucket user interface.


Hi @Jason Sheedy,

Were you previously seeing failed test results as well? Or have none shown?

If you have everything already set up, they should be showing on the step like so:

Screen Shot 2018-01-15 at 5.36.39 pm.png

If you don't have it set up, you should have debugging information about this in the 'Build Teardown' section of your pipeline. Note there was a bug that prevented them from displaying, which we have now resolved, so you may only see them on new pipelines.

Screen Shot 2018-01-15 at 5.40.10 pm.png

Which directory are your test outputs going to at the moment? Right now we're only reading from specific test report directory names listed here:
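For anyone setting this up from scratch, here is a minimal bitbucket-pipelines.yml sketch for a Maven project. The Docker image tag is an assumption, and you should check the test-reporting docs for the current list of scanned directory names:

```yaml
# Minimal sketch, assuming a Maven project. Surefire writes its XML to
# target/surefire-reports by default, which is one of the directory names
# the build-teardown scanner checks, so no extra configuration is needed.
image: maven:3.9-eclipse-temurin-17  # illustrative image tag

pipelines:
  default:
    - step:
        name: Build and test
        script:
          - mvn -B verify  # test XML lands in target/surefire-reports
```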




@Jason Sheedy I am a bit late to this conversation, but I noted in one of your earlier responses you said,

It seems a like a bit of an oversight to not display passing tests. I can't think of any good reason why they wouldn't include it.

I would say that sometimes the most important metric for a fast-paced agile team is knowing whether there are failures to fix or analyse. If the tests are passing, the result details will not change what happens next. The details of the tests are only interesting in terms of what we will test, and that occurs when planning the tests.

That is my humble opinion; in my experience, the details are most useful to those who did not design, implement and execute the tests.


A couple of years later, still hitting the issue of missing reports.

There's a good reason to display results: the JUnit format shows lots of relevant details, such as execution times. If my tests are passing and exercising the testable code, but run times have increased 3x, I'd like to know.

Also, some platforms include an informal coverage report alongside the test report (e.g. in Go, going through the available local modules and counting how many tests there are). It's a quick and easy way to tell whether tests were added as expected without going through the whole merge request.


Hi all!

I've got these lines in my pipeline configuration file:

- docker exec -i app ./vendor/bin/phpunit --log-junit ./test-reports/junit.xml
- ./test-reports/**/*.xml

I'm forcing an error in a test, expecting a report, but I get this message in the teardown step:

Searching for files matching artifact pattern ./test-reports/**/*.xml
Finished scanning for test reports. Found 0 test report files.

Where am I going wrong?
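One possible cause (a sketch, not a confirmed diagnosis): the `./test-reports/**/*.xml` line looks like an artifacts pattern, but if PHPUnit runs inside the `app` container, junit.xml is written to the container's filesystem rather than the build directory that the teardown scanner reads. Copying the file out might look like the following, where the in-container path `/app` is an assumption:

```yaml
# Sketch only: the in-container path (/app) is an assumption.
# The teardown scanner reads the pipeline's build directory, so a report
# written inside a docker container must be copied out before teardown.
pipelines:
  default:
    - step:
        services:
          - docker
        script:
          - mkdir -p test-reports
          - docker exec -i app ./vendor/bin/phpunit --log-junit /app/test-reports/junit.xml
          - docker cp app:/app/test-reports/junit.xml ./test-reports/junit.xml
```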


montells I'm New Here Jan 19, 2023

Hello. I am facing the same behavior. did you solve this?

Hi guys, returning to this old thread.

I am generating an xUnit-formatted .xml on every pull request.

I can see the "test" tab with the result.



but I can't see it in the "reports" section of the pull request.



Does anyone know why? Am I forgetting something?


Build teardown


Hey Manuel, did you ever find an answer to this?

Hey guys,
here is the ticket the Bitbucket team created for this issue:

The current issue status is "Gathering Interest", so please go and vote for it if you are still interested.


1 vote
Alex Tran Atlassian Team Jul 22, 2019

It would be helpful to show the reports of passing tests as well.

Hi guys, I have this step in GitLab:

test:
  stage: test
  script:
    - ./gradlew test -PnodeInstall --no-daemon
  artifacts:
    reports:
      junit: build/test-results/test/TEST-*.xml
    paths:
      - build/test-results/
      - build/jacoco/
    expire_in: 1 day

How do I transform it into a Bitbucket pipeline?

Hi Eduardo,

Bitbucket Pipelines automatically scans for test reports after each step of a pipeline. 

So if you have test reports output in your build, then it should be parsed and shown in the UI of the build result.

We got some details on how test reports works in Pipelines here:

Let me know how you go with that. :)
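A rough Bitbucket equivalent of the GitLab job above might look like the sketch below. There is no direct counterpart to GitLab's `junit:` key, because Pipelines discovers reports by scanning known directory names (build/test-results among them), and artifact expiry is fixed rather than configurable per step:

```yaml
# Sketch of a Bitbucket translation of the GitLab job above.
# Gradle writes JUnit XML to build/test-results/test by default,
# which the teardown scanner picks up automatically.
pipelines:
  default:
    - step:
        name: test
        script:
          - ./gradlew test -PnodeInstall --no-daemon
        artifacts:  # kept for download only; not required for test reporting
          - build/test-results/**
          - build/jacoco/**
```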



How can Bitbucket show the red "failed" banner if the result reports errors?

My case shows that the build can still be deployed even though some tests have failed.



@Jeannie wang the state of your step (in this case successful) is based on the exit code of your script. The command you're using to run your tests is returning a successful exit code, even though you have a failing test.

Bitbucket Pipelines only looks for the presence of failed tests in the xUnit output of your tests.

You'll need to investigate why your test command is reporting itself as successful. Perhaps you have a flag you need to enable/disable to avoid ignoring certain test results?
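If the runner can't be configured to exit non-zero, one workaround (a sketch; `run-tests.sh` is a hypothetical command, and the grep pattern is a rough heuristic) is to fail the step whenever the JUnit XML records failures:

```yaml
# Sketch: a step fails when any script command exits non-zero. If the test
# runner always exits 0, grep the report for a non-zero failures count.
# "! grep -q ..." exits 0 only when no match is found.
pipelines:
  default:
    - step:
        script:
          - ./run-tests.sh --junit-out test-results/results.xml  # hypothetical runner
          - "! grep -qE 'failures=\"[1-9]' test-results/results.xml"
```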


Hi guys. The linked page says pipelines only support XML xUnit-compatible reports. What if I have an ExtentReport that spits out a TestAutomationReport.html at the end of every test run (and/or attaches a screenshot when a BDD step fails)? How do I integrate this into Bitbucket Pipelines and actually get a Test Report tab, or whatever business users can jump into to see my colorful dashboard? :)

Hey Jackie,

I believe the reports should be strictly in XML format.
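Only xUnit-style XML feeds the Test results view, but an HTML report can still be exposed to users as a downloadable artifact. A sketch, where the script name and file paths are placeholders:

```yaml
# Sketch: the HTML dashboard won't render as a test report tab, but saving
# it as an artifact makes it downloadable from the step's Artifacts tab.
pipelines:
  default:
    - step:
        script:
          - ./run-ui-tests.sh  # placeholder for the command producing the report
        artifacts:
          - TestAutomationReport.html
          - screenshots/**
```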

I don't know if I'm missing something here, but PHPUnit can fail on deprecation notices (if not configured otherwise) while the tests pass. As a result you get a failed pipeline with successful tests. I don't know if this is a use case you haven't caught, but maybe it's worth mentioning.

Hey guys,

after watching your webinar last week I started using Pipelines on our PHP API.

I was reading this thread carefully but still can't see a way to access the error reports created. See: 2018-03-22_1648.png


I know there is a way to upload this to S3, but I can't find the documentation anymore.
Wouldn't the easiest way be to attach this report to the email you send on failure anyway?



Hope you can help me and others here.
I already checked:

Cheers from Fishburners Sydney :)

Thanks for the quick answer, and sorry for misplacing my original reply. I misunderstood the interface ^^

Anyway, failing tests failing the pipeline works like it's supposed to.

In the case of full success, I'd like to see a green bar in the log output and/or interface on the left (equivalent to the big red one on failure).

No problem. I'll pass on your feedback to the rest of the team. :)

Thks !

Also, I don't have the test details on the left, close to the link to see the build configuration, as per your screenshots.

Maybe I could have these in there too when they succeed?


Anyway, thks for the follow up

Thanks @Philip Hodder. Looks like my test reports are being captured in the build teardown, but I don't see anything in the UI to indicate failed or passing tests.

Found 22 test report files.
All test reports aggregated into 44 test cases
Uploading test results
Finished uploading test results.

When you say "should be showing on the step", is that displayed in the log output of the individual build or the main pipelines screen that shows all builds?

Correction. I do see failing tests, but it's not displayed in the same way as the screenshot you posted. 


@Jason Sheedy

Sorry, I just showed a subset of the page. Here's the whole result:

Screen Shot 2018-01-16 at 10.36.54 am.png

The same view as you're looking at.



Thanks Phil. I'm not seeing the same thing. Could there be some difference because it's an academic team account? (See below)

Screenshot from 2018-01-16 12-45-08.png

@Jason Sheedy

Using an academic account is unlikely to be the issue here. It looks like you just have your log view expanded. Do you have an icon that looks like this on the top right of your screen?

Screen Shot 2018-01-16 at 1.49.59 pm.png

Larger screenshot here. Look at the top right (next to 'Download raw'):

Screen Shot 2018-01-16 at 1.50.15 pm.png

If so, when you click it you should see what I'm seeing.

If not, can you let me know what Browser + Version + OS you're using?



Oh wow. I feel kind of stupid. I see it now, but I've got to say that it's less than intuitive. Maybe a job for the UX team??

Thanks for your help.

No worries. I'll pass on the feedback. :)

Is there any permission that needs to be set up before you can view test reports? Based on the log it already created the report, but I'm not sure why I wasn't able to access it. Maybe it's just a permission issue, or I'm doing it the wrong way.


Screen Shot 2018-02-07 at 2.49.36 PM.png

@Philip Hodder


I cannot see the test reports either. I'm on Windows 10, with both Chrome & Firefox (latest versions).

I tried with 1 or 2 steps, but still there is no dropdown next to "Logs" to switch to test reports. The tests should be visible since half of them fail. The screen is almost the same as in the previous post (except it is not the same project).

How can I view the test reports ?


What do you see in the logs for the Build Teardown command? Are any tests being discovered?

Which directory are the tests being output to at the moment? We only check for reports in certain directories. Have a look at the documentation here to double check:





I output the tests to the "test-results" directory, and they are found in teardown, but not shown even though half of them have failed. See the image below:



Ok, I think I understand: tests are shown only if the result code is not 0 (which is not the same as all tests succeeding...)


The Test Result view will be shown by default if you have failed tests + failed pipeline.

However, it looks like a bug that you can't navigate to the tests even though you have failures (but a green build). I'll open an internal ticket to fix this.



Hi @Philip Hodder

I can now see the Test Result view, but it doesn't show anything on my end.

Screen Shot 2018-02-14 at 2.25.39 PM.png

Looks like the XML isn't being parsed correctly. What build tool are you using? Are you able to show me the XML that is being output?

@Philip Hodder

I'm using codeception for automation test. Here's an example of the xml file result


<?xml version="1.0" encoding="UTF-8"?>
<testsuite name="acceptance (chrome-qa)" tests="13" assertions="83" failures="2" errors="0" time="203.458746">
<testcase file="/Users/lovie/Documents/Projects/web/tests/tests/acceptance/Sanity/004-WinePageCest.php" name="loadLoginPage" class="WinePageCest" feature="load login page" assertions="0" time="9.928986"/>
<testcase file="/Users/lovie/Documents/Projects/web/tests/tests/acceptance/Sanity/004-WinePageCest.php" name="loginUser" class="WinePageCest" feature="login user" assertions="0" time="7.722183"/>
<testcase file="/Users/lovie/Documents/Projects/web/tests/tests/acceptance/Sanity/004-WinePageCest.php" name="redirectToMyCollectionsWines" class="WinePageCest" feature="redirect to my collections wines" assertions="0" time="7.060703"/>
<testcase file="/Users/lovie/Documents/Projects/web/tests/tests/acceptance/Sanity/004-WinePageCest.php" name="verifyWinesPage" class="WinePageCest" feature="verify wines page" assertions="18" time="11.594282"/>
<testcase file="/Users/lovie/Documents/Projects/web/tests/tests/acceptance/Sanity/004-WinePageCest.php" name="verifyAddWineModal" class="WinePageCest" feature="verify add wine modal" assertions="12" time="19.244257"/>
<testcase file="/Users/lovie/Documents/Projects/web/tests/tests/acceptance/Sanity/004-WinePageCest.php" name="verifyCellar" class="WinePageCest" feature="verify cellar" assertions="1" time="8.617440"/>
<testcase file="/Users/lovie/Documents/Projects/web/tests/tests/acceptance/Sanity/004-WinePageCest.php" name="verifyLocation" class="WinePageCest" feature="verify location" assertions="1" time="8.631113"/>
<testcase file="/Users/lovie/Documents/Projects/web/tests/tests/acceptance/Sanity/004-WinePageCest.php" name="verifyBin" class="WinePageCest" feature="verify bin" assertions="1" time="8.732308"/>
<testcase file="/Users/lovie/Documents/Projects/web/tests/tests/acceptance/Sanity/004-WinePageCest.php" name="verifyBrowseWinesDetailsCatalogs" class="WinePageCest" feature="verify browse wines details catalogs" assertions="24" time="35.186749">
<failure type="Codeception\Exception\ConditionalAssertionFailed">WinePageCest: Verify browse wines details catalogs
Failed asserting that on page /wines/browse/1789818
--&gt; Wines
Add Asset
2013 "Carlo V" il Rosso dell' Imperatore Merlot and Cabernet Venezie IGT
2016 ”Podere Casa Guidi” di Giuseppe Leonelli Barbera Frizzante Emilia IGT
N.V. ”Podere Casa Guidi” di Giuseppe Leonell
[Content too long to display. See complete response in '/Users/lovie/Documents/Projects/web/tests/tests/_output/raw/' directory]
--&gt; contains "vintage1".</failure>
</testcase>

<testcase file="/Users/lovie/Documents/Projects/web/tests/tests/acceptance/Sanity/004-WinePageCest.php" name="addWine" class="WinePageCest" feature="add wine" assertions="1" time="41.462510"/>
<testcase file="/Users/lovie/Documents/Projects/web/tests/tests/acceptance/Sanity/004-WinePageCest.php" name="browseWinesFromInventory" class="WinePageCest" feature="browse wines from inventory" assertions="1" time="16.380560"/>
<testcase file="/Users/lovie/Documents/Projects/web/tests/tests/acceptance/Sanity/004-WinePageCest.php" name="verifyBrowseWinesDetailsInventory" class="WinePageCest" feature="verify browse wines details inventory" assertions="24" time="23.722190">
<failure type="Codeception\Exception\ConditionalAssertionFailed">WinePageCest: Verify browse wines details inventory
Failed asserting that on page /wines/collection/1772564
--&gt; Wines
Add Asset
1876 ”Podere Casa Guidi” di Giuseppe Leonelli Vino Nouvo Emilia IGT
1999 3 Bridges Sémillon Golden Mist
1999 Kevin Cabrera Grape Producer
1999 Montecalvi Alta Valle della Greve IGT

[Content too long to display. See complete response in '/Users/lovie/Documents/Projects/web/tests/tests/_output/raw/' directory]
--&gt; contains "vintage1".</failure>
</testcase>

<testcase file="/Users/lovie/Documents/Projects/web/tests/tests/acceptance/Sanity/004-WinePageCest.php" name="logoutUser" class="WinePageCest" feature="logout user" assertions="0" time="5.175464"/>
</testsuite>

@Philip Hodder I have the exact same problem as @Lovie Too. I can only see the "m/n test failures", but I do not see the individual failed cases.  

The reports are generated by eslint. Neither eslint -f junit nor eslint -f tap | tap-xunit produces files that show individual failure reports on the test results page.

eslint -f junit example

<?xml version="1.0" encoding="utf-8"?>
<testsuite package="org.eslint" time="0" tests="2" errors="2" name="/home/daniel/Documents/flying_sheep/code/sheepcore-boilerplate/src/js/index.js">
<testcase time="0" name=""><failure message="Unexpected mix of &apos;&amp;&amp;&apos; and &apos;||&apos;."><![CDATA[line 20, col 16, Error - Unexpected mix of &apos;&amp;&amp;&apos; and &apos;||&apos;. (no-mixed-operators)]]></failure></testcase>
<testcase time="0" name=""><failure message="Unexpected mix of &apos;&amp;&amp;&apos; and &apos;||&apos;."><![CDATA[line 20, col 25, Error - Unexpected mix of &apos;&amp;&amp;&apos; and &apos;||&apos;. (no-mixed-operators)]]></failure></testcase>
</testsuite>
<testsuite package="org.eslint" time="0" tests="2" errors="2" name="/home/daniel/Documents/flying_sheep/code/sheepcore-boilerplate/src/js/state/ThreeExample.js">
<testcase time="0" name="org.eslint.padded-blocks"><failure message="Block must not be padded by blank lines."><![CDATA[line 15, col 1, Error - Block must not be padded by blank lines. (padded-blocks)]]></failure></testcase>
<testcase time="0" name="org.eslint.indent"><failure message="Expected indentation of 1 tab but found 0."><![CDATA[line 15, col 1, Error - Expected indentation of 1 tab but found 0. (indent)]]></failure></testcase>
</testsuite>

eslint -f tap | tap-xunit example

<?xml version="1.0"?>
<testsuite tests="2" failures="2" errors="0" name="Default">
<testcase name="#1 /home/daniel/Documents/flying_sheep/code/sheepcore-boilerplate/src/js/index.js">
message: Unexpected mix of '&amp;&amp;' and '||'.
severity: error
data: {"line":20,"column":16,"ruleId":"no-mixed-operators"}
messages: [{"message":"Unexpected mix of '&amp;&amp;' and '||'.","severity":"error","data":{"line":20,"column":25,"ruleId":"no-mixed-operators"}}]
</testcase>
<testcase name="#2 /home/daniel/Documents/flying_sheep/code/sheepcore-boilerplate/src/js/state/ThreeExample.js">
message: Block must not be padded by blank lines.
severity: error
data: {"line":15,"column":1,"ruleId":"padded-blocks"}
messages: [{"message":"Expected indentation of 1 tab but found 0.","severity":"error","data":{"line":15,"column":1,"ruleId":"indent"}}]
</testcase>
</testsuite>

I am also seeing the test results being collected only in the build teardown.
If there are no errors I don't see anything else in the web interface. Maybe I need some setup. It does say that the test results from JUnit (Android build) are found and uploaded.


At the moment we're only showing full results of failed tests. Can you check that you see them when you have failed tests?

Is there anything in particular you'd be interested in seeing when all your tests have passed?



Phil, do you have any feedback on why John and I don't see any details for failed tests with the XML we provided?

Nothing immediate. I'll look into that now.


Can you check to see if this is still an issue for you?

We've fixed some bugs in our parser that appear to make your XML parse properly now. I've checked the previously submitted XML files and they all now parse successfully in our parser.



Thank you, Phil!

I'll check tomorrow at work and get back to you. 


Let me know how it goes.


I tested both formats again, eslint's JUnit formatting and eslint's tap output piped through tap-xunit. Unfortunately, I still just get the x/x failures display, but no list of failures.

Is your updated parser rolled out to everybody, or might I just need to wait some more?

Thank you,

Hi Daniel,

I've opened an internal bug report. As I get the same issue when I use your XML.



Thanks, Phil!



I have an issue where Maven/Bitbucket are reporting 3 times the number of tests that I actually have. This is because the test report scanner is picking up the information from three different directories. How can I configure the pipeline so only one report is read by the test report scanner?
