According to the current docs, Maven test results are picked up automatically and test reports are generated. I can see in the teardown section of my build pipeline that it is picking up the results (output below), but I can't see the reports anywhere. Can someone please advise where to view the test reports?
Finished scanning for test reports.
Found 22 test report files.
All test reports aggregated into 44 test cases
Uploading test results
Finished uploading test results.
Hey guys,
here is the ticket Bitbucket team created for this issue: https://jira.atlassian.com/browse/BCLOUD-15716
The current issue status is "Gathering Interest", so please go and vote for it if you are still interested.
Thanks!
@Jason Sheedy I have found that you can only view the results of failed tests. When you have a pipeline with failing tests, you will then be able to select a dropdown that appears where the Logs header is and select Test Results. Alternatively, you can click on the x number of failures message on the unit testing step and that will take you to the test results as well.
Hope this helps!
You must be a registered user to add a comment. If you've already registered, sign in. Otherwise, register and sign in.
Thanks Andrew. It seems like a bit of an oversight to not display passing tests. I can't think of any good reason why they wouldn't include it. Maybe that's only available in the Enterprise version. :/
I would also like to see reports where all tests passed.
We now display the successful tests too, but don't generate a full report. Is there anything specific you'd want to see in a larger report when 100% of tests pass?
Hi Philip. Could you tell me where to see passing tests? I see Maven test results output in the logs, but nothing appears in the Bitbucket user interface.
thanks
Hi @Jason Sheedy,
Were you previously seeing failed test results as well? Or have none shown?
If you have everything already set up, they should be showing on the step like so:
If you don't have it set up, you should have debugging information about this in the 'Build Teardown' section of your pipeline. Note there was a bug that prevented them from displaying, which we have now resolved, so you may only see them on new pipelines.
Which directory are your test outputs going to at the moment? Right now we're only reading from specific test report directory names listed here: https://confluence.atlassian.com/bitbucket/test-reporting-in-pipelines-939708543.html
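For reference, a minimal Maven step along these lines should produce reports the scanner picks up, since Surefire writes its JUnit XML to target/surefire-reports by default (a sketch based on my reading of that docs page, not an official example):

```yaml
# bitbucket-pipelines.yml (sketch)
pipelines:
  default:
    - step:
        script:
          # Surefire writes JUnit XML to target/surefire-reports by
          # default, one of the directory names the teardown scan checks
          - mvn test
```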
Thanks,
Phil
@Jason Sheedy I am a bit late to this conversation, but I noted that in one of your earlier responses you said,
It seems like a bit of an oversight to not display passing tests. I can't think of any good reason why they wouldn't include it.
I would say that often the most important signal for a fast-paced agile team is whether there are failures to fix or analyse. If the tests are passing, the result details won't change what happens next. The details of the tests are mainly interesting when deciding what to test, and that happens while planning the tests.
That is my humble opinion, and in my experience, it is most useful to those who did not design, implement and execute the tests.
A couple of years later, and I'm still hitting the issue of missing reports.
There's a good reason to display results: the JUnit format contains lots of relevant details, such as execution times. If my tests are passing and still exercising the testable code, but times have increased 3x, I'd like to know.
Also, some platforms include an informal coverage report alongside the test report (e.g. in Go, walking the available local modules to count how many tests there are). It's a quick and easy way to tell whether tests were added as expected without going through the whole merge request.
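For example, the per-test timings are plain attributes in the JUnit XML, so even a quick grep across two runs' reports would surface a slowdown (a rough sketch; report.xml is a hypothetical file name):

```shell
#!/bin/sh
# JUnit XML carries a time="" attribute per testcase; pull them out to
# compare between runs. (Sketch: report.xml is a made-up example file.)
cat > report.xml <<'EOF'
<testsuite tests="2">
  <testcase classname="FooTest" name="fast" time="0.10"/>
  <testcase classname="FooTest" name="slow" time="3.20"/>
</testsuite>
EOF
# Prints one 'name="..." time="..."' line per test case
grep -o 'name="[^"]*" time="[^"]*"' report.xml
```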
Hi all!
I got these lines in my pipeline configuration file:
script:
  ...
  - docker exec -i app ./vendor/bin/phpunit --log-junit ./test-reports/junit.xml
artifacts:
  - ./test-reports/**/*.xml
I'm forcing a test failure, expecting a report, but I get this message in the teardown step:
Searching for files matching artifact pattern ./test-reports/**/*.xml
...
Finished scanning for test reports. Found 0 test report files.
Where am I going wrong?
Thks...
Hello. I am facing the same behavior. Did you solve this?
Hi guys, returning to this old thread.
I am generating an xUnit-formatted .xml on every pull request.
I can see the "test" tab with the result,
but I can't see it in the "reports" section of the pull request.
Does anyone know why? Am I forgetting something?
Build teardown
Hey Manuel, did you ever find an answer to this?
It seems he is using the below, and that found the report files:
artifacts:
  - test-reports/*.xml
It would be helpful to show the reports of passing tests as well.
Hi guys, I have this step in GitLab:
gradle-test:
  stage: test
  script:
    - ./gradlew test -PnodeInstall --no-daemon
  artifacts:
    reports:
      junit: build/test-results/test/TEST-*.xml
    paths:
      - build/test-results/
      - build/jacoco/
    expire_in: 1 day
How do I transform it into a Bitbucket pipeline?
Hi Eduardo,
Bitbucket Pipelines automatically scans for test reports after each step of a pipeline.
So if you have test reports output in your build, then it should be parsed and shown in the UI of the build result.
We have some details on how test reporting works in Pipelines here: https://confluence.atlassian.com/bitbucket/test-reporting-in-pipelines-939708543.html
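A rough translation of the GitLab snippet above might look like this (a sketch; Pipelines has no junit report key, it just scans known report directories such as build/test-results automatically, and artifacts here are plain glob patterns with a default expiry rather than an expire_in setting):

```yaml
# bitbucket-pipelines.yml (sketch)
pipelines:
  default:
    - step:
        name: gradle-test
        script:
          - ./gradlew test -PnodeInstall --no-daemon
        artifacts:
          # kept for download; the JUnit XML under build/test-results is
          # picked up by the teardown scan without any extra configuration
          - build/test-results/**
          - build/jacoco/**
```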
Let me know how you go with that. :)
Thanks,
Phil
How can Bitbucket show the red banner saying "failed" if the result reports errors?
In my case, the build can still be deployed even though some tests have failed.
@Jeannie wang the state of your step (in this case successful) is based on the exit code of your script. In this case, the command you're using to run your tests is returning a successful exit code, even though you have a failing test.
Bitbucket Pipelines only looks for the presence of failed tests in the xUnit output of your tests.
You'll need to investigate why your test command is reporting itself as successful. Perhaps you have a flag you need to enable/disable to avoid ignoring certain test results?
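For illustration, here's one common way a failing test run can end up with a successful exit code (a sketch, not necessarily what's happening in your setup): piping the runner into another command, since a shell pipeline's exit status is that of its last command.

```shell
#!/bin/sh
# A step fails only when a script command exits non-zero. Piping a failing
# command (simulated here with 'exit 1') into one that succeeds masks the
# failure: the pipeline's exit status is taken from its last command.
sh -c 'exit 1' | cat
echo "exit code the step sees: $?"    # prints 0 - the failure is masked
```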
Hi guys, https://confluence.atlassian.com/bitbucket/test-reporting-in-pipelines-939708543.html says Pipelines only supports XML xUnit-compatible reports. What if I have an ExtentReport that spits out a TestAutomationReport.html at the end of every test run (and/or attaches a screenshot when a BDD step fails)? How do I integrate this into Bitbucket Pipelines and actually get a Test Report tab that business users can jump into to see my colorful dashboard? :)
Hey Jackie,
I believe the reports should be strictly in XML format.
I don't know if I'm missing something here, but PHPUnit can fail on deprecation notices (if not configured otherwise) even though the tests pass. As a result, you get a failed pipeline with successful tests. I don't know if this is a use case you haven't caught, but it may be worth mentioning.
Hey guys,
After watching your webinar last week, I started using Pipelines on our PHP API.
I have read this thread carefully but still can't see a way to access the error reports created, see:
I know there is a way to upload this to S3, but I can't find the documentation anymore.
Wouldn't the easiest approach be to attach this report to the email you already send on failure?
Hope you can help me and others here.
I already checked: https://confluence.atlassian.com/bitbucket/test-reporting-in-pipelines-939708543
Cheers from Fishburners Sydney :)
Thks for the quick answer and sorry for misplacing my original reply. I misunderstood the interface ^^
Anyway, failing tests failing the pipeline works as it's supposed to.
In the case of full success, I would like to see a green bar in the log output and/or in the interface on the left (equivalent to the big red one on failure).
No problem. I'll pass on your feedback to the rest of the team. :)
Thks !
Also, I don't have the test details on the left, close to the link to see the build configuration as per your screenshots.
Maybe I could have these in there too when they succeed ?
Anyway, thks for the follow up
Thanks @Philip Hodder. Looks like my test reports are being captured in the build teardown, but I don't see anything in the UI to indicate failed or passing tests.
Found 22 test report files.
All test reports aggregated into 44 test cases
Uploading test results
Finished uploading test results.
When you say "should be showing on the step", is that displayed in the log output of the individual build or the main pipelines screen that shows all builds?
Correction. I do see failing tests, but it's not displayed in the same way as the screenshot you posted.
Sorry, I just showed a subset of the page. Here's the whole result:
The same view as you're looking at.
Thanks,
Phil
Thanks Phil. I'm not seeing the same thing. Could there be some difference because it's an academic team account? (See below)
Using an academic account is unlikely to be the issue here. It looks like you just have your log view expanded. Do you have an icon that looks like this on the top right of your screen?
Larger screenshot here. Look at the top right (next to 'Download raw'):
If so, when you click it you should see what I'm seeing.
If not, can you let me know what Browser + Version + OS you're using?
Thanks,
Phil
Oh wow. I feel kind of stupid. I see it now, but I've got to say that it's less than intuitive. Maybe a job for the UX team??
Thanks for your help.
No worries. I'll pass on the feedback. :)
Is there any permission that needs to be set up before you can view test reports? Based on the log, it has already created the report, but I'm not sure why I wasn't able to access it. Maybe it's just a permission issue, or I'm doing it the wrong way.
Hello,
I cannot see the test reports either. I'm on Windows 10, with both Chrome & Firefox (latest versions).
I tried with 1 or 2 steps, but there is still no dropdown next to "Logs" to switch to test reports. The tests should be visible since half of them fail. The screen is almost the same as in the previous post (except it is not the same project).
How can I view the test reports?
Hi,
What do you see in the logs for the Build Teardown command? Are any tests being discovered?
Which directory are the tests being output to at the moment? We only check for reports in certain directories. Have a look at the documentation here to double check: https://confluence.atlassian.com/bitbucket/test-reporting-in-pipelines-939708543.html
Thanks,
Phil
Hi,
I output the tests to the "test-results" directory, and they are found in teardown, but not shown even though half of them have failed... See the image below:
Thanks,
TL
Hi,
The Test Result view will be shown by default if you have failed tests + failed pipeline.
However, it looks like a bug that you can't navigate to the tests even though you have failures (but a green build). I'll open an internal ticket to fix this.
Thanks,
Phil
Hi @Philip Hodder
I can now see Test Result view but it doesn't show anything on my end
Looks like the XML isn't being parsed correctly. What build tool are you using? Are you able to show me the XML that is being output?
@Philip Hodder
I'm using Codeception for automation testing. Here's an example of the XML result file:
<?xml version="1.0" encoding="UTF-8"?>
<testsuites>
<testsuite name="acceptance (chrome-qa)" tests="13" assertions="83" failures="2" errors="0" time="203.458746">
<testcase file="/Users/lovie/Documents/Projects/web/tests/tests/acceptance/Sanity/004-WinePageCest.php" name="loadLoginPage" class="WinePageCest" feature="load login page" assertions="0" time="9.928986"/>
<testcase file="/Users/lovie/Documents/Projects/web/tests/tests/acceptance/Sanity/004-WinePageCest.php" name="loginUser" class="WinePageCest" feature="login user" assertions="0" time="7.722183"/>
<testcase file="/Users/lovie/Documents/Projects/web/tests/tests/acceptance/Sanity/004-WinePageCest.php" name="redirectToMyCollectionsWines" class="WinePageCest" feature="redirect to my collections wines" assertions="0" time="7.060703"/>
<testcase file="/Users/lovie/Documents/Projects/web/tests/tests/acceptance/Sanity/004-WinePageCest.php" name="verifyWinesPage" class="WinePageCest" feature="verify wines page" assertions="18" time="11.594282"/>
<testcase file="/Users/lovie/Documents/Projects/web/tests/tests/acceptance/Sanity/004-WinePageCest.php" name="verifyAddWineModal" class="WinePageCest" feature="verify add wine modal" assertions="12" time="19.244257"/>
<testcase file="/Users/lovie/Documents/Projects/web/tests/tests/acceptance/Sanity/004-WinePageCest.php" name="verifyCellar" class="WinePageCest" feature="verify cellar" assertions="1" time="8.617440"/>
<testcase file="/Users/lovie/Documents/Projects/web/tests/tests/acceptance/Sanity/004-WinePageCest.php" name="verifyLocation" class="WinePageCest" feature="verify location" assertions="1" time="8.631113"/>
<testcase file="/Users/lovie/Documents/Projects/web/tests/tests/acceptance/Sanity/004-WinePageCest.php" name="verifyBin" class="WinePageCest" feature="verify bin" assertions="1" time="8.732308"/>
<testcase file="/Users/lovie/Documents/Projects/web/tests/tests/acceptance/Sanity/004-WinePageCest.php" name="verifyBrowseWinesDetailsCatalogs" class="WinePageCest" feature="verify browse wines details catalogs" assertions="24" time="35.186749">
<failure type="Codeception\Exception\ConditionalAssertionFailed">WinePageCest: Verify browse wines details catalogs
Failed asserting that on page /wines/browse/1789818
--> Wines
Add Asset
WINES SUMMARYWINES COLLECTION131BROWSE WINESDRINK SOONLOCATIONS
2013 "Carlo V" il Rosso dell' Imperatore Merlot and Cabernet Venezie IGT
RED
2016 ”Podere Casa Guidi” di Giuseppe Leonelli Barbera Frizzante Emilia IGT
RED - SPARKLING
N.V. ”Podere Casa Guidi” di Giuseppe Leonell
[Content too long to display. See complete response in '/Users/lovie/Documents/Projects/web/tests/tests/_output/raw/' directory]
--> contains "vintage1".
/Users/lovie/Documents/Projects/web/tests/tests/_support/_generated/AcceptanceTesterActions.php:858
/Users/lovie/Documents/Projects/web/tests/tests/_support/Page/WinePage.php:230
/Users/lovie/Documents/Projects/web/tests/tests/acceptance/Sanity/004-WinePageCest.php:89
</failure>
</testcase>
<testcase file="/Users/lovie/Documents/Projects/web/tests/tests/acceptance/Sanity/004-WinePageCest.php" name="addWine" class="WinePageCest" feature="add wine" assertions="1" time="41.462510"/>
<testcase file="/Users/lovie/Documents/Projects/web/tests/tests/acceptance/Sanity/004-WinePageCest.php" name="browseWinesFromInventory" class="WinePageCest" feature="browse wines from inventory" assertions="1" time="16.380560"/>
<testcase file="/Users/lovie/Documents/Projects/web/tests/tests/acceptance/Sanity/004-WinePageCest.php" name="verifyBrowseWinesDetailsInventory" class="WinePageCest" feature="verify browse wines details inventory" assertions="24" time="23.722190">
<failure type="Codeception\Exception\ConditionalAssertionFailed">WinePageCest: Verify browse wines details inventory
Failed asserting that on page /wines/collection/1772564
--> Wines
Add Asset
WINES SUMMARYWINES COLLECTION132BROWSE WINESDRINK SOONLOCATIONS
1876 ”Podere Casa Guidi” di Giuseppe Leonelli Vino Nouvo Emilia IGT
RED
1999 3 Bridges Sémillon Golden Mist
WHITE - SWEET/DESSERT
1999 Kevin Cabrera Grape Producer
RED
1999 Montecalvi Alta Valle della Greve IGT
RED
[Content too long to display. See complete response in '/Users/lovie/Documents/Projects/web/tests/tests/_output/raw/' directory]
--> contains "vintage1".
/Users/lovie/Documents/Projects/web/tests/tests/_support/_generated/AcceptanceTesterActions.php:858
/Users/lovie/Documents/Projects/web/tests/tests/_support/Page/WinePage.php:230
/Users/lovie/Documents/Projects/web/tests/tests/acceptance/Sanity/004-WinePageCest.php:116
</failure>
</testcase>
<testcase file="/Users/lovie/Documents/Projects/web/tests/tests/acceptance/Sanity/004-WinePageCest.php" name="logoutUser" class="WinePageCest" feature="logout user" assertions="0" time="5.175464"/>
</testsuite>
</testsuites>
@Philip Hodder I have the exact same problem as @[deleted]. I can only see the "m/n test failures" message, but I do not see the individual failed cases.
The reports are generated by eslint. Neither eslint -f junit nor eslint -f tap | tap-xunit produces files that show individual failure reports on the test results page.
eslint -f junit example
<?xml version="1.0" encoding="utf-8"?>
<testsuites>
<testsuite package="org.eslint" time="0" tests="2" errors="2" name="/home/daniel/Documents/flying_sheep/code/sheepcore-boilerplate/src/js/index.js">
<testcase time="0" name="org.eslint.no-mixed-operators"><failure message="Unexpected mix of '&&' and '||'."><![CDATA[line 20, col 16, Error - Unexpected mix of '&&' and '||'. (no-mixed-operators)]]></failure></testcase>
<testcase time="0" name="org.eslint.no-mixed-operators"><failure message="Unexpected mix of '&&' and '||'."><![CDATA[line 20, col 25, Error - Unexpected mix of '&&' and '||'. (no-mixed-operators)]]></failure></testcase>
</testsuite>
<testsuite package="org.eslint" time="0" tests="2" errors="2" name="/home/daniel/Documents/flying_sheep/code/sheepcore-boilerplate/src/js/state/ThreeExample.js">
<testcase time="0" name="org.eslint.padded-blocks"><failure message="Block must not be padded by blank lines."><![CDATA[line 15, col 1, Error - Block must not be padded by blank lines. (padded-blocks)]]></failure></testcase>
<testcase time="0" name="org.eslint.indent"><failure message="Expected indentation of 1 tab but found 0."><![CDATA[line 15, col 1, Error - Expected indentation of 1 tab but found 0. (indent)]]></failure></testcase>
</testsuite>
</testsuites>
eslint -f tap | tap-xunit example
<?xml version="1.0"?>
<testsuites>
<testsuite tests="2" failures="2" errors="0" name="Default">
<testcase name="#1 /home/daniel/Documents/flying_sheep/code/sheepcore-boilerplate/src/js/index.js">
<failure>
---
message: Unexpected mix of '&&' and '||'.
severity: error
data: {"line":20,"column":16,"ruleId":"no-mixed-operators"}
messages: [{"message":"Unexpected mix of '&&' and '||'.","severity":"error","data":{"line":20,"column":25,"ruleId":"no-mixed-operators"}}]
...
</failure>
</testcase>
<testcase name="#2 /home/daniel/Documents/flying_sheep/code/sheepcore-boilerplate/src/js/state/ThreeExample.js">
<failure>
---
message: Block must not be padded by blank lines.
severity: error
data: {"line":15,"column":1,"ruleId":"padded-blocks"}
messages: [{"message":"Expected indentation of 1 tab but found 0.","severity":"error","data":{"line":15,"column":1,"ruleId":"indent"}}]
...
</failure>
</testcase>
</testsuite>
</testsuites>
I am also seeing the test results being collected only in the build teardown.
If there are no errors, I don't see anything else in the web interface. Maybe I need some setup. It does say that the test results from JUnit (Android build) are found and uploaded.
Hi,
At the moment we're only showing full results of failed tests. Can you check that you see them when you have failed tests?
Is there anything in particular you'd be interested in seeing when all your tests have passed?
Thanks,
Phil
Nothing immediate. I'll look into that now.
Daniel,
Can you check to see if this is still an issue for you?
We've fixed some bugs in our parser that appear to make your XML parse properly now. I've checked the previously submitted XML files and they all parse successfully now.
Thanks,
Phil
Excellent!
Let me know how it goes.
Phil,
I tested both formats again: eslint's JUnit formatter, and eslint's TAP output piped through tap-xunit. Unfortunately, I still just get the x/x failures display, but no list of failures.
Is your updated parser rolled out to everybody, or might I just need to wait some more?
Thank you,
Daniel
Hi Daniel,
I've opened an internal bug report, as I get the same issue when I use your XML.
Thanks,
Phil
Hi.
I have an issue where Maven/Bitbucket are reporting three times the number of tests I actually have. This is because the test report scanner is getting the information from three different directories. How can I configure the pipeline so that only one report is read by the test report scanner?
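One thing worth trying (a sketch; it assumes the extra reports are stale copies left in the working directory from earlier runs) is cleaning the report output before the tests run, so the teardown scan only finds freshly generated files:

```yaml
# bitbucket-pipelines.yml step script (sketch): 'mvn clean' removes old
# target/ output first, so the teardown scanner only aggregates the
# reports generated by this run
script:
  - mvn clean test
```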