Bitbucket pipelines: how can I map a volume to docker to expose test results?


I'm attempting to expose test results for automated test reporting of a Node.js app built into a Docker image. Here's the yml:

  options:
    docker: true

  pipelines:
    default:
      - step:
          script:
            - export IMAGE_NAME=$DOCKER_HUB_USERNAME/currency-updater:$BITBUCKET_COMMIT
            - docker build -t $IMAGE_NAME .
            - docker login --username $DOCKER_HUB_USERNAME --password $DOCKER_HUB_PASSWORD
            - docker push $IMAGE_NAME
      - step:
          script:
            - export IMAGE_NAME=$DOCKER_HUB_USERNAME/currency-updater:$BITBUCKET_COMMIT
            - docker login --username $DOCKER_HUB_USERNAME --password $DOCKER_HUB_PASSWORD
            - docker run -v="$BITBUCKET_CLONE_DIR/test-results:/app/test-results" -w="./app" $IMAGE_NAME run-script test-junit

When step 2 executes, it fails with a permissions error:

docker run -v="$BITBUCKET_CLONE_DIR/test-results:/app/test-results" -w="./app" $IMAGE_NAME run-script test-junit
docker: Error response from daemon: authorization denied by plugin pipelines: Command not supported.
See 'docker run --help'.

even though the volume is located inside the clone directory, as the guidelines require. If I run the same commands locally (with the needed adaptations), they work fine.

I incorrectly posted a discussion when I meant to post a question. Could an admin please remove the discussion and mark this one as not spam?

4 answers

What if I need to share the Composer cache with a Docker container?
Pipelines can cache Composer files itself, but if I use Docker I have to create a volume mapping, so I can't use the normal Bitbucket Pipelines cache.

Is there no other way than using a custom cache and putting it in a subfolder, like this?

composercache: $BITBUCKET_CLONE_DIR/composer-cache
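For reference, a custom cache along those lines would be declared in bitbucket-pipelines.yml roughly as follows. This is a sketch: the cache name `composercache`, the `composer-cache` directory, and the `my-php-image` image name are all illustrative, not from this thread.

```yaml
definitions:
  caches:
    # hypothetical custom cache; the directory is relative to the clone dir
    composercache: composer-cache

pipelines:
  default:
    - step:
        caches:
          - composercache
        script:
          # mount the cached directory into the container so Composer reuses it
          # (image name is an assumption)
          - docker run -v "$BITBUCKET_CLONE_DIR/composer-cache:/root/.composer/cache" my-php-image composer install
```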
0 votes


Does the "test-results" directory already exist when you use the "docker run" command?

Docker run is unable to create directories in Bitbucket Pipelines as that requires some escalated privileges we cannot expose for security purposes. (Whereas locally it is able to do so without much hassle.)

To resolve this, you should be able to just do the following:

- mkdir $BITBUCKET_CLONE_DIR/test-results
- docker run -v="$BITBUCKET_CLONE_DIR/test-results:/app/test-results" -w="./app" $IMAGE_NAME run-script test-junit

Does that work for you?

@Philip Hodder

Hi Philip,


Thank you for your support.

Unfortunately, creating the directory does not solve the problem.

- mkdir -p $BITBUCKET_CLONE_DIR/test-results
- docker run --name mysql -v="$BITBUCKET_CLONE_DIR/test-results:/app/test-results" -d -p 3327:3306 mysql:5.7.18

Using this script I receive this error:

+ docker run --name mysql -v="$BITBUCKET_CLONE_DIR/test-results:/app/test-results" -d -p 3327:3306 mysql:5.7.18
docker: Error response from daemon: authorization denied by plugin pipelines: Command not supported.


Could you please advise how to solve it?


Thank you in advance!


Best Regards,


Hi @DmytroS,

I had a quick chat with my teammates. Right now you can *only* mount the $BITBUCKET_CLONE_DIR. None of the subdirectories.

So instead you will need to run something like this:

- docker run --name mysql -v="$BITBUCKET_CLONE_DIR:/app" -d -p 3327:3306 mysql:5.7.18

Let me know if that works.

I've opened an internal issue for allowing the subdirectories as well.
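Put into a full step, that workaround looks roughly like this (a sketch only; the image name and the /app paths follow the examples earlier in the thread):

```yaml
- step:
    script:
      # mount the whole clone directory instead of a subdirectory;
      # inside the container the results then land under /app/test-results
      - mkdir -p $BITBUCKET_CLONE_DIR/test-results
      - docker run -v "$BITBUCKET_CLONE_DIR:/app" $IMAGE_NAME run-script test-junit
```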



@Philip Hodder

Thank you for your answer!

Yes, if you mount only $BITBUCKET_CLONE_DIR it works.

Is it possible to follow the progress on "subdirectory volume" ticket?


Thank you.

Have a nice day!


Best Regards,


Hi @DmytroS,

Sure! I've opened a ticket you can follow:



Hi @DmytroS,

We've implemented subdirectory support now. It should work as you initially expected now. :)



Hi @Philip Hodder

Thanks a lot!

Now it works:)

Have a great day!

Best Regards,



I have the same issue:

docker: Run command: docker run -v /opt/atlassian/pipelines/agent/build/packer-tmp/packer-docker516303980:/packer-files -d -i -t alpine:3.6 /bin/sh
==> docker: Error running container: Docker exited with a non-zero exit status.
docker: Stderr: docker: Error response from daemon: authorization denied by plugin pipelines: Command not supported.

Can you verify that the "/packer-tmp/packer-docker516303980" directory exists before you execute the docker run command?

See my other answer below for details on this if the directory does not exist.

Thanks @Philip Hodder

Unfortunately, with Packer, I can only create the packer-tmp directory, the other part is generated automatically by the tool and can't be altered.

I'm afraid I know the answer, but do you plan to change these restrictions anytime soon?

Does the tool generate that directory inside the Docker container, or in your regular build container? If it's the build container, you should be good. If it's the Docker container, then not so good.

As far as I know there are no immediate plans to remove the current restrictions.

I'll raise this with the PMs, just so they know about the frequency of this being an issue.

The only real workaround to getting additional auth on your Docker daemon right now is to run your own EC2 instance (or equivalent virtual compute from another cloud provider) with a regular Docker daemon, and hook up Pipelines to that daemon with docker-machine.
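A sketch of that workaround, assuming docker-machine is installed and cloud credentials are configured. The machine name `pipelines-docker`, the `amazonec2` driver, and the image/mount in the last command are all illustrative:

```
# create a remote Docker host (driver and name are assumptions)
docker-machine create --driver amazonec2 pipelines-docker
# point the docker CLI in the build at the remote daemon
eval "$(docker-machine env pipelines-docker)"
# subsequent docker run commands now execute on the remote daemon,
# outside the Pipelines authorization plugin's restrictions
docker run -v /data:/data my-image
```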


We got around the problem by replacing the Packer tool with a regular Dockerfile, which let us specify the paths exactly as we wanted.
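A minimal sketch of that approach: instead of letting Packer bind-mount a generated temp directory at run time, bake the files into the image at build time. The `provision/` directory and `setup.sh` script are hypothetical placeholders:

```dockerfile
# Illustrative replacement for the Packer step: copy provisioning
# files into the image at a path we control; no run-time mount needed.
FROM alpine:3.6
COPY provision/ /packer-files/
RUN /packer-files/setup.sh
```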

Hi @Rafal Janicki,

Just to follow up. We had an issue where mounting subdirectories of $BITBUCKET_CLONE_DIR wasn't supported. That is fixed, so you should now be able to mount subdirectories of $BITBUCKET_CLONE_DIR.



This works half the time lol.

I tried using this feature today, and my pipeline works the first time and then fails the next. As ridiculous as that sounds, you'd have to see my pipeline logs to believe it. I can hardly believe it myself.

Firstly, why does Bitbucket Pipelines let us create named volumes if using those volumes in the pipeline just produces meaningless errors?

0 votes
Monique vdB Community Manager Dec 11, 2017

@oxyhouse this is marked not spam for you. Sorry about that.
