Bitbucket pipelines: how can I map a volume to docker to expose test results?

I'm attempting to expose test results for automated test reporting of a Node.js app built into a Docker image. Here's the yml:

options:
  docker: true

pipelines:
  default:
    - step:
        script: 
          - export IMAGE_NAME=$DOCKER_HUB_USERNAME/currency-updater:$BITBUCKET_COMMIT
          - docker build -t $IMAGE_NAME .
          - docker login --username $DOCKER_HUB_USERNAME --password $DOCKER_HUB_PASSWORD
          - docker push $IMAGE_NAME
    - step:
        script:
          - export IMAGE_NAME=$DOCKER_HUB_USERNAME/currency-updater:$BITBUCKET_COMMIT
          - docker login --username $DOCKER_HUB_USERNAME --password $DOCKER_HUB_PASSWORD
          - docker run -v="$BITBUCKET_CLONE_DIR/test-results:/app/test-results" -w="./app" $IMAGE_NAME run-script test-junit


When step 2 executes, it fails with a permissions error:


docker run -v="$BITBUCKET_CLONE_DIR/test-results:/app/test-results" -w="./app" $IMAGE_NAME run-script test-junit
docker: Error response from daemon: authorization denied by plugin pipelines: Command not supported.
See 'docker run --help'.


even though the volume is located inside the clone directory, as the guidelines require. If I run the same commands locally (with the necessary adaptations), they run fine.

I incorrectly posted this as a discussion when I meant to post a question. Could an admin please remove the discussion and mark this one as not spam?

4 answers

Hi,
What if I need to share the Composer cache with a Docker container?
Pipelines can cache Composer files itself, but if I use Docker I have to create a volume mapping.
So I can't use the normal Bitbucket Pipelines cache (https://confluence.atlassian.com/bitbucket/caching-dependencies-895552876.html) like

~/.composer/cache:/root/.composer/cache

Is there no other way than using a custom cache and putting it in a subfolder, like this?

-v="$BITBUCKET_CLONE_DIR/composer-cache:/root/.composer/cache"

definitions:
  caches:
    composercache: $BITBUCKET_CLONE_DIR/composer-cache
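
A minimal sketch of how that could be wired together, reusing the composercache definition above and assuming a placeholder $IMAGE_NAME and a plain composer install as the container command:

options:
  docker: true

definitions:
  caches:
    # custom cache kept in a subfolder of the clone directory, so the same
    # path can be mounted into the container and cached by Pipelines
    composercache: $BITBUCKET_CLONE_DIR/composer-cache

pipelines:
  default:
    - step:
        caches:
          - composercache
        script:
          # create the host directory first; docker run can't create it in Pipelines
          - mkdir -p $BITBUCKET_CLONE_DIR/composer-cache
          - docker run -v "$BITBUCKET_CLONE_DIR/composer-cache:/root/.composer/cache" $IMAGE_NAME composer install

Note that a mount like this only works now that subdirectories of BITBUCKET_CLONE_DIR can be mounted (see Philip Hodder's follow-ups further down).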
0 votes
Monique vdB Community Manager Dec 11, 2017

@oxyhouse this is now marked as not spam for you. Sorry about that.

I have the same issue:

docker: Run command: docker run -v /opt/atlassian/pipelines/agent/build/packer-tmp/packer-docker516303980:/packer-files -d -i -t alpine:3.6 /bin/sh
==> docker: Error running container: Docker exited with a non-zero exit status.
==> docker: Stderr: docker: Error response from daemon: authorization denied by plugin pipelines: Command not supported.

Can you verify that the "/packer-tmp/packer-docker516303980" directory exists before you execute the docker run command?

See my other answer below for details on this if the directory does not exist.

Thanks @Philip Hodder

Unfortunately, with Packer, I can only create the packer-tmp directory; the rest of the path is generated automatically by the tool and can't be altered.

I'm afraid I know the answer, but do you plan to change these restrictions anytime soon?

Is that directory generated by the tool inside the Docker container, or in your regular build container? If it's the build container, you should be fine; if it's the Docker container, then not so good.

As far as I know there are no immediate plans to remove the current restrictions.

I'll raise this with the PMs so they're aware of how often this comes up.

The only real workaround for getting additional authorization on your Docker daemon right now is to run your own EC2 instance (or equivalent virtual compute from another cloud provider) with a regular Docker daemon, and hook Pipelines up to that daemon with docker-machine (https://docs.docker.com/machine/overview/).
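
A rough sketch of what that can look like inside a step, assuming a hypothetical remote host and that the TLS client certificates are made available to the build securely (the host, port, and cert path below are placeholders):

- step:
    script:
      # point the docker CLI at your own remote daemon instead of the restricted
      # built-in one; note that -v bind mounts then refer to paths on that remote
      # host, not to the Pipelines build directory
      - export DOCKER_HOST=tcp://your-docker-host.example.com:2376
      - export DOCKER_TLS_VERIFY=1
      - export DOCKER_CERT_PATH=$BITBUCKET_CLONE_DIR/docker-certs
      - docker info
      - docker run $IMAGE_NAME run-script test-junit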

Thanks.

We have gotten around the problem by replacing the Packer tool with a regular Dockerfile, which let us specify the paths exactly as we wanted.

Hi @Rafal Janicki,

Just to follow up: we had an issue where mounting subdirectories of BITBUCKET_CLONE_DIR wasn't supported. You should now be able to mount them.

Thanks,

Phil

0 votes

Hello,

Does the "test-results" directory already exist when you use the "docker run" command?

Docker run is unable to create directories in Bitbucket Pipelines as that requires some escalated privileges we cannot expose for security purposes. (Whereas locally it is able to do so without much hassle.)

To resolve this, you should be able to just do the following:

- mkdir $BITBUCKET_CLONE_DIR/test-results
- docker run -v="$BITBUCKET_CLONE_DIR/test-results:/app/test-results" -w="./app" $IMAGE_NAME run-script test-junit

Does that work for you?

@Philip Hodder

Hi Philip,

 

Thank you for your support.

Unfortunately, creating the directory does not solve the problem.

- mkdir -p $BITBUCKET_CLONE_DIR/test-results
- docker run --name mysql -v="$BITBUCKET_CLONE_DIR/test-results:/app/test-results" -d -p 3327:3306 mysql:5.7.18

Using this script I receive this error:

+ docker run --name mysql -v="$BITBUCKET_CLONE_DIR/test-results:/app/test-results" -d -p 3327:3306 mysql:5.7.18
docker: Error response from daemon: authorization denied by plugin pipelines: Command not supported.

 

Could you please advise how to solve it?

 

Thank you in advance!

 

Best Regards,

Dmytro

Hi @DmytroS,

I had a quick chat with my teammates. Right now you can *only* mount the $BITBUCKET_CLONE_DIR. None of the subdirectories.

So instead you will need to run something like this:

- docker run --name mysql -v="$BITBUCKET_CLONE_DIR:/app" -d -p 3327:3306 mysql:5.7.18

Let me know if that works.
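
Applied to the test-reporting step from the original question, that workaround amounts to mounting the clone directory itself as the results directory, so the reports land at the top level of $BITBUCKET_CLONE_DIR on the agent. A sketch only, reusing the names from the question:

- step:
    script:
      - export IMAGE_NAME=$DOCKER_HUB_USERNAME/currency-updater:$BITBUCKET_COMMIT
      # only $BITBUCKET_CLONE_DIR itself could be mounted at the time, so use it
      # as the results directory; files written to /app/test-results in the
      # container appear directly in the clone directory
      - docker run -v "$BITBUCKET_CLONE_DIR:/app/test-results" -w /app $IMAGE_NAME run-script test-junit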

I've opened an internal issue for allowing the subdirectories as well.

Thanks,

Phil

@Philip Hodder

Thank you for your answer!

Yes, if you mount only $BITBUCKET_CLONE_DIR it works.

Is it possible to follow the progress of the "subdirectory volume" ticket?

 

Thank you.

Have a nice day!

 

Best Regards,

Dmytro

Hi @DmytroS,

Sure! I've opened a ticket you can follow: https://bitbucket.org/site/master/issues/15508/allow-mounting-subdirectories-of

Thanks,

Phil

Hi @DmytroS,

We've implemented subdirectory support. It should now work as you initially expected. :)

Thanks,

Phil
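
With subdirectory mounts supported, the step from the original question should work roughly as first intended, provided the host directory exists before docker run. A sketch reusing the question's names:

- step:
    script:
      - export IMAGE_NAME=$DOCKER_HUB_USERNAME/currency-updater:$BITBUCKET_COMMIT
      # create the host directory first; docker run can't create it in Pipelines
      - mkdir -p $BITBUCKET_CLONE_DIR/test-results
      - docker run -v "$BITBUCKET_CLONE_DIR/test-results:/app/test-results" -w /app $IMAGE_NAME run-script test-junit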

Hi @Philip Hodder,

Thanks a lot!

Now it works :)

Have a great day!

Best Regards,

Dmytro

