Run Pipeline with Docker enabled and an additional service

I'd like to build a custom Docker image as a step in my Pipeline, like described here:

https://confluence.atlassian.com/bitbucket/run-docker-commands-in-bitbucket-pipelines-879254331.html

Then, I'd like to run tests inside that require database access. Since I can't use `docker run` in Pipelines after the image is built, I figured I'd run the tests as a last build step, in the Dockerfile.
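
Roughly, the build step follows the pattern from that page; a minimal sketch of my bitbucket-pipelines.yml, assuming the Dockerfile sits at the repository root (the image name is a placeholder):

```yaml
# bitbucket-pipelines.yml -- minimal sketch; "my-app" is a placeholder image name
options:
  docker: true                          # make the Docker daemon available to build steps

pipelines:
  default:
    - step:
        script:
          - docker build -t my-app:test .   # the Dockerfile's last RUN instruction executes the tests
```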

Since the tests need to talk to Postgres, I also created a postgres service, as described here:

https://confluence.atlassian.com/bitbucket/use-services-and-databases-in-bitbucket-pipelines-874786688.html
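
The service definition itself is straightforward; a sketch with placeholder credentials (the step would also list `postgres` under its `services` key):

```yaml
# Added to bitbucket-pipelines.yml -- service definition sketch, placeholder credentials
definitions:
  services:
    postgres:
      image: postgres
      environment:
        POSTGRES_DB: myapp_test
        POSTGRES_USER: test
        POSTGRES_PASSWORD: test
```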

However, to my disappointment, this doesn't work. The tests fail because the connection is refused. I've checked all IP/port settings and everything matches, so the service is probably either not started or not accessible from the build process.

Any suggestions how to solve this? Thanks!

1 answer

0 votes

The way we envisioned these pieces working together is that you compile and test your code within the pipeline, then package and publish after testing, since you can specify the same base image from your Dockerfile as the image your pipeline executes in.

Would a workflow like the one below work for you?

```shell
make                                               # compile/install dependencies
make test                                          # run tests
docker login -u $DOCKER_USER -p $DOCKER_PASSWORD
docker build -t $DOCKER_IMAGE_TAG .
docker push $DOCKER_IMAGE_TAG
```
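
In bitbucket-pipelines.yml that could look something like the sketch below; the base image, the postgres service and the `$DOCKER_*` repository variables are assumptions on my side:

```yaml
# Sketch only -- base image, service names and variables are placeholders
pipelines:
  default:
    - step:
        image: node:8                # placeholder: use the same base image as your Dockerfile
        services:
          - postgres                 # defined under definitions/services; reachable on localhost:5432
          - docker                   # enables docker login/build/push in the script
        script:
          - make                     # compile/install dependencies
          - make test                # run tests against the postgres service
          - docker login -u $DOCKER_USER -p $DOCKER_PASSWORD
          - docker build -t $DOCKER_IMAGE_TAG .
          - docker push $DOCKER_IMAGE_TAG
```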

Thanks for your answer, Sebastian, but I'm afraid it won't cut it. My goal is to run the tests inside Docker too.

My "production" setup consists of a docker-compose.yml file that defines a couple of Docker services, but for the testing only two of them are important:

  • the actual application (code)
  • database (PostgreSQL)

Now, the application Docker image is custom-built from my source code in the repo, while the database image is an official Postgres image.

In order to run the tests, I need Pipelines to provide a feature similar to docker-compose: define multiple Docker containers to be executed and allow them to talk to each other (i.e. set up proper networking). This way I'll be able to run the tests in an environment that exactly matches my production setup. And that's the whole point of Pipelines, right?
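
For comparison, the test-relevant part of my docker-compose.yml boils down to something like this (a simplified sketch; service names, credentials and the connection URL are placeholders):

```yaml
# docker-compose.yml -- sketch of the two services that matter for testing;
# names, credentials and the connection URL are placeholders
version: '3'
services:
  app:
    build: .                         # custom image built from the Dockerfile in this repo
    depends_on:
      - db
    environment:
      DATABASE_URL: postgres://test:test@db:5432/myapp_test
  db:
    image: postgres                  # official Postgres image
    environment:
      POSTGRES_DB: myapp_test
      POSTGRES_USER: test
      POSTGRES_PASSWORD: test
```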
