Run Pipeline with docker on and also additional service

Jozef Knaperek May 12, 2017

I'd like to build a custom Docker image as a step in my Pipeline, like described here:

https://confluence.atlassian.com/bitbucket/run-docker-commands-in-bitbucket-pipelines-879254331.html

Then, I'd like to run tests inside it that require database access. Since I can't use `docker run` in Pipelines after the image is built, I figured I'd run the tests as the last build step, in the Dockerfile.
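
For illustration, a minimal Dockerfile along those lines; the Python base image, dependency file, and test command are placeholders of mine, not something prescribed by the docs:

    FROM python:3.6

    WORKDIR /app
    COPY . .
    RUN pip install -r requirements.txt

    # Tests run as the last build step, so `docker build` fails when they
    # fail -- but they must be able to reach the database during the build.
    RUN python -m pytest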

Since the tests need to talk to Postgres, I also created a postgres service, as described here:

https://confluence.atlassian.com/bitbucket/use-services-and-databases-in-bitbucket-pipelines-874786688.html

However, to my disappointment, this doesn't work. The tests fail because the connection is refused. I've checked all the IP/port settings and everything matches, so the service is probably either not started or not accessible from the build process.
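
For context, the setup described above amounts to a bitbucket-pipelines.yml roughly like the sketch below (the image names, database name, and credentials are placeholders):

    options:
      docker: true

    pipelines:
      default:
        - step:
            services:
              - postgres
            script:
              # The tests baked into the image build try to reach Postgres
              - docker build -t my-app .

    definitions:
      services:
        postgres:
          image: postgres:9.6
          environment:
            POSTGRES_DB: testdb
            POSTGRES_USER: test
            POSTGRES_PASSWORD: test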

Any suggestions on how to solve this? Thanks!

1 answer

SebC
Atlassian Team
May 22, 2017

The way we envisioned these pieces working together was that you would compile and test your code within the pipeline, then package and publish after testing. You can execute your pipeline within the same base image that your Dockerfile builds from, so the test environment still matches the image you ship.

Would a workflow like the one below work for you?

make # compile/install dependencies
make test # run tests
docker login -u $DOCKER_USER -p $DOCKER_PASSWORD # authenticate with the registry
docker build -t $DOCKER_IMAGE_TAG . # package the tested code into an image
docker push $DOCKER_IMAGE_TAG # publish the image
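
In bitbucket-pipelines.yml terms, that could look roughly like the sketch below, assuming your pipeline runs on the same base image your Dockerfile starts FROM and your repo has the corresponding Makefile targets:

    image: python:3.6      # assumption: same image as the Dockerfile's FROM line

    options:
      docker: true         # enables the docker CLI inside pipeline steps

    pipelines:
      default:
        - step:
            script:
              - make       # compile/install dependencies
              - make test  # run tests in the pipeline container
              - docker login -u $DOCKER_USER -p $DOCKER_PASSWORD
              - docker build -t $DOCKER_IMAGE_TAG .
              - docker push $DOCKER_IMAGE_TAG
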
Jozef Knaperek May 23, 2017

Thanks for your answer, Sebastian, but I'm afraid it won't cut it. My goal is to run the tests inside Docker too.

My "production" setup consists of a docker-compose.yml file that defines a couple of Docker services, but for the testing only two of them are important:

  • the actual application (code)
  • database (PostgreSQL)

Now, the application Docker image is custom-built from my source code in the repo, while the database image is an official Postgres image.

In order to run the tests, I need Pipelines to provide a feature similar to docker-compose: define multiple Docker containers to run and allow them to talk to each other (set up proper networking). This way I'd be able to run the tests in an environment that exactly matches my production. And that's the whole point of Pipelines, right?
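
Concretely, my production docker-compose.yml looks something like this (the names and credentials below are placeholders):

    version: '2'
    services:
      app:
        build: .               # custom image built from the repo's Dockerfile
        depends_on:
          - db
        environment:
          DATABASE_URL: postgres://test:test@db:5432/testdb
      db:
        image: postgres:9.6    # official Postgres image
        environment:
          POSTGRES_DB: testdb
          POSTGRES_USER: test
          POSTGRES_PASSWORD: test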
