Pipelines: How to connect to a (Postgres) service from within another Docker container

Jozef Knaperek May 30, 2019

In my pipelines I have defined an external service that runs Postgres:

definitions:
  services:
    db:
      image: postgres
      environment:
        POSTGRES_USER: 'myuser'
        POSTGRES_PASSWORD: 'mypassword'

Now, in my pipeline I need to build my web app from a custom Dockerfile and then test it with the DB. So I define my pipeline step as follows:

pipelines:
  default:
    - step:
        script:
          - docker build -t app .
          - docker run app ./run_tests.sh
        services:
          - db
          - docker

I can see that Postgres is listening on port 5432 on the main host (executor), but I can't seem to find a way to connect to it from within my app Docker container.
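For illustration, this is the kind of connection attempt that fails (run_tests.sh itself isn't shown here; psql and the default database name are assumptions):

# Hypothetical check from run_tests.sh -- assumes psql is available in the app image
# and that the database name defaults to POSTGRES_USER.
# Succeeds on the executor, where the db service is exposed on localhost:5432,
# but fails inside the app container, where localhost is the container itself.
psql "postgresql://myuser:mypassword@localhost:5432/myuser" -c 'SELECT 1'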

I also tried running it with `--network="host"`, but that doesn't work due to the restrictions in place.

Any ideas? In GitLab this works perfectly. Thanks.

1 answer

Steven Vaccarella (Atlassian Team)
July 1, 2019

Hi Jozef,

Try adding a host-to-IP mapping in your "docker run" command, like this:

docker run --add-host host.docker.internal:$BITBUCKET_DOCKER_HOST_INTERNAL app ./run_tests.sh

Then you should be able to connect to the service using host.docker.internal:5432 within the container.
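For reference, here is a sketch of how the step from your question could look with that flag added (the exact connection details inside run_tests.sh are assumed, not taken from your post):

pipelines:
  default:
    - step:
        script:
          - docker build -t app .
          # host.docker.internal now resolves to the pipeline host,
          # where the db service is listening on port 5432
          - docker run --add-host host.docker.internal:$BITBUCKET_DOCKER_HOST_INTERNAL app ./run_tests.sh
        services:
          - db
          - docker

Inside run_tests.sh the connection string would then point at host.docker.internal instead of localhost, e.g. postgresql://myuser:mypassword@host.docker.internal:5432/myuser.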

The following blog post explains why this is now needed:

https://community.atlassian.com/t5/Bitbucket-articles/Changes-to-make-your-containers-more-secure-on-Bitbucket/ba-p/998464
