I'm trying to add a deployment step to Bitbucket Pipelines that works when I simulate it locally but fails when the steps execute in Bitbucket.
I have created a simple Docker image that contains Docker and the compose-cli, as per the Dockerfile below.
When I run my "build-env" Docker image locally, following Bitbucket's debug instructions (https://support.atlassian.com/bitbucket-cloud/docs/debug-pipelines-locally-with-docker/), everything works as expected: I can connect to the AWS ECS Docker context and use the "docker compose" commands.
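For reference, the local simulation followed Bitbucket's debug guide; a minimal sketch of such a run (the mounted path and the forwarded AWS variables are illustrative assumptions, not the exact command used):

```shell
# Run the build image interactively, roughly as Bitbucket's
# "debug pipelines locally" guide suggests. The mount path and the
# AWS credential variables passed through are assumptions.
docker run -it --rm \
  -v "$(pwd)":/build -w /build \
  -e AWS_ACCESS_KEY_ID -e AWS_SECRET_ACCESS_KEY -e AWS_DEFAULT_REGION \
  build-env:latest /bin/bash
```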
However, when I push the changes and execute the build step in Bitbucket Pipelines, the build fails with a "compose" command not found error.
I have inspected the environment variables, double-checked that the correct Docker CLI is installed, and ensured the correct image is being pulled.
Either the ECS context creation or connection is failing, or the wrong Docker binary is being called? Or it's something else I'm not aware of.
The pipeline YAML:
image: build-env:latest

definitions:
  steps:
    - step: &deploy
        name: Deploy this build
        deployment: test
        script:
          - docker context create ecs --from-env deploy
          - docker context use deploy
          - docker version
          - docker compose ps
The Dockerfile for the build-env:
FROM ubuntu:20.04

RUN apt-get update
RUN apt-get install -y python3-pip python3-dev \
    && cd /usr/local/bin \
    && ln -s /usr/bin/python3 python \
    && pip3 install --upgrade pip awscli
RUN apt-get install -y curl docker.io
RUN curl -L https://raw.githubusercontent.com/docker/compose-cli/main/scripts/install/install_linux.sh | sh

CMD ["python3"]
The Bitbucket output:
docker context create ecs --from-env deploy <1s
+ docker context create ecs --from-env deploy
Successfully created ecs context "deploy"

docker context use deploy <1s
+ docker context use deploy
deploy

docker version <1s
+ docker version
Cannot connect to the Docker daemon at tcp://localhost:2375. Is the docker daemon running?
Client:
 Cloud integration: 1.0.17
 Version:           20.10.7
 API version:       1.41
 Go version:        go1.13.8
 Git commit:        20.10.7-0ubuntu1~20.04.1
 Built:             Wed Aug 4 22:52:25 2021
 OS/Arch:           linux/amd64
 Context:           default
 Experimental:      true

docker compose ps <1s
+ docker compose ps
docker: 'compose' is not a docker command.
See 'docker --help'

Build teardown <1s
Very strange that it works locally but fails on Bitbucket's servers... Does anyone know where I'm going wrong?
The end goal is to use "docker compose up" within the AWS context to trigger the ECS cluster updates. It works perfectly everywhere except inside Bitbucket's environment.
Thanks in advance.
I have managed to narrow the issue down: the failure is occurring at "docker context use deploy".
The context is being created, and I can inspect and list it, but attempting to use it always fails, as seen in the Bitbucket Pipelines output below.
docker context use deploy<1s
+ docker context use deploy
docker context inspect<1s
+ docker context inspect
"MetadataPath": "\u003cIN MEMORY\u003e",
"TLSPath": "\u003cIN MEMORY\u003e"
docker context ls<1s
+ docker context ls
NAME TYPE DESCRIPTION DOCKER ENDPOINT KUBERNETES ENDPOINT ORCHESTRATOR
default * moby Current DOCKER_HOST based configuration tcp://localhost:2375 swarm
deploy ecs credentials read from environment
I figured it out...
Or rather, I found the answer in this GitHub issue:
if DOCKER_HOST environment variable is set, we ignore the current context setting.
Bitbucket sets this environment variable (DOCKER_HOST) to point at its default Docker daemon, so the context setting is ignored.
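A quick way to see this from inside a pipeline step (hypothetical diagnostic commands; the tcp://localhost:2375 value matches the context listing shown earlier):

```shell
# Diagnostic sketch: while DOCKER_HOST is set, the docker CLI keeps
# talking to that endpoint and ignores the context selected with
# "docker context use".
echo "$DOCKER_HOST"   # in Bitbucket Pipelines this shows tcp://localhost:2375
docker context ls     # "default" stays marked with * despite "docker context use deploy"
```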
Once I added
the context switch succeeded and docker compose worked as expected.
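The snippet that was added above got lost in formatting; given the GitHub issue quoted earlier, the fix is presumably to unset DOCKER_HOST before switching contexts. A hedged sketch of the adjusted step script (the `unset` line is my assumption, not necessarily the exact fix used):

```yaml
        script:
          - unset DOCKER_HOST   # assumed fix: stop the CLI ignoring the selected context
          - docker context create ecs --from-env deploy
          - docker context use deploy
          - docker compose ps
```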