
Unable to use "docker compose" and an AWS ECS docker context in a Bitbucket Pipelines deployment step


I'm trying to add a deployment step to Bitbucket Pipelines that works when I simulate it locally but fails when the same steps execute in Bitbucket.

I have created a simple Docker image that contains Docker and the compose-cli, installed as per https://raw.githubusercontent.com/docker/compose-cli/main/scripts/install/install_linux.sh

When I run my "build-env" image locally, following Bitbucket's debugging instructions (https://support.atlassian.com/bitbucket-cloud/docs/debug-pipelines-locally-with-docker/), everything works as expected: I can connect to the AWS ECS docker context and use the "docker compose" commands.

However, when I push the changes and the build step executes in Bitbucket Pipelines, the build fails with a "compose" command not found error.

I have inspected the environment variables, double-checked that the correct Docker CLI is installed, and ensured the correct image is being pulled.

Either the ECS context creation or connection is failing, or the wrong docker binary is being called? Or something else I'm not aware of.
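
For anyone debugging something similar, a first check is whether the compose-aware CLI wrapper is actually the `docker` that PATH resolves to (a sketch; the paths mentioned in the comments are illustrative):

```shell
# Sketch: the compose-cli installer drops a wrapper "docker" binary
# (one that understands "compose" and ECS contexts) alongside the
# distro's /usr/bin/docker. If the distro binary wins on PATH,
# "docker compose" is not a recognised command.
which -a docker || echo "no docker binary on PATH"

# The wrapper delegates plain commands to the classic CLI, so the
# client version alone does not prove which binary answered; the
# resolved path does:
command -v docker || true
```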


The pipeline YAML:

image: build-env:latest

definitions:
  steps:
    - step: &deploy
        name: Deploy this build
        deployment: test
        script:
          - docker context create ecs --from-env deploy
          - docker context use deploy
          - docker version
          - docker compose ps

The Dockerfile for the build-env:

FROM ubuntu:20.04

RUN apt-get update

RUN apt-get install -y python3-pip python3-dev \
  && cd /usr/local/bin \
  && ln -s /usr/bin/python3 python \
  && pip3 install --upgrade pip awscli

RUN apt-get install -y curl docker.io

RUN curl -L https://raw.githubusercontent.com/docker/compose-cli/main/scripts/install/install_linux.sh | sh

CMD ["python3"]
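
Before pushing, the image can be sanity-checked locally (a sketch; assumes a local Docker daemon and the tag used above):

```shell
# Sketch (assumes a local Docker daemon): build the image and confirm
# that the docker CLI inside it recognises the "compose" subcommand.
docker build -t build-env:latest . 2>/dev/null || echo "build skipped (no local daemon)"
docker run --rm build-env:latest docker compose version 2>/dev/null || true
```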

The Bitbucket output:

+ docker context create ecs --from-env deploy
Successfully created ecs context "deploy"

+ docker context use deploy
deploy

+ docker version
Cannot connect to the Docker daemon at tcp://localhost:2375. Is the docker daemon running?
Client:
 Cloud integration: 1.0.17
 Version:           20.10.7
 API version:       1.41
 Go version:        go1.13.8
 Git commit:        20.10.7-0ubuntu1~20.04.1
 Built:             Wed Aug  4 22:52:25 2021
 OS/Arch:           linux/amd64
 Context:           default
 Experimental:      true

+ docker compose ps
docker: 'compose' is not a docker command.
See 'docker --help'

Build teardown

Very strange that it works locally but fails on Bitbucket's servers... Does anyone know where I'm going wrong?

The end goal is to use "docker compose up" within the AWS context to trigger the ECS cluster updates. It works perfectly everywhere except inside Bitbucket's environment.

Thanks in advance.

**Update**
I have managed to narrow the issue down: the failure occurs at "docker context use deploy".

The context is being created, and I can inspect and list it, but attempting to use it always fails, as seen in the Bitbucket Pipelines output below.

+ docker context use deploy
deploy

+ docker context inspect
[
    {
        "Name": "default",
        "Metadata": {
            "StackOrchestrator": "swarm"
        },
        "Endpoints": {
            "docker": {
                "Host": "tcp://localhost:2375",
                "SkipTLSVerify": false
            }
        },
        "TLSMaterial": {},
        "Storage": {
            "MetadataPath": "\u003cIN MEMORY\u003e",
            "TLSPath": "\u003cIN MEMORY\u003e"
        }
    }
]

+ docker context ls
NAME        TYPE    DESCRIPTION                               DOCKER ENDPOINT        KUBERNETES ENDPOINT   ORCHESTRATOR
default *   moby    Current DOCKER_HOST based configuration   tcp://localhost:2375                         swarm
deploy      ecs     credentials read from environment



1 answer (accepted, 2 votes)

I figured it out... or rather, I found the answer in this GitHub issue:

https://github.com/docker/cli/issues/1809
"if DOCKER_HOST environment variable is set, we ignore the current context setting."

Bitbucket Pipelines sets this environment variable (pointing at the default context's tcp://localhost:2375 endpoint), so the CLI kept ignoring the "deploy" context.
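That behaviour can be illustrated in a shell (a hypothetical sketch; assumes a docker CLI and the "deploy" context from the question):

```shell
# Illustration (hypothetical; assumes a docker CLI and an existing
# "deploy" context). With DOCKER_HOST exported, the CLI ignores the
# selected context:
export DOCKER_HOST=tcp://localhost:2375
docker context show 2>/dev/null || true   # reports "default" while the variable is set

# Clearing the variable lets context selection take effect again:
unset DOCKER_HOST
docker context use deploy 2>/dev/null || true
docker context show 2>/dev/null || true
```
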

Once I added

unset DOCKER_HOST

to the step's script, the context switch succeeded and docker compose worked as expected.
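
Putting the fix together, the deploy step from the question would look something like this (a sketch using the question's own names, with "docker compose up" per the stated end goal):

```yaml
definitions:
  steps:
    - step: &deploy
        name: Deploy this build
        deployment: test
        script:
          # Bitbucket pre-sets DOCKER_HOST for the default context;
          # clear it so the CLI honours the selected ECS context.
          - unset DOCKER_HOST
          - docker context create ecs --from-env deploy
          - docker context use deploy
          - docker compose up
```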
