Deploying to AWS ECS

I'm in the process of setting up a CD pipeline to deploy my Bitbucket repo to AWS ECS (Fargate). I haven't been able to find many up-to-date tutorials on how to do this. I found the documentation for "How to deploy to ECS using Pipes" (https://confluence.atlassian.com/bitbucket/deploy-to-amazon-ecs-892623902.html), but it's a bit unclear. I have added the example script:

- pipe: atlassian/aws-ecs-deploy:1.0.0
  variables:
    AWS_ACCESS_KEY_ID: $AWS_ACCESS_KEY_ID
    AWS_SECRET_ACCESS_KEY: $AWS_SECRET_ACCESS_KEY
    AWS_DEFAULT_REGION: $AWS_DEFAULT_REGION
    CLUSTER_NAME: 'aws-ecs-deploy-example'
    SERVICE_NAME: 'aws-ecs-deploy-example-service'
    TASK_DEFINITION: 'taskDefinition.json'

However, the build fails, saying it can't find the task definition JSON in the repo. Could I get some more detailed steps on how to get this working with AWS ECS (Fargate, ECR, etc.)?

Thanks


Hi @Eric Wein ! You have to create the task definition file in your repository and commit it. Use the name of this file as the TASK_DEFINITION value. Here is a working example of deploying a simple application: https://bitbucket.org/bitbucketpipelines/example-aws-ecs-deploy/src/master/.
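For reference, a minimal Fargate task definition might look something like this (a sketch, not from the thread — the family, container name, account ID, role ARN, image, and port below are all placeholders you would replace with your own values):

```json
{
  "family": "my-app",
  "networkMode": "awsvpc",
  "requiresCompatibilities": ["FARGATE"],
  "cpu": "256",
  "memory": "512",
  "executionRoleArn": "arn:aws:iam::123456789012:role/ecsTaskExecutionRole",
  "containerDefinitions": [
    {
      "name": "my-app",
      "image": "123456789012.dkr.ecr.us-west-2.amazonaws.com/my-image-name:latest",
      "portMappings": [
        { "containerPort": 8080, "protocol": "tcp" }
      ],
      "essential": true
    }
  ]
}
```

Fargate requires the awsvpc network mode and task-level cpu/memory values; the linked example repo shows a complete working version.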

Great, thanks. I noticed in the example you are using Docker Hub. I'm using AWS ECR, so do I need to push the image there instead of Docker Hub?

It doesn't matter; you can use Docker Hub, AWS ECR, or another container registry of your choice. There is also a pipe for pushing images to AWS ECR: https://bitbucket.org/atlassian/aws-ecr-push-image/

OK, perfect. Do I also need a Dockerfile to boot the Node service after it is deployed, or can I do it in bitbucket-pipelines.yml under a script? Also, for the ecr-push pipe, what should the value of IMAGE_NAME be? Is that the ECR repo URL? I'm getting a `Docker push error: name unknown` error when I do the following:


- step:
    name: Build and push image to ECR
    services:
      - docker
    script:
      # build the image
      - docker build -t $AWS_ECR_REPO_URL/my-image-name .
      # use the pipe to push the image to AWS ECR
      - pipe: atlassian/aws-ecr-push-image:0.1.3
        variables:
          AWS_ACCESS_KEY_ID: $AWS_ACCESS_KEY_ID
          AWS_SECRET_ACCESS_KEY: $AWS_SECRET_ACCESS_KEY
          AWS_DEFAULT_REGION: $AWS_DEFAULT_REGION
          IMAGE_NAME: $AWS_ECR_REPO_URL/my-image-name

 

I got it working, I decided to not use the pipe and just use the manual docker commands.


@Alexander Zhukov, do you have any idea why the error below happens?

Docker push error: name unknown: The repository with name 'repositoryy_name/image_name' does not exist in the registry with id 'repository_id'

I am using the same setup as Eric.

@Diônitas Mendes dos Santos you don't need to add $AWS_ECR_REPO_URL before /my-image-name. The pipe adds this URL for you, so when you include it in the pipe parameter, the pushed name effectively becomes

<AWS_ECR_REPO_URL>/<AWS_ECR_REPO_URL>/my-image-name

Docker interprets everything after the first slash as the repository/image name, so it looks for a repository that doesn't exist in the registry, which is why you see the error.

 

Here is an example:

- pipe: atlassian/aws-ecr-push-image:0.1.3
  variables:
    AWS_ACCESS_KEY_ID: $AWS_ACCESS_KEY_ID
    AWS_SECRET_ACCESS_KEY: $AWS_SECRET_ACCESS_KEY
    AWS_DEFAULT_REGION: $AWS_DEFAULT_REGION
    IMAGE_NAME: my-image-name

@Alexander Zhukov I was wondering how this would work? "image": "${IMAGE_NAME}" — in my definition I need to add environment variables, and I definitely don't want them to be in my repository. At the same time, "image": "${IMAGE_NAME}" resolves to a string; would that be evaluated in the pipeline or on AWS (it doesn't make sense for it to be evaluated on AWS, but just wondering)?

@Mazen El-Kashef you have to replace it with the real image name before executing the pipe. For example, you can use the envsubst tool:

envsubst < task-definition-template.json > task-definition.json
This will replace the "${IMAGE_NAME}" placeholder in your task definition with the corresponding environment variable. Here is a link to the complete example: https://bitbucket.org/bitbucketpipelines/example-aws-ecs-deploy/src/master/bitbucket-pipelines.yml
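To make the substitution step concrete, the template file the command reads would contain the placeholder rather than a concrete image — a sketch with hypothetical names:

```json
{
  "containerDefinitions": [
    {
      "name": "my-app",
      "image": "${IMAGE_NAME}"
    }
  ]
}
```

After envsubst runs with IMAGE_NAME exported in the pipeline environment, the generated task-definition.json contains the real image reference, so no secrets or account-specific values need to live in the repository.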

The build fails at this step; it says envsubst not found.
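envsubst ships with the gettext tools, so one workaround (a sketch, assuming a Debian-based build image with apt available) is to install it inside the step before running the substitution:

```yaml
script:
  - apt-get update && apt-get install -y gettext-base
  - envsubst < task-definition-template.json > task-definition.json
```

Alternatively, switching to a build image that already bundles envsubst avoids the install step entirely.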


@Alexander Zhukov 

Hi, I am using this example and it all works for me.

However, I need to fill out the port mapping (e.g. 8080)

"portMappings": [
  {
    "hostPort": "${PORT}",
    "protocol": "tcp",
    "containerPort": "${PORT}"
  }
],

It fails because PORT should be a number, not a string, and removing the quotes makes the file invalid JSON.

What do you suggest?
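One workaround (an editor's sketch, assuming the envsubst approach described earlier in the thread): leave the placeholder unquoted in the template. The template itself is then not valid JSON, but that doesn't matter — only the substituted output file is handed to the pipe, and it contains a bare number:

```json
"portMappings": [
  {
    "containerPort": ${PORT},
    "hostPort": ${PORT},
    "protocol": "tcp"
  }
],
```

With PORT=8080 exported, envsubst produces "containerPort": 8080, which is valid JSON and the numeric type ECS expects.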

@VictorUsoro 

Try adding this at the top of your pipeline:

image:
  name: atlassian/default-image:2

@Alexander Zhukov I'm getting "Container 'docker' exceeded memory limit." errors when trying to build my React app in production mode. I have tried to increase the memory for the Docker build step by setting size: 2x, but the error still occurs. My app does not have any super-large dependencies, just pretty standard React dependencies. Here is the step that is breaking:

- step:
    name: Build Docker Image
    services:
      - docker
    image: atlassian/pipelines-awscli
    size: 2x # Double resources available for this step.
    script:
      - echo $(aws ecr get-login --no-include-email --region us-west-2) > login.sh
      - sh login.sh
      - docker build -t my-image-name --build-arg NPM_TOKEN=${NPM_TOKEN} .
      - docker tag my-image-name:latest 231231231.dkr.ecr.us-west-2.amazonaws.com/my-image-name:latest
      - docker push 231231231.dkr.ecr.us-west-2.amazonaws.com/my-image-name:latest

 


Any solution to this? I'm experiencing the same issue... 
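One thing worth checking (not confirmed in this thread): size: 2x raises the step's total memory, but the docker service keeps its own separate limit unless you raise it explicitly in the definitions section of bitbucket-pipelines.yml — a sketch:

```yaml
definitions:
  services:
    docker:
      memory: 7128 # MB; on a size: 2x step most of the 8192 MB total can go to the docker service
```

If the docker service limit stays at its default, memory-hungry docker build steps can still be killed even on a 2x step.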
