
Deploying to AWS ECS

I'm in the process of setting up a CD pipeline to deploy my Bitbucket repo to AWS ECS (Fargate). I haven't been able to find many up-to-date tutorials on how to do this. I found the documentation for "How to deploy to ECS using Pipes", but it's a bit unclear. I have added the example script:

- pipe: atlassian/aws-ecs-deploy:1.0.0
  variables:
    CLUSTER_NAME: 'aws-ecs-deploy-example'
    SERVICE_NAME: 'aws-ecs-deploy-example-service'
    TASK_DEFINITION: 'taskDefinition.json'

However, the build fails saying it can't find the task definition JSON in the repo. Could I get some more detailed steps on how to get this working with AWS ECS (Fargate, ECR, etc.)?



Hi @Eric Wein! You have to create the task definition file in your repository and commit it. Use the name of this file as the TASK_DEFINITION. Here is a working example of deploying a simple application.
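A minimal Fargate task definition along these lines might look as follows (a sketch: the family name matches the pipe example above, but the role ARN, CPU/memory sizes, and container details are placeholder assumptions):

```json
{
  "family": "aws-ecs-deploy-example",
  "networkMode": "awsvpc",
  "requiresCompatibilities": ["FARGATE"],
  "cpu": "256",
  "memory": "512",
  "executionRoleArn": "arn:aws:iam::123456789012:role/ecsTaskExecutionRole",
  "containerDefinitions": [
    {
      "name": "app",
      "image": "${IMAGE_NAME}",
      "essential": true,
      "portMappings": [
        { "containerPort": 8080, "protocol": "tcp" }
      ]
    }
  ]
}
```

Commit this file as taskDefinition.json (or whatever name you pass to TASK_DEFINITION) so the pipe can find it in the repository.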

Great, thanks. I noticed in the example you are using Docker Hub. I'm using AWS ECR, so do I need to push the image there instead of Docker Hub?

That doesn't matter; you can use Docker Hub, AWS ECR, or another container registry of your choice. There is also a pipe for pushing images to AWS ECR.
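A rough sketch of how the two pipes could be combined in bitbucket-pipelines.yml (image, cluster, and service names are placeholders; AWS credentials are assumed to be configured as repository variables):

```yaml
pipelines:
  default:
    - step:
        name: Build and push image to ECR
        services:
          - docker
        script:
          - docker build -t my-image-name .
          - pipe: atlassian/aws-ecr-push-image:0.1.3
            variables:
              AWS_ACCESS_KEY_ID: $AWS_ACCESS_KEY_ID
              AWS_SECRET_ACCESS_KEY: $AWS_SECRET_ACCESS_KEY
              AWS_DEFAULT_REGION: us-west-2
              IMAGE_NAME: my-image-name
    - step:
        name: Deploy to ECS
        script:
          - pipe: atlassian/aws-ecs-deploy:1.0.0
            variables:
              AWS_ACCESS_KEY_ID: $AWS_ACCESS_KEY_ID
              AWS_SECRET_ACCESS_KEY: $AWS_SECRET_ACCESS_KEY
              AWS_DEFAULT_REGION: us-west-2
              CLUSTER_NAME: 'aws-ecs-deploy-example'
              SERVICE_NAME: 'aws-ecs-deploy-example-service'
              TASK_DEFINITION: 'taskDefinition.json'
```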

OK, perfect. Do I also need a Dockerfile to boot the Node service after it is deployed, or can I do it in bitbucket-pipelines.yml under a script? Also, for the ecr-push pipe, what should the value of IMAGE_NAME be? Is that the ECR repo URL? I'm getting a `Docker push error: name unknown` error when I do the following:

- step:
    name: Build and push image to ECR
    services:
      - docker
    script:
      # build the image
      - docker build -t $AWS_ECR_REPO_URL/my-image-name .
      # use the pipe to push the image to AWS ECR
      - pipe: atlassian/aws-ecr-push-image:0.1.3


I got it working. I decided not to use the pipe and just use the manual docker commands.
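For reference, the manual-commands approach might look like this (a sketch, not the poster's actual script: $AWS_ECR_REPO_URL and the image name are placeholders, the region is assumed, and the AWS CLI is assumed to be configured with push permissions):

```shell
# authenticate docker to the ECR registry (AWS CLI v2 syntax)
aws ecr get-login-password --region us-west-2 \
  | docker login --username AWS --password-stdin $AWS_ECR_REPO_URL

# build, tag, and push the image to the ECR repository
docker build -t my-image-name .
docker tag my-image-name:latest $AWS_ECR_REPO_URL/my-image-name:latest
docker push $AWS_ECR_REPO_URL/my-image-name:latest
```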


@Alexander Zhukov, do you have any idea why the error below happens?

Docker push error: name unknown: The repository with name 'repositoryy_name/image_name' does not exist in the registry with id 'repository_id'

I am using the same setup as Eric.

@Diônitas Mendes dos Santos you don't need to add the $AWS_ECR_REPO_URL prefix before /my-image-name. The pipe adds this URL for you, so when you include it in the pipe parameter, the registry URL ends up in the name twice. Docker then interprets the second part (after the first slash) as the image name, instead of repository/image_name, which is why there is an error.


Here is an example:

- pipe: atlassian/aws-ecr-push-image:0.1.3
  variables:
    IMAGE_NAME: my-image-name

@Alexander Zhukov I was wondering how this would work? "image": "${IMAGE_NAME}" resolves to a string, and in my task definition I need to add environment variables, which I definitely don't want in my repository. Would that be evaluated in the pipeline or on AWS (it doesn't make sense for it to be evaluated on AWS, but just wondering)?

@Mazen El-Kashef you have to replace it with the real image name before executing the pipe. For example, you can use the envsubst tool:

envsubst < task-definition-template.json > task-definition.json
This will replace the "${IMAGE_NAME}" value in your task definition with the corresponding environment variable. Here is a link to the complete example:

The build fails at this step; it says envsubst is not found.


@Alexander Zhukov 

Hi, I am using this example and it all works for me.

However, I need to fill out the port mapping (e.g. 8080)

      "portMappings": [        {          "hostPort": "${PORT}",          "protocol""tcp",          "containerPort": "${PORT}"        }      ],

It fails because PORT should be a number and not a string, and removing the quotes makes the template an invalid JSON file.

What do you suggest?


Try adding this at the top of your pipeline:

image: atlassian/default-image:2

@Alexander Zhukov I'm getting "Container 'docker' exceeded memory limit." errors when trying to build my React app in production mode. I have tried to increase the memory for the Docker build step by setting size: 2x, but the error still occurs. My app does not have any super large dependencies, just pretty standard React dependencies. Here is the step that is breaking:

- step:
    name: Build Docker Image
    image: atlassian/pipelines-awscli
    size: 2x # Double resources available for this step.
    services:
      - docker
    script:
      - echo $(aws ecr get-login --no-include-email --region us-west-2) >
      - sh
      - docker build -t my-image-name --build-arg NPM_TOKEN=${NPM_TOKEN} .
      - docker tag my-image-name:latest
      - docker push



Any solution to this? I'm experiencing the same issue... 
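The thread doesn't record a resolution, but one commonly suggested fix for "Container 'docker' exceeded memory limit" is to raise the memory allocated to the docker service itself: size: 2x increases the step's total memory, while the docker service keeps its default share unless it is overridden in the definitions section. A sketch (the 3072 value is an assumption to tune for your build):

```yaml
definitions:
  services:
    docker:
      memory: 3072  # assumed value; the default docker service allocation is smaller
```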

