Deploying to AWS ECS

Eric Wein September 30, 2019

I'm in the process of setting up a CD pipeline to deploy my Bitbucket repo to AWS ECS (Fargate). I haven't been able to find many up-to-date tutorials on how to do this. I found the documentation for "How to deploy to ECS using Pipes" (https://confluence.atlassian.com/bitbucket/deploy-to-amazon-ecs-892623902.html), but it's a bit unclear. I have added the example script:

- pipe: atlassian/aws-ecs-deploy:1.0.0
  variables:
    AWS_ACCESS_KEY_ID: $AWS_ACCESS_KEY_ID
    AWS_SECRET_ACCESS_KEY: $AWS_SECRET_ACCESS_KEY
    AWS_DEFAULT_REGION: $AWS_DEFAULT_REGION
    CLUSTER_NAME: 'aws-ecs-deploy-example'
    SERVICE_NAME: 'aws-ecs-deploy-example-service'
    TASK_DEFINITION: 'taskDefinition.json'

However, the build fails saying it can't find the task definition JSON in the repo. Could I get some more detailed steps on how to get this working with AWS ECS (Fargate, ECR, etc.)?

Thanks


Alexander Zhukov
Atlassian Team
October 2, 2019

Hi @Eric Wein! You have to create the task definition file in your repository and commit it. Use the name of this file as the TASK_DEFINITION value. Here is a working example of deploying a simple application: https://bitbucket.org/bitbucketpipelines/example-aws-ecs-deploy/src/master/.
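
For reference, a minimal Fargate task definition along these lines could be committed as taskDefinition.json to match the TASK_DEFINITION value above. This is only a sketch: the family, account ID, region, role ARN, image, port, and CPU/memory values are placeholders to adapt, not values from this thread:

{
  "family": "aws-ecs-deploy-example",
  "requiresCompatibilities": ["FARGATE"],
  "networkMode": "awsvpc",
  "cpu": "256",
  "memory": "512",
  "executionRoleArn": "arn:aws:iam::123456789012:role/ecsTaskExecutionRole",
  "containerDefinitions": [
    {
      "name": "app",
      "image": "123456789012.dkr.ecr.us-west-2.amazonaws.com/my-image-name:latest",
      "essential": true,
      "portMappings": [
        {
          "containerPort": 8080,
          "protocol": "tcp"
        }
      ]
    }
  ]
}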

Eric Wein October 2, 2019

Great, thanks. I noticed in the example you are using Docker Hub. I'm using AWS ECR, so do I need to push the image there instead of Docker Hub?

Alexander Zhukov
Atlassian Team
October 2, 2019

It doesn't matter; you can use Docker Hub, AWS ECR, or some other container registry of your choice. There is also a pipe for pushing images to AWS ECR: https://bitbucket.org/atlassian/aws-ecr-push-image/

Eric Wein October 2, 2019

Ok, perfect. Do I also need a Dockerfile to boot the Node service after it is deployed, or can I do it in the bitbucket-pipelines.yml under a script? Also, for the ecr-push pipe, what should be the value for IMAGE_NAME? Is that the ECR repo URL? I'm getting a `Docker push error: name unknown` error when I do the following:


- step:
    name: Build and push image to ECR
    services:
      - docker
    script:
      # build the image
      - docker build -t $AWS_ECR_REPO_URL/my-image-name .
      # use the pipe to push the image to AWS ECR
      - pipe: atlassian/aws-ecr-push-image:0.1.3
        variables:
          AWS_ACCESS_KEY_ID: $AWS_ACCESS_KEY_ID
          AWS_SECRET_ACCESS_KEY: $AWS_SECRET_ACCESS_KEY
          AWS_DEFAULT_REGION: $AWS_DEFAULT_REGION
          IMAGE_NAME: $AWS_ECR_REPO_URL/my-image-name

 

Eric Wein October 2, 2019

I got it working. I decided not to use the pipe and just use the manual docker commands.

Diônitas Mendes dos Santos October 4, 2019

@Alexander Zhukov, do you have any idea why the error below happens?

Docker push error: name unknown: The repository with name 'repositoryy_name/image_name' does not exist in the registry with id 'repository_id'

I am using the same setup as Eric.

Alexander Zhukov
Atlassian Team
October 4, 2019

@Diônitas Mendes dos Santos you don't need to add the $AWS_ECR_REPO_URL prefix before /my-image-name. The pipe adds that URL for you, so when you include it in the pipe parameter, the name effectively becomes

<AWS_ECR_REPO_URL>/<AWS_ECR_REPO_URL>/my-image-name

Docker interprets the part after the first slash as the image name instead of repository/image_name, which is why you get the error.

 

Here is an example:

- pipe: atlassian/aws-ecr-push-image:0.1.3
  variables:
    AWS_ACCESS_KEY_ID: $AWS_ACCESS_KEY_ID
    AWS_SECRET_ACCESS_KEY: $AWS_SECRET_ACCESS_KEY
    AWS_DEFAULT_REGION: $AWS_DEFAULT_REGION
    IMAGE_NAME: my-image-name

Mazen El-Kashef December 23, 2019

@Alexander Zhukov I was wondering how this would work: "image": "${IMAGE_NAME}". In my definition I need to add environment variables, and I definitely don't want them to be in my repository. At the same time, "image": "${IMAGE_NAME}" resolves to a string; would that be evaluated in the pipeline or at AWS (it doesn't make sense for it to be evaluated at AWS, but just wondering)?

Alexander Zhukov
Atlassian Team
December 23, 2019

@Mazen El-Kashef you have to replace it with the real image name before executing the pipe. For example, you can use the envsubst tool:

envsubst < task-definition-template.json > task-definition.json

This will replace the "${IMAGE_NAME}" value in your task definition template with the corresponding environment variable. Here is a link to the complete example: https://bitbucket.org/bitbucketpipelines/example-aws-ecs-deploy/src/master/bitbucket-pipelines.yml
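
For context, here is a rough sketch of how that substitution can sit in a deploy step, loosely following the linked example. The template and output file names, the exported image URI, and the cluster/service names are placeholders to adapt:

- step:
    name: Deploy to ECS
    script:
      # Export the value that envsubst will substitute for ${IMAGE_NAME}
      # (the account ID, region, and repository below are placeholders).
      - export IMAGE_NAME="123456789012.dkr.ecr.us-west-2.amazonaws.com/my-image-name:latest"
      # Render the committed template into the file the deploy pipe will read.
      - envsubst < task-definition-template.json > task-definition.json
      - pipe: atlassian/aws-ecs-deploy:1.0.0
        variables:
          AWS_ACCESS_KEY_ID: $AWS_ACCESS_KEY_ID
          AWS_SECRET_ACCESS_KEY: $AWS_SECRET_ACCESS_KEY
          AWS_DEFAULT_REGION: $AWS_DEFAULT_REGION
          CLUSTER_NAME: 'aws-ecs-deploy-example'
          SERVICE_NAME: 'aws-ecs-deploy-example-service'
          TASK_DEFINITION: 'task-definition.json'
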
VictorUsoro April 13, 2020

The build fails at this step; it says envsubst not found.

Omri April 14, 2020

@Alexander Zhukov 

Hi, I am using this example and it all works for me.

However, I need to fill out the port mapping (e.g. 8080)

      "portMappings": [        {          "hostPort": "${PORT}",          "protocol""tcp",          "containerPort": "${PORT}"        }      ],

It fails because PORT should be a number and not a string, but removing the quotes makes it an invalid JSON file.

What do you suggest?

Mark Milan May 6, 2020

@VictorUsoro 

Try adding this at the top of your pipeline:

image:
  name: atlassian/default-image:2
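
For reference, a sketch of where that top-level image setting sits relative to a step, assuming the rest of the pipeline follows the example linked above (the step contents are placeholders):

image:
  name: atlassian/default-image:2  # per the comment above, this build image provides envsubst

pipelines:
  default:
    - step:
        name: Deploy to ECS
        script:
          - envsubst < task-definition-template.json > task-definition.json
          # ... aws-ecs-deploy pipe as shown earlier in the thread
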
Eric Wein October 5, 2019

@Alexander Zhukov I'm getting "Container 'docker' exceeded memory limit." errors when trying to run my React app in production mode. I have tried to increase the memory for the Docker build step by setting size: 2x, but the error still occurs. My app does not have any super large dependencies, just the pretty standard React dependencies. Here is the step that is breaking:

- step:
    name: Build Docker Image
    services:
      - docker
    image: atlassian/pipelines-awscli
    size: 2x # Double resources available for this step.
    script:
      - echo $(aws ecr get-login --no-include-email --region us-west-2) > login.sh
      - sh login.sh
      - docker build -t my-image-name --build-arg NPM_TOKEN=${NPM_TOKEN} .
      - docker tag my-image-name:latest 231231231.dkr.ecr.us-west-2.amazonaws.com/my-image-name:latest
      - docker push 231231231.dkr.ecr.us-west-2.amazonaws.com/my-image-name:latest

 

Ignacio Mendizabal July 6, 2020

Any solution to this? I'm experiencing the same issue... 
