
Container 'docker' exceeded memory limit.

Adan J. Suarez January 10, 2020

Trying this question again... for some reason my previous one is gone.

I am trying to deploy to AWS ECR using a Bitbucket pipeline, and every time I try it, it says: Container 'docker' exceeded memory limit. I've tried all possible combinations of size (1x and 2x) and memory (1024 to 7128), but it didn't work.

 

This is my latest bitbucket-pipelines.yml:

 

image: node:12

pipelines:
  default:
    - step:
        size: 1x
        caches:
          - node
        script:
          - npm install
          - npm run test:unit
  branches:
    master:
      - step:
          size: 2x
          caches:
            - node
          script:
            - npm install
            - npm run test:unit
            # Build the image.
            - docker build -t beautiful-frontend .
            # Use the pipe to push the image to AWS ECR.
            - pipe: atlassian/aws-ecr-push-image:1.0.2
              variables:
                # Access keys are set as repository variables.
                AWS_ACCESS_KEY_ID: $AWS_ACCESS_KEY_ID
                AWS_SECRET_ACCESS_KEY: $AWS_SECRET_ACCESS_KEY
                AWS_DEFAULT_REGION: $AWS_DEFAULT_REGION
                IMAGE_NAME: beautiful-frontend
          services:
            - docker

definitions:
  services:
    docker:
      # Memory for docker-in-docker.
      memory: 7128
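For context, a rough sketch of the memory arithmetic involved. The 4096 MB (1x) and 8192 MB (2x) step totals come from Bitbucket's documentation; the 1024 MB reservation for the build container itself is an assumption here, so the computed ceilings are illustrative rather than Bitbucket's exact enforced limits:

```python
# Bitbucket Pipelines step memory totals (per Bitbucket's docs):
# a 1x step gets 4096 MB, a 2x step gets 8192 MB.
STEP_TOTAL_MB = {"1x": 4096, "2x": 8192}

# Assumed minimum memory the build container keeps for itself;
# service containers (like docker) carve their memory out of the step total.
BUILD_MIN_MB = 1024

def max_service_memory_mb(size: str) -> int:
    """Illustrative upper bound on memory a single service container
    (e.g. the docker service) could request for a step of this size."""
    return STEP_TOTAL_MB[size] - BUILD_MIN_MB

print(max_service_memory_mb("1x"))
print(max_service_memory_mb("2x"))
```

The point of the sketch: the docker service's `memory` value is not independent of the step `size` — it competes with the build container for the same step total, so `docker build` running inside a large service allocation can still push the docker container past its own limit.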

 

1 answer (accepted)
Antoine Büsch
Atlassian Team
January 16, 2020
