Container 'docker' exceeded memory limit.

Tom August 18, 2020

Hi there,

I'm running into issues trying to push a Docker image up to AWS using a pipeline. I've looked at a couple of similar posts, but nothing seems to do the trick for me. My `bitbucket-pipelines.yml` file is as follows:

image: node:14.8.0-alpine3.11

options:
  docker: true

definitions:
  services:
    docker:
      memory: 6000
  step: &build
    caches:
      - node
    name: Build Branch
    script:
      - cp -r ./db/prisma_dev prisma
      - npm install
      - npm run prisma:generate
      - npm run compile
    artifacts:
      - dist/**
      - node_modules/**
      - prisma/**

pipelines:
  default:
    - step:
        script:
          - echo "No build script associated with this branch."
  branches:
    develop:
      - step: *build
      - step:
          size: 2x
          services:
            - docker
          name: Deploy to Develop
          deployment: develop
          script:
            - export ECS_REGION=eu-west-1
            - export ECS_REPO_URI=XXXX
            - export ENV=develop
            - docker build -t my-image .
            - pipe: atlassian/aws-ecr-push-image:1.1.3
              variables:
                AWS_ACCESS_KEY_ID: $AWS_ACCESS_KEY_ID
                AWS_SECRET_ACCESS_KEY: $AWS_SECRET_ACCESS_KEY
                AWS_DEFAULT_REGION: $AWS_DEFAULT_REGION
                IMAGE_NAME: my-image
                TAGS: 'build-AA${BITBUCKET_BUILD_NUMBER}'

No matter how much memory I allocate, it never seems to be enough to build the image. I've also tried adding `size: 2x` to the options, but still no joy. Any help with how I can get past this issue would be great.
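The `size: 2x` attempt looked roughly like this (the `memory: 7128` value comes from my reading of the docs, which I took to say services on 2x steps can use up to 7128 MB in total, so treat that figure as an assumption rather than a confirmed limit):

options:
  docker: true
  size: 2x

definitions:
  services:
    docker:
      # assumed ceiling for service memory on 2x steps, per my reading of the docs
      memory: 7128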

0 answers
