Trying this question again... for some reason my previous one is gone.
I am trying to deploy to AWS ECR using a Bitbucket pipeline, and every time I try it fails with: Container 'docker' exceeded memory limit. I've tried all possible combinations of size (1x and 2x) and memory (1024 up to 7128), but it didn't work.
This is my latest bitbucket-pipelines.yml:
image: node:12

pipelines:
  default:
    - step:
        size: 1x
        caches:
          - node
        script:
          - npm install
          - npm run test:unit
  branches:
    master:
      - step:
          size: 2x
          caches:
            - node
          script:
            - npm install
            - npm run test:unit
            # Build the image.
            - docker build -t beautiful-frontend .
            # Use the pipe to push the image to AWS ECR.
            - pipe: atlassian/aws-ecr-push-image:1.0.2
              variables:
                # Access keys must be changed.
                AWS_ACCESS_KEY_ID: $AWS_ACCESS_KEY_ID
                AWS_SECRET_ACCESS_KEY: $AWS_SECRET_ACCESS_KEY
                AWS_DEFAULT_REGION: $AWS_DEFAULT_REGION
                IMAGE_NAME: beautiful-frontend
          services:
            - docker

definitions:
  services:
    docker:
      # Memory for docker-in-docker.
      memory: 7128
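One thing I also tried (a sketch only, assuming the memory spike comes from Node running inside `docker build` — the `NODE_OPTIONS` build-arg and the 3072 MB cap are my own guesses, and the Dockerfile would need a matching `ARG NODE_OPTIONS` for it to take effect):

```yaml
# Variation of the master step: cap Node's heap during the image build
# so the Docker service stays under its memory allocation.
- step:
    size: 2x
    script:
      # --max-old-space-size limits the V8 heap; 3072 MB is an arbitrary cap.
      - docker build --build-arg NODE_OPTIONS=--max-old-space-size=3072 -t beautiful-frontend .
    services:
      - docker
```

It still hit the same memory limit error, so I'm not sure the build itself is the culprit.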