Trying this question again... for some reason my previous one is gone.
I am trying to deploy to AWS ECR using a Bitbucket pipeline, and every time I try it, it fails with: Container 'docker' exceeded memory limit. I've tried every combination of size: 1x and 2x with memory: values from 1024 up to 7128, but it didn't work.
This is my latest bitbucket-pipelines.yml:
```yaml
image: node:12

pipelines:
  default:
    - step:
        size: 1x
        caches:
          - node
        script:
          - npm install
          - npm run test:unit
  branches:
    master:
      - step:
          size: 2x
          caches:
            - node
          script:
            - npm install
            - npm run test:unit
            # Build the image.
            - docker build -t beautiful-frontend .
            # Use the pipe to push the image to AWS ECR.
            - pipe: atlassian/aws-ecr-push-image:1.0.2
              variables:
                # Access keys must be changed.
                AWS_ACCESS_KEY_ID: $AWS_ACCESS_KEY_ID
                AWS_SECRET_ACCESS_KEY: $AWS_SECRET_ACCESS_KEY
                AWS_DEFAULT_REGION: $AWS_DEFAULT_REGION
                IMAGE_NAME: beautiful-frontend
          services:
            - docker

definitions:
  services:
    docker:
      # Memory for docker-in-docker
      memory: 7128
```
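For context: my understanding from Bitbucket's sizing docs (an assumption on my part, so please correct me) is that service containers on a 1x step can get at most 3072 MB and on a 2x step at most 7128 MB, which is why 7128 was the highest value I tried. As a sanity check, I can add a throwaway step that prints the memory the docker-in-docker daemon actually sees (`docker info --format` is standard Docker CLI):

```yaml
- step:
    size: 2x
    services:
      - docker
    script:
      # Print the memory (in bytes) visible to the docker-in-docker daemon,
      # to confirm the definitions.services.docker.memory setting is applied.
      - docker info --format 'Daemon memory: {{.MemTotal}}'
```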