

Container 'Build' exceeded memory limit nodejs



[Screenshot attached: Screenshot 2022-07-18 at 2.38.46 PM.png]

I am trying to build and deploy my Next.js monorepo. I am not using any extra services, only the default Bitbucket Docker node image.

Why am I getting "Container 'Build' exceeded memory limit."? Is there any way to extend my build container's memory?

** App B is a smaller app and deployed with no problem; only App A failed.

below is my pipeline file:

image: node:16.16.0

definitions:
  caches:
    yarn: /usr/local/share/.cache/yarn
    nodecustom: ./node_modules
    nextcacheAppA: ./dist/apps/appA/.next/cache
    nextcacheAppB: ./dist/apps/appB/.next/cache

pipelines:
  default:   # surrounding keys were lost in the post's formatting; restored here for readability
    - parallel:
        - step:
            size: 2x
            caches:
              - yarn
              - nodecustom
              - node             # as posted; 'node' and 'nextcacheshell' are not defined above
              - nextcacheshell
            name: Build and Deploy App A SIT
            deployment: SIT
            script:
              - yarn install --frozen-lockfile && export NETLIFY_SITE_ID=$SIT_SITE_ID && npx netlify deploy --build --prod --site $SIT_SITE_ID --auth $NETLIFY_AUTH_TOKEN
        - step:
            name: Build and Deploy App B SIT
            script:
              - yarn install --frozen-lockfile && export NETLIFY_SITE_ID=$APPB_SIT_SITE_ID && npx netlify deploy --build --prod --site $APPB_SIT_SITE_ID --auth $NETLIFY_AUTH_TOKEN

1 answer

Theodora Boudale
Atlassian Team
Jul 19, 2022

Hi @andy_tsoi and welcome to the community.

I see that you have created a support ticket for this issue, but I wanted to leave a reply here as well for any other users who may come across your post facing the same issue.

  • Each step in a Pipelines build has 4 GB of memory available.
  • If you use size: 2x for a step, like in the yml file you posted, this step will have 8 GB of memory available. This is the maximum supported in Pipelines builds. Please note that 2x pipelines will use twice the number of build minutes.
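For readers unfamiliar with the option, this is a minimal sketch of where `size: 2x` sits in a `bitbucket-pipelines.yml`; the trigger section and step name here are illustrative, not from the original post:

```yaml
pipelines:
  default:
    - step:
        name: Build        # illustrative step name
        size: 2x           # doubles the step's memory from 4 GB to 8 GB (and doubles minutes billed)
        script:
          - yarn build
```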

  • You can debug the build locally as per this doc to check memory usage and see if you can configure your build to use less memory.
  • Once you have set it up and the build is running locally, you can use the docker stats command (run it in a separate terminal window) and print its output to a file in a loop:
while true; do date >> mem.txt && docker stats -a --no-stream >> mem.txt && sleep 2; done &
  • If certain steps need more than 8 GB of memory, you can look into using Runners for these steps, as with Runners you can configure up to 32GB (8x) of memory.
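As a rough illustration of the Runner option, a step can be routed to a self-hosted runner with `runs-on` and given a larger size. `self.hosted` and `linux` are the standard system labels; the step name is illustrative, and `size: 8x` assumes the runner machine actually has enough RAM:

```yaml
pipelines:
  default:
    - step:
        name: Memory-heavy build   # illustrative step name
        runs-on:
          - self.hosted
          - linux
        size: 8x                   # up to 32 GB of memory on a self-hosted runner
        script:
          - yarn build
```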

Kind regards,
