
Container 'Build' exceeded memory limit nodejs

andy.tsoi July 18, 2022

Hi,


I am trying to build and deploy my Next.js monorepo. I am not using any extra services, only the default Bitbucket Pipelines Docker Node image.

Why am I getting "Container 'Build' exceeded memory limit."? Is there any way to extend my build container's memory?

Note: App B is the smaller app and deployed with no problem; only App A failed.

Below is my pipeline file:

---
image: node:16.16.0

definitions:
  caches:
    yarn: /usr/local/share/.cache/yarn
    nodecustom: ./node_modules
    nextcacheAppA: ./dist/apps/appA/.next/cache
    nextcacheAppB: ./dist/apps/appB/.next/cache

pipelines:
  branches:
    master:
      - parallel:
          - step:
              size: 2x
              caches:
                - yarn
                - nodecustom
                - node
                - nextcacheshell
              name: Build and Deploy App A SIT
              deployment: SIT
              script:
                - yarn install --frozen-lockfile && export NETLIFY_SITE_ID=$SIT_SITE_ID && npx netlify deploy --build --prod --site $SIT_SITE_ID --auth $NETLIFY_AUTH_TOKEN
          - step:
              name: Build and Deploy App B SIT
              script:
                - yarn install --frozen-lockfile && export NETLIFY_SITE_ID=$APPB_SIT_SITE_ID && npx netlify deploy --build --prod --site $APPB_SIT_SITE_ID --auth $NETLIFY_AUTH_TOKEN

1 answer

Theodora Boudale
Atlassian Team
July 19, 2022

Hi @andy_tsoi and welcome to the community.

I see that you have created a support ticket for this issue, but I wanted to leave a reply here as well for any other users who may come across your post facing the same issue.

  • Each step in a Pipelines build has 4 GB of memory available.
  • If you use size: 2x for a step, as in the yaml file you posted, that step will have 8 GB of memory available. This is the maximum supported in standard Pipelines builds. Please note that 2x steps use twice the number of build minutes.
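For reference, a minimal sketch (step name and script are illustrative, not from the thread) of where size: 2x sits in a step definition:

```yaml
pipelines:
  default:
    - step:
        size: 2x    # doubles the step's memory to 8 GB (and doubles its build minutes)
        name: Build
        script:
          - yarn install --frozen-lockfile
```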

  • You can debug the build locally as per this doc to check memory usage and see whether you can configure your build to use less memory.
  • Once you have set it up and the build is running locally, you can use the docker stats command (run it in a separate window) and print its output to a file in a loop:
while true; do date >> mem.txt && docker stats -a --no-stream >> mem.txt && sleep 2; done &
  • If certain steps need more than 8 GB of memory, you can look into using Runners for these steps, as with Runners you can configure up to 32 GB (8x) of memory.
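As a hedged sketch of the Runners option (the runs-on labels below are common defaults and are an assumption, not from this thread), a step routed to a self-hosted runner with a larger size could look like:

```yaml
pipelines:
  default:
    - step:
        runs-on:          # route this step to a self-hosted runner
          - self.hosted
          - linux
        size: 8x          # up to 32 GB of memory on a self-hosted runner
        script:
          - yarn install --frozen-lockfile
```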

Kind regards,
Theodora
