
Assigning additional memory to Docker in a pipeline

itsjamesd October 9, 2019

Sorry if someone has asked this elsewhere, but I've not been able to find the answer to my issue.

I'm working with a repo that has lots of different pipelines defined for different circumstances. I'm getting a memory error on one of my builds: Container "Build" exceeded memory limit. I found a ticket on the Atlassian boards about this; it suggests increasing the amount of memory available to Docker, using the following syntax in the definitions block of the pipeline file:

services:
  docker:
    memory: 2048

So far so good, but it doesn't seem to have any effect on the build: I still get the same error. There's nowhere in the pipeline where Docker is declared as a service, just an initial step that gets the image, like this:

image: 'my-docker-hub/ngcli:ng8-latest'

(NB - until fairly recently, these builds worked fine, so I don't think it has to do with the way we're using Docker).

I've tried moving the image out from the root of the file and into the docker service definition, so I've got a block like this:

services:
  docker:
    image: 'my-docker-hub/ngcli:ng8-latest'
    memory: 2048

But that gets me this error:

The service "docker" is a system defined service. The "image" is not modifiable.

I don't really know what to do next. Can anyone help?

1 answer

0 votes
Peter Plewa
Atlassian Team
October 13, 2019

Hi @itsjamesd,

A step in Pipelines has 4GB of memory available by default. You can increase this to 8GB by running 2x builds: https://blog.bitbucket.org/2018/02/20/support-large-builds-bitbucket-pipelines/

When you add a service to a step, it will consume 1GB of memory by default. For example, a step with a single service leaves 3GB for the rest of the build. The build space must have at least 1GB available to run.

You can configure a service to use anywhere from 128 MB to 3072 MB (or 7128 MB for 2x builds). This allows you to dedicate less memory to services and more to the build space as required. Please see "Service memory limits" in our documentation for more information: https://confluence.atlassian.com/bitbucket/use-services-and-databases-in-bitbucket-pipelines-874786688.html
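Putting those limits together, a minimal sketch of a bitbucket-pipelines.yml might look like the following. The image name is taken from the question above; the pipeline name and build command are illustrative placeholders, not from the original thread:

```yaml
# Sketch only: image, step script, and repo layout are assumptions.
image: 'my-docker-hub/ngcli:ng8-latest'

definitions:
  services:
    docker:
      memory: 7128        # max for 2x steps; up to 3072 MB on regular (1x) steps

pipelines:
  default:
    - step:
        size: 2x          # doubles the step's memory from 4GB to 8GB
        services:
          - docker        # the docker service must be attached to the step
        script:
          - docker build -t my-app .
```

Note that the image stays at the top level (or on the step), while only the memory key is overridden for the system-defined docker service, which avoids the "image is not modifiable" error.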

Thanks,

Peter

marvec May 21, 2020

My build pipelines are constantly hitting out-of-memory errors. I've spent many days trying to tweak this, but I'm running out of ideas. I'm already using size: 2x and setting my Docker build process to the maximum allowed memory (7128 MB), and all I need to do is build the Docker image. The problem is that my Node.js application compiles inside that Docker build, and I can't squeeze the Node.js build under 2475 MB (with less, it fails on a segfault). I don't understand why Docker would need so much extra memory. In any case, it used to be quite stable until a week or two ago; now it's crashing frequently on OOM errors. It seems like I really can't fit within those 8GB. Is there a way to get some more memory? I already pay for build time, so I understand this would come at a cost; I just need the option. I don't want to migrate elsewhere again. Please!

