Sorry if someone has asked this elsewhere, but I've not been able to find the answer to my issue.
I'm working with a repo that has lots of different pipelines defined for different circumstances. I'm getting a memory error on one of my builds: Container "Build" exceeded memory limit. I can find a ticket on the Atlassian boards about this - it suggests upping the amount of memory available to Docker, using the following syntax in the definitions block of the pipeline file:
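The snippet in that ticket looked roughly like this (I'm reconstructing it from the docs, so the memory value is just an example figure, not something I've verified for my build):

```yaml
definitions:
  services:
    docker:
      memory: 2048  # example value - MB allocated to the docker service
```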
So far so good - but it doesn't seem to have any effect on the build, and I still get the same error. There's nowhere in the pipeline where Docker is declared as a service - just an initial step that gets the image, like this:
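Roughly like this (names trimmed down - the image and `my-app` are placeholders for what we actually use):

```yaml
image: atlassian/default-image:2  # placeholder - our actual build image

pipelines:
  default:
    - step:
        name: Build
        script:
          - docker build -t my-app .  # placeholder build command
```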
(NB - until fairly recently, these builds worked fine, so I don't think it has to do with the way we're using Docker).
I've tried moving the image out from the root of the file and into the docker services step, so I've got a block like this:
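So the block looked roughly like this (again, the image name and memory value are placeholders):

```yaml
definitions:
  services:
    docker:
      image: atlassian/default-image:2  # placeholder - this line triggers the error below
      memory: 2048
```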
But that gets me this error:
The service "docker" is a system defined service. The "image" is not modifiable.
I don't really know what to do next. Can anyone help?
A step in Pipelines has 4GB of memory available by default. You can increase this to 8GB by running 2x builds: https://blog.bitbucket.org/2018/02/20/support-large-builds-bitbucket-pipelines/
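For example, a step like this (the script contents are placeholders) would get the larger 8GB allocation:

```yaml
pipelines:
  default:
    - step:
        size: 2x  # doubles the step's memory from 4GB to 8GB
        script:
          - docker build -t my-app .  # placeholder
        services:
          - docker
```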
When you add a service to a step, it will consume 1GB by default. For example, a step with a single service leaves 3GB for the rest of the build; the build space must have at least 1GB to run.
You can configure a service to use anywhere from 128 to 3072 MB (or 7128 MB for 2x builds). This allows you to dedicate less memory to services and more to the build space as required. Please see "Service memory limits" in our documentation for more information: https://confluence.atlassian.com/bitbucket/use-services-and-databases-in-bitbucket-pipelines-874786688.html
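As a sketch, combining both settings in one file (the values here are the maximums, which you may not need):

```yaml
definitions:
  services:
    docker:
      memory: 7128  # maximum for the docker service on a 2x step

pipelines:
  default:
    - step:
        size: 2x    # required for a service memory above 3072 MB
        script:
          - docker build -t my-app .  # placeholder
        services:
          - docker
```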
My build pipelines are constantly hitting out-of-memory errors. I've spent many days trying to tweak this, but I'm running out of ideas. I'm already using size: 2x and setting my Docker build process to the maximum allowed memory (7128 MB); all I need to do is build the Docker image. The problem is that my node.js application compiles inside that docker build, and I can't squeeze node.js under 2475 MB (with less it fails with a segfault). I don't understand why Docker would need so much extra memory. In any case, it used to be quite stable until a week or two ago; now it's crashing frequently with OOM errors. It seems I really can't fit into those 8GB. Is there a way to get more memory? I already pay for build time, so I understand this would come at a cost - I just need the option. I don't want to migrate elsewhere again. Please!