Locally, on a laptop with 8 GB of RAM and a 2-core CPU, the tests pass easily.
However, the tests fail on Bitbucket with an out-of-memory error. The error message is:
"Container 'Build' exceeded memory limit."
Each step in a Pipelines build has 4 GB of memory available, unless you configure size: 2x, in which case the step has 8 GB of memory available.
The build container is given 1024 MB of the total memory. If you are using services in a certain step, each service gets 1024 MB of memory by default, unless you have configured the service memory differently in your bitbucket-pipelines.yml file.
If you subtract the above from the step's total memory, you get the memory available for the build itself to run.
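As a worked sketch of that arithmetic (the figures below assume a default-size step with one service at the default memory):

```shell
# Memory left for the build in a default-size step, per the figures above
STEP_TOTAL=4096        # 4 GB step; this would be 8192 with size: 2x
BUILD_CONTAINER=1024   # reserved for the build container
SERVICES=1024          # one service at the default 1024 MB
echo $((STEP_TOTAL - BUILD_CONTAINER - SERVICES))   # → 2048
```

So in this scenario the build itself has roughly 2 GB to work with, which is much less than the step's nominal 4 GB.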
Could you give us some more details about this build, e.g. are you using any services in the step that fails? And are you using the size: 2x flag in that step?
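For reference, both of those settings live in bitbucket-pipelines.yml. A minimal sketch (the service name and script command are placeholders for your own):

```yaml
definitions:
  services:
    docker:
      memory: 2048        # override the 1024 MB default for this service
pipelines:
  default:
    - step:
        size: 2x          # doubles the step's memory to 8 GB
        services:
          - docker
        script:
          - ./run-tests.sh   # placeholder for your test command
```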
You can try adding the following commands at the beginning of the script of the step that fails:
- while true; do ps -aux && sleep 30; done &
- while true; do echo "Memory usage in megabytes:" && echo $((`cat /sys/fs/cgroup/memory/memory.memsw.usage_in_bytes | awk '{print $1}'`/1048576)) && sleep 0.1; done &
These will print memory usage throughout the build, which can be useful for identifying which processes use a lot of memory.
You can also try debugging this locally with Docker, so that you don't consume more build minutes trying to debug this:
That documentation includes an example showing how you can set the memory limit when debugging locally, to replicate Pipelines as closely as possible (set 4 GB, or 8 GB if the step has size: 2x).
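A minimal local sketch of that idea, using the standard docker run flags --memory and --memory-swap (the image name below is a placeholder; substitute the image your step uses). The snippet prints the command so you can copy it:

```shell
# Mirror a Pipelines step's memory cap locally.
# Use MEM=8g instead if the failing step has size: 2x.
MEM=4g
CMD="docker run -it --rm --memory=$MEM --memory-swap=$MEM <your-build-image> /bin/bash"
echo "$CMD"
```

Setting --memory-swap equal to --memory disables swap inside the container, so the container is killed at the limit just as a Pipelines step would be.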
Please feel free to let me know if you have any questions.
Kind regards,
Theodora