I have a React app that has multiple steps in the pipeline. Two of these steps seem to require conflicting memory limits for the docker service.
The two steps are:
The first step runs static analysis using Sonarcloud:
- step: &StaticAnalysis
    name: Static Analysis with SonarCloud
    image: atlassian/default-image:2 # quickest image
    size: 2x
    script:
      - pipe: sonarsource/sonarcloud-scan:1.2.1
      - pipe: sonarsource/sonarcloud-quality-gate:0.1.4
This step fails with the error "Container 'docker' exceeded memory limit." unless I increase the docker service memory limit, like so:
definitions:
  services:
    docker:
      memory: 2048
However, when I increase the docker memory limit, the next step fails with "Container 'Build' exceeded memory limit." If I remove the above service definition, the build step passes fine.
I tried putting:
services:
  docker:
    memory: 2048
under the step, but this produced a YAML error from the pipeline saying that services is expected to be a list, not a map.
Is there a way for me to configure the docker memory limit based off the step?
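For context, a step-level `services` section only takes a list of service names that are defined under `definitions`; the memory itself can only be set in the shared definition. A minimal sketch of the expected shape (service and step names taken from the question above):

```yaml
definitions:
  services:
    docker:
      memory: 2048   # memory is set here, shared by every step that uses the service

pipelines:
  default:
    - step:
        name: Static Analysis with SonarCloud
        services:
          - docker   # step-level services is a list of names, not a map
```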
Hi @Shane McNamara,
I'm afraid that it is not possible to configure memory for services on a step level. The memory that you configure for a service will be the same in all steps that use this specific service (in this case docker).
We have a feature request for allowing configuration of service memory on a step level:
Since you'd be interested in that, I would suggest that you add your vote in the feature request (by selecting the Vote for this issue link) as the number of votes helps the development team and product managers better understand the demand for new features. You are more than welcome to leave any feedback, and you can also add yourself as a watcher (by selecting the Start watching this issue link) if you'd like to get notified via email on updates.
Implementation of new features is done as per our policy here and any updates will be posted in the feature request.
You included in your question the definition of the first step, and I see that it includes size: 2x so it has 8GB of memory in total. If I understand correctly, the docker service needs 2048 MB of memory for this step to work, and if you allocate less than 2048 MB the step fails, is this the case?
Regarding the second step, I assume it also uses the docker service? Does this second step succeed when you allocate to the service less than 2048 MB of memory, and fail when you allocate 2048 MB? Are you using size: 2x for this step as well? I just want to make sure I understand what happens, so we can see if there is a way around this.
Kind regards,
Theodora
Hi Theodora,
Both steps are using `size: 2x`.
Step 1 requires 2048 MB of memory and fails with less.
Step 2 requires less than 2048 MB of memory and fails with more.
Here's what the YAML looks like, in part:
definitions:
  services:
    docker:
      memory: 2048
  caches:
    sonar: ~/.sonar/cache # Caching SonarCloud artifacts will speed up your build
  steps:
    - step: &StaticAnalysis
        name: Static Analysis with SonarCloud
        image: atlassian/default-image:2 # quickest image
        size: 2x
        script:
          - pipe: sonarsource/sonarcloud-scan:1.2.1
          - pipe: sonarsource/sonarcloud-quality-gate:0.1.4
        caches:
          - docker
        services:
          - docker
    - step: &BuildDeploy
        name: Build and Deploy
        script:
          - npm run build
I'm not explicitly setting the docker service in the second step (BuildDeploy), but the outcome of the step changed when I modified the docker service memory definition.
I was able to get the BuildDeploy step to work by disabling Source Maps in my build process, but I would really prefer to have them.
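If the app happens to be built with Create React App (an assumption on my part; the question only says it's a React app), source maps can be toggled per-build with the `GENERATE_SOURCEMAP` environment variable rather than being removed from the build config, e.g.:

```yaml
- step: &BuildDeploy
    name: Build and Deploy
    script:
      # GENERATE_SOURCEMAP=false (a Create React App setting) skips source-map
      # generation, which lowers peak memory; drop the variable to get maps back
      - GENERATE_SOURCEMAP=false npm run build
```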
Hi @Shane McNamara,
Thank you for the info.
If the docker service is not used in the step "Build and Deploy", it shouldn't affect the memory for that step.
Right now the docker service has a memory of 2048 MB, and you have disabled Source Maps in your build process. Does the build step still have size: 2x? (Just checking, because I don't see it in the part of the YAML you copy-pasted here.)
My suggestion for troubleshooting this would be to include the following commands at the beginning of the script of the build step:
- while true; do ps -aux && sleep 30; done &
- while true; do echo "Memory usage in megabytes:" && echo $((`cat /sys/fs/cgroup/memory/memory.memsw.usage_in_bytes | awk '{print $1}'`/1048576)) && sleep 0.1; done &
Enable source maps as well. The commands above will print memory usage in the Pipelines log while the step executes, which will give us some insight into which processes consume a lot of memory during this step and cause it to fail.
Please feel free to attach here the Pipelines log (or part of it, sanitizing any sensitive info) so we can check the memory usage.
Kind regards,
Theodora
Hi @Theodora Boudale
Sorry for the delay. I've run the pipeline with the script steps you suggested. Some observations:
Thanks for all the help
@Theodora Boudale
Something interesting has happened, regarding the docker service. When I'm running both steps, only the steps that explicitly set:
services:
  - docker
receive the new docker memory limit. However, if I'm only running a single step, the step receives the docker limit, even if I have not set the service.
With this YAML:
definitions:
  services:
    docker:
      memory: 4096
  caches:
    sonar: ~/.sonar/cache # Caching SonarCloud artifacts will speed up your build
  steps:
    - step: &StaticAnalysis
        name: Static Analysis with SonarCloud
        image: atlassian/default-image:2 # quickest image
        size: 2x
        script:
          - pipe: sonarsource/sonarcloud-scan:1.2.1
          - pipe: sonarsource/sonarcloud-quality-gate:0.1.4
        caches:
          - docker
        services:
          - docker
    - step: &BuildDeploy
        name: Build and Deploy
        size: 2x
        script:
          - npm run build
The pipeline will pass when both are run sequentially (StaticAnalysis, followed by BuildDeploy), even when "BuildDeploy" is showing 6+GB of memory usage.
When I run just the "BuildDeploy" step, the build fails with the "Container 'Build' exceeded memory limit." error when the step hits 4 GB of memory usage (which implies to me that the 4 GB docker limit is being applied).
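The numbers are consistent with the step's total memory being split between the attached service and the build container (my assumption based on the behaviour above, ignoring any system reservation, not an official statement): with size: 2x the step has 8192 MB in total, so a 4096 MB docker service would leave 4096 MB for the build container, which is exactly where the single-step run dies:

```shell
# Hypothetical memory accounting for a 2x step with a 4096 MB docker service
TOTAL_MB=8192    # size: 2x step
DOCKER_MB=4096   # definitions.services.docker.memory
BUILD_MB=$((TOTAL_MB - DOCKER_MB))
echo "Build container limit: ${BUILD_MB} MB"   # prints "Build container limit: 4096 MB"
```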
Third update:
I have two branches running identical code/pipelines, and one fails consistently while the other passes consistently. Any idea what's going on here? Is there some branch- or commit-specific caching happening?
Hi, @Shane McNamara!
Theoretically, since you are not using a docker service in &BuildDeploy, it should give you the full 7 GB for the build container. In this case, we will need some additional details from you to further investigate this issue; that would give us a better understanding of this case. Thank you!
Kind regards,
Caroline
Hi @Caroline R
So it appears as if the size parameter is not being applied? The only difference between the two bottom screenshots is that they were run on different branches.
Hi, @Shane McNamara!
Thanks for getting back to us and for providing the additional details. In order to further investigate this issue, we'll need to analyze this YAML, so I created a ticket on your behalf with our support team. You should receive an email with this info, and we'll contact you to work on this case.
Please let me know if you have any questions.
Kind regards,
Caroline