Our pipeline consists of a dozen steps, and one step, which builds a Next.js production Docker image, requires a lot of memory. So I added size: 2x to the options section of the bitbucket-pipelines.yml file and set definitions.services.docker.memory to 4096. Everything works, but now all my steps run at 2x. I can't set 1x on the other steps; when I do, the pipeline throws an error:
A step does not have the minimum resources needed to run (1024 MB). Services on the current step are consuming 4096 MB
It looks like the Docker memory limit is applied to all steps, but I only need it for one. How can I give 4096 MB to the docker service in only one step of the pipeline?
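For reference, a minimal sketch of the configuration described above (step names and the build command are placeholders, not from the actual file):

```yaml
# bitbucket-pipelines.yml (illustrative sketch)
options:
  size: 2x            # applies to every step in the pipeline

definitions:
  services:
    docker:
      memory: 4096    # docker service memory, shared by all steps that use it

pipelines:
  default:
    - step:
        name: Build Next.js image
        services:
          - docker
        script:
          - docker build -t my-app .
```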
Hi @Bogdan and welcome to the community.
While it is possible to set the 2x option for a specific step only, as follows:
- step:
    size: 2x # Double resources available for this step.
    script:
      - echo "Build step"
I'm afraid that it is not possible to set different service memory for different steps. If you set the service's memory to 4096 MB, this will apply for any step where the service is used.
If you're using the docker service only in the step that requires a lot of memory, you could add the 2x option only for this step. However, if the docker service is used by other steps without the 2x option, then you will get an error.
If the service needs 4096 MB of memory and is used in more than one step, then you'd need the 2x option in all the steps where the service is used.
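Putting that together, a valid configuration would look something like the following sketch (step names and scripts are illustrative). The key point is that every step using the 4096 MB docker service carries size: 2x, while steps that don't use the service can stay at the default size:

```yaml
definitions:
  services:
    docker:
      memory: 4096

pipelines:
  default:
    - step:
        name: Lint            # no docker service, default 1x size is fine
        script:
          - npm run lint
    - step:
        name: Build image     # uses the 4096 MB docker service, so it needs 2x
        size: 2x
        services:
          - docker
        script:
          - docker build -t my-app .
```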
We have a feature request to allow users to configure memory for services on a step level:
If you'd be interested in that, I would suggest that you add your vote in that feature request (by selecting the Vote for this issue link) as the number of votes helps the development team and product managers better understand the demand for new features. You are more than welcome to leave any feedback, and you can also add yourself as a watcher (by selecting the Start watching this issue link) if you'd like to get notified via email on updates.
Implementation of new features is done as per our policy here and any updates will be posted in the feature request.
Please feel free to let me know if you have any questions.
The only cases I can think of that a step would be affected by the docker service memory definition are:
- when a step lists the service under `services`, e.g. `services: [docker]`
- when a step uses a pipe:
Pipes need a Docker service to run and they will use a Docker service even if you don't define one in the step that uses the pipe. Is the step you are referring to using a pipe?
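For example, a step like the following sketch uses a Docker service implicitly, even though none is declared (the pipe, its version, and the variables are illustrative):

```yaml
- step:
    name: Deploy
    script:
      # Pipes run as Docker containers, so this step consumes the
      # docker service memory even without "services: - docker"
      - pipe: atlassian/scp-deploy:1.2.1
        variables:
          USER: $SSH_USER
          SERVER: $SSH_HOST
          REMOTE_PATH: /var/www/app
```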
If you don't mind sharing your yml file publicly, you can create a new question in community and we can look into your specific case.
Alternatively, since you're a member of a workspace on a paid billing plan, you can create a ticket with Bitbucket Cloud support team to look into this. The support ticket will be visible to you and Atlassian staff, so you can share the URL of your affected build(s) and also which steps seem to be affected and the engineer working on your ticket can look into it. You can create a support ticket via https://support.atlassian.com/contact/#/, in "What can we help you with?" select "Technical issues and bugs" and then Bitbucket Cloud as product.
If you have any questions, please feel free to let me know.
Hello, and thank you for the reply.
Yes, one of the steps is indeed using a pipe. Since I did not understand that pipes implicitly use the Docker service, I assumed the step was unaffected.
What is the correct approach here? I know I can define an alternative Docker service with custom memory, but I don't understand how to keep caching working.
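For context, the alternative-service approach mentioned above would look roughly like this sketch: defining a second Docker-in-Docker service with a custom name (the name `docker-large` is illustrative) via `type: docker`, so only the heavy step pays the 4096 MB cost. Whether the docker layer cache carries over between differently named services is exactly the open question here:

```yaml
definitions:
  services:
    docker-large:        # illustrative name for the high-memory service
      type: docker       # marks this custom-named service as Docker-in-Docker
      memory: 4096

pipelines:
  default:
    - step:
        name: Build image
        size: 2x
        services:
          - docker-large
        script:
          - docker build -t my-app .
```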