YAML Anchor step failing because an unrelated service has too much memory assigned

kbaley January 7, 2024

I have the following bitbucket-pipelines.yml (modified and redacted for clarity):

 

definitions:
  services:
    docker:
      memory: 7168
  steps:
    - step: &buildApi
        name: Build the API
        image: mcr.microsoft.com/dotnet/sdk:6.0
        script:
          - apt-get update && apt-get install zip -y
          - cd api
          - dotnet restore
          - dotnet build --no-restore
          - dotnet publish
          - cd MyApi/bin/Debug/net6.0/publish
          - zip -r ../../../../../../api.zip ./**
        artifacts:
          - api.zip
    - step: &deployApi
        name: Deploy API to Azure
        script:
          - pipe: microsoft/azure-web-apps-deploy:1.0.0
            variables:
              AZURE_APP_ID: $MOO
              AZURE_PASSWORD: $MOO
              AZURE_TENANT_ID: $MOO
              AZURE_RESOURCE_GROUP: 'my-group'
              AZURE_APP_NAME: 'my-app'
              ZIP_FILE: 'api.zip'

pipelines:
  custom:
    build-and-deploy-docker-images:
      - step:
          size: 2x
          image: node:18-alpine
          caches:
            - node
          script:
            - docker build -t $MYREGISTER/moo:$BITBUCKET_BUILD_NUMBER -t $MYREGISTER/moo:latest .
            - docker push $MYREGISTER/moo:$BITBUCKET_BUILD_NUMBER
            - docker push $MYREGISTER/moo:latest
          services:
            - docker
    deploy-api:
      - step: *buildApi
      - step: *deployApi

 

Basically, two distinct pipelines: one to build a Docker image and one to deploy to Azure. The Docker one requires some extra memory because we were occasionally hitting the limits with the default size (3072, I think), so we bumped it and made that pipeline a 2x one. The deploy-api step normally runs fine within the default-sized container.

The Docker pipeline is new, and since adding it, the deploy-api step fails with this error:

The Deploy API to Azure step doesn’t have enough memory to run. It is a 1x size step with 4096MB of memory, and 7168MB is allocated to defined step services. 1024MB is required to run the step scripts. You will need to reduce the memory requirements of the services on the step by 4096MB.

My reading of this is that it thinks the deployApi step requires the docker service, which it doesn't. Oddly, it's failing on the second named step in the pipeline, so I'm wondering if something in the microsoft/azure-web-apps-deploy pipe is causing this?

1 answer

Answer accepted
kbaley January 7, 2024

To answer my own question, I think it was a naming conflict with the azure-web-apps-deploy pipe. I changed the definition to:

  services:
    docker-7g:
      type: docker
      memory: 7168

Then I updated the step that referenced the service to use docker-7g as the name, and it worked.
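
For reference, here is a minimal sketch of how the renamed service plugs in (same redacted names as my original config). Only the Docker build step references docker-7g; the deploy-api steps are unchanged, so the pipe's implicit docker service falls back to its default memory:

definitions:
  services:
    docker-7g:
      type: docker
      memory: 7168

pipelines:
  custom:
    build-and-deploy-docker-images:
      - step:
          size: 2x
          image: node:18-alpine
          script:
            - docker build -t $MYREGISTER/moo:$BITBUCKET_BUILD_NUMBER .
            - docker push $MYREGISTER/moo:$BITBUCKET_BUILD_NUMBER
          services:
            - docker-7g   # no longer collides with the implicit "docker" service used by pipes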

Patrik S
Atlassian Team
January 9, 2024

Hello @kbaley,

Great that you were able to solve the issue!

Just sharing some context here: Bitbucket pipes use a Docker service under the hood. So although you don't explicitly define a docker service in the step where you use the pipe, Bitbucket automatically adds one in the background, as Docker is required for the pipe to run.
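
In other words, a step that runs a pipe behaves roughly as if the docker service were declared explicitly. This is a hypothetical expansion just to illustrate; Pipelines does it for you:

- step:
    name: Deploy API to Azure
    script:
      - pipe: microsoft/azure-web-apps-deploy:1.0.0
        variables:
          AZURE_APP_ID: $MOO
          # ...remaining variables as in the original config
    services:
      - docker   # added implicitly by Pipelines because the step runs a pipe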

That being said, the step where you were using the pipe was size: 1x (which has 4GB of memory available), and the docker service (used by the pipe) was configured to use 7GB of memory. So when the pipeline started, the step had 4GB available and the docker service tried to claim 7GB of it, which caused the memory error you reported.
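
To make the arithmetic concrete, here is how the numbers from the error message line up (the 8192MB total for 2x steps comes from the standard step sizes, not from the error itself):

1x step total memory:           4096 MB
minimum reserved for scripts:   1024 MB
available for step services:    3072 MB
docker service requested:       7168 MB   -> 7168 - 3072 = 4096 MB too much, as the error reports

On a 2x step (8192 MB total), the 7168 MB service fits with exactly the 1024 MB minimum left over for the step scripts.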

Defining a docker service with a different name, as you did, is one of the solutions to this issue. You can learn more about how memory is allocated in pipelines in the following article:

Thank you, @kbaley !

Patrik S

