
Bitbucket pipeline docker build hanging when out of memory

josh_hayden
Contributor
July 10, 2024

Hi, I'm trying to put together a Bitbucket pipeline that does a docker build. I was surprised that a build which takes 5 minutes locally was getting stuck for 27 minutes in the Bitbucket pipeline.

When searching for a possible cause, I came across this article: https://confluence.atlassian.com/bbkb/bitbucket-pipeline-execution-hangs-on-docker-build-step-1189503836.html

 

Sure enough, I increased the docker memory limit in my bitbucket-pipelines.yml file, and it now runs as expected.
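
For anyone else hitting this, the change looks roughly like the following in bitbucket-pipelines.yml (the memory value and image name here are just examples, not my exact settings):

definitions:
  services:
    docker:
      memory: 3072   # raise the docker service memory above the 1024 MB default

pipelines:
  default:
    - step:
        services:
          - docker
        script:
          - docker build -t my-image .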

I consider this a serious bug in Bitbucket Pipelines: running out of memory during docker build causes the pipeline to hang rather than fail immediately. In fact, I will be looking to move away from Bitbucket Pipelines unless this is resolved. I can't afford to have a pipeline hang indefinitely because Bitbucket doesn't correctly detect that Docker has run out of memory.

Anyone else facing this or know if Atlassian is planning to fix this?

2 answers

1 accepted

1 vote
Answer accepted
Patrik S
Atlassian Team
July 11, 2024

Hello @josh_hayden, and thank you for reaching out to the Community!

I see you opened a support ticket with us, but I would just like to clarify here in the Community as well, in case other users come across the same issue.

Bitbucket Pipelines does try to identify when the build container or any service container (such as the docker service) exceeds its memory limit. When that is detected, the build is stopped immediately and a message similar to the one below is displayed:

Container 'docker/build' exceeded the memory limit

Those scenarios are discussed in the Troubleshooting Pipelines article.

However, detecting those scenarios depends very much on the commands you are running. In some cases, instead of aborting when it hits the memory limit, the command hangs and keeps trying to allocate more memory without ever returning an exit code that Pipelines could use to detect the problem, which leads to the build appearing to "hang."

This behavior is usually related to memory pressure, so after increasing the memory you shouldn't experience "stuck" pipelines going forward.

However, to avoid consuming too many build minutes when such a situation does happen, you can also implement a custom timeout for your steps using the max-time option in the step definition. Once a step reaches its max-time, it is shut down with a timeout.
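
As a rough illustration (the step name, timeout value, and build command are placeholders to adapt to your own pipeline), a step with a custom timeout could look like this:

pipelines:
  default:
    - step:
        name: Build image
        max-time: 20        # stop the step after 20 minutes
        services:
          - docker
        script:
          - docker build -t my-image .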

I hope that helps! Should you have any questions, feel free to ask.

Thank you, @josh_hayden !

Patrik S

0 votes
Nicolas Grossi
Banned
July 10, 2024

@josh_hayden You might contact support.atlassian.com, as you have a Standard license.

 

Nicolas

Gajesh Bhat
Contributor
July 10, 2024

Hello @josh_hayden. Welcome to the Bitbucket community. Have you tried increasing the memory at the step level and the step size? If you can post a redacted version of your pipelines file, I can help probe this issue further. You need to run

- export DOCKER_BUILDKIT=0

before building the Docker image. Docker-in-Docker is the preferred way to build Docker images using Bitbucket Pipelines. A rough sketch of these suggestions is below.
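
Something along these lines (the step size, memory value, and image name are placeholders; adjust them to your build):

pipelines:
  default:
    - step:
        name: Build image
        size: 2x              # double the step memory
        services:
          - docker
        script:
          - export DOCKER_BUILDKIT=0
          - docker build -t my-image .

definitions:
  services:
    docker:
      memory: 6144            # docker service memory, allowed up to 7128 MB on a 2x step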
