Docker build failing in bitbucket pipeline

naorsa
I'm New Here
February 26, 2024

Hi,

I am using the aws-sdk package in my application.

When I import the S3 object from the package, the Docker build in Bitbucket Pipelines fails with a SIGKILL error.

I don't know why this is happening; on my local machine the Docker build completes fine.

2 answers

1 accepted

1 vote
Answer accepted
Theodora Boudale
Atlassian Team
February 28, 2024

Hi @naorsa and welcome to the community!

I've seen this error occur when the memory available to the Docker service is not enough, so that is a possible cause.

To use docker commands in your Pipelines build, you define a docker service in your bitbucket-pipelines.yml file. Regular steps have 4096 MB of memory in total, and each service gets 1024 MB by default. If there are no other services in that step, you can increase the docker service's memory to up to 3072 MB by adding a definition like the following to your yml file:

definitions:
  services:
    docker:
      memory: 2048

Our documentation for reference:

If a step has size: 2x, the build container has 8192 MB memory in total and services can be configured to use up to 7128 MB memory (in total).
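For context, here is a minimal sketch of how this could look in a full bitbucket-pipelines.yml when combining a 2x step size with increased docker service memory; the step name, image tag, and memory value are placeholders, assuming docker is the only service in the step:

definitions:
  services:
    docker:
      memory: 7128   # up to 7128 MB is possible only when the step uses size: 2x

pipelines:
  default:
    - step:
        name: Build Docker image   # hypothetical step name
        size: 2x                   # doubles the step's total memory to 8192 MB
        services:
          - docker
        script:
          - docker build -t my-app .   # placeholder image tag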

Please feel free to let me know if that helps or if you need further assistance.

Kind regards,
Theodora

0 votes
naorsa
I'm New Here
March 3, 2024

Hey Theodora,

Thank you very much, it helped and the problem is solved!
