Failed to upload files to Bitbucket Downloads

Rex Cheung March 4, 2024

My pipeline uses the atlassian/bitbucket-upload-file 0.7 pipe to upload artifacts to Bitbucket Downloads.

I get the error "Container 'docker' exceeded memory limit." ("error waiting for container: unexpected EOF" in the build log) when the total size of the artifacts reaches around 510 MB (across 5 files).

Is it possible to fix this issue, or do I need to write my own script that uploads the files in a more memory-efficient way?

1 answer

Answer accepted · 1 vote
Atlassian Team
March 5, 2024

Hi Rex,

I've noticed you have raised this in the community, yet you have access to a Standard workspace that is entitled to Standard-level support from our team. In the future, please raise a ticket directly with us, listing your workspace URL and selecting Bitbucket Cloud as the product with Technical Issues & Bugs as the reason.

If you are receiving a Pipelines error that you have run out of memory, chances are you are not assigning enough memory to that step. Please refer to our documentation for steps to resolve this, particularly the Service memory limits section, which outlines how memory is allocated to your builds.
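As a rough illustration of the memory advice above, a step's allocation can be doubled with the `size: 2x` option in `bitbucket-pipelines.yml`. This is a minimal sketch, assuming the upload runs in its own step; the step name, file name, and credential variables are placeholders, not values from the original post:

```yaml
# Hedged sketch -- names and variables below are illustrative.
pipelines:
  default:
    - step:
        name: Upload artifacts
        size: 2x          # doubles the memory available to this step
        script:
          - pipe: atlassian/bitbucket-upload-file:0.7.0
            variables:
              BITBUCKET_USERNAME: $BB_USERNAME
              BITBUCKET_APP_PASSWORD: $BB_APP_PASSWORD
              FILENAME: 'artifact.tar.gz'
```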

If you still encounter issues, please raise a ticket directly with support as above so that we may assist you further. If you cannot raise a ticket, let me know and I will raise one on your behalf.


- Ben (Bitbucket Cloud Support)

Rex Cheung March 5, 2024

Thanks, I have raised a ticket in your system.

Honestly, I wouldn't expect that uploading files with the official upload pipe should require adjusting the allowed memory.

Rex Cheung March 8, 2024

The support engineer's recommendation is to increase the allowed memory or to use curl instead of the pipe. I have no idea whether the pipe will be improved to allow uploading large files.
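For anyone taking the curl route: the Bitbucket Cloud REST API accepts multipart uploads to the repository Downloads endpoint, and curl streams the file from disk rather than buffering it, so memory use stays flat even for large artifacts. This is a sketch only; the workspace, repository slug, and credential variables are placeholders:

```shell
#!/bin/sh
# Hedged sketch: upload an artifact to Bitbucket Downloads with plain curl
# instead of the bitbucket-upload-file pipe.
# WORKSPACE, REPO, and the credentials are placeholders, not real values.
WORKSPACE="myworkspace"
REPO="myrepo"
URL="https://api.bitbucket.org/2.0/repositories/${WORKSPACE}/${REPO}/downloads"

upload() {
  # -F streams the file as multipart/form-data; repeat -F to send more files.
  curl --fail -u "${BB_USERNAME}:${BB_APP_PASSWORD}" \
    -X POST "${URL}" -F "files=@$1"
}

# Print the target endpoint so the script can be sanity-checked without
# network access; call `upload artifact.tar.gz` inside a real pipeline step.
echo "${URL}"
```

In a pipeline, `BB_USERNAME` and `BB_APP_PASSWORD` would typically be secured repository variables holding an app password with repository write permission.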
