My pipeline uses the atlassian/bitbucket-upload-file pipe (version 0.7) to upload artifacts to Bitbucket Downloads.
I got the error "Container 'docker' exceeded memory limit." ("error waiting for container: unexpected EOF" in the build log) once the total size of the artifacts reached around 510 MB (across 5 files).
Is it possible to fix this issue, or do I need to write my own script that uploads the files in a more memory-efficient way?
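For reference, the failing step is configured roughly like this (a sketch: the filenames are placeholders, and I'm assuming the 0.7.0 tag of the pipe):

```yaml
pipelines:
  default:
    - step:
        name: Upload artifacts to Downloads
        script:
          # One pipe invocation per artifact; each invocation runs in its own Docker container.
          - pipe: atlassian/bitbucket-upload-file:0.7.0
            variables:
              BITBUCKET_USERNAME: $BITBUCKET_USERNAME
              BITBUCKET_APP_PASSWORD: $BITBUCKET_APP_PASSWORD
              FILENAME: artifact-1.zip
          # ...repeated for the remaining four files.
```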
Hi Rex,
I've noticed you have raised this in the community, but you have access to a Standard workspace, which is entitled to Standard-level support from our team. In future, please raise a ticket directly with us, including your workspace URL and selecting Bitbucket Cloud as the product and Technical Issues & Bugs as the reason:
If you are receiving a Pipelines error that you have run out of memory, chances are you are not assigning enough memory to that step. Please refer to our documentation for steps to combat this, particularly the Service memory limits section, which outlines how memory is allocated to your builds:
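For example, a sketch only (exact limits depend on your plan and step size), you can double the step size and allocate more memory to the Docker service, which is what runs pipes, in your bitbucket-pipelines.yml:

```yaml
pipelines:
  default:
    - step:
        size: 2x              # doubles the step's total memory (4096 MB -> 8192 MB)
        services:
          - docker            # pipes run inside the Docker service container
        script:
          - pipe: atlassian/bitbucket-upload-file:0.7.0
            variables:
              BITBUCKET_USERNAME: $BITBUCKET_USERNAME
              BITBUCKET_APP_PASSWORD: $BITBUCKET_APP_PASSWORD
              FILENAME: artifact-1.zip

definitions:
  services:
    docker:
      memory: 5120            # MB for the Docker service; the default is 1024
```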
If you still encounter issues, please raise a ticket directly with support as above so that we may assist you further. If you cannot raise a ticket, let me know and I will raise one on your behalf.
Cheers!
- Ben (Bitbucket Cloud Support)
Thanks, I have raised a ticket in your system.
Actually, I wouldn't expect uploading files with the official upload pipe to require an adjustment to the allowed memory.
The recommendation from the support person is to increase the allowed memory or to use curl directly (not the pipe). I have no idea whether the pipe will be improved to allow uploading large files.
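In case it helps anyone else, the curl approach would look roughly like this (a sketch: the workspace, repo slug, and filename are placeholders). curl streams the file from disk, so memory usage stays flat regardless of file size:

```yaml
pipelines:
  default:
    - step:
        name: Upload artifacts via curl
        script:
          # POST each file to the repository's Downloads endpoint.
          - >
            curl -sS -X POST
            --user "$BITBUCKET_USERNAME:$BITBUCKET_APP_PASSWORD"
            --form files=@artifact-1.zip
            "https://api.bitbucket.org/2.0/repositories/<workspace>/<repo-slug>/downloads"
```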