Is there an option to set up artifact storage for step artifacts?

Good day

Is there any way to specify a custom server for artifact storage? That could significantly speed up the build process.

Another nice-to-have option would be an artifact download specification, which could be extremely useful in parallel builds. Currently, subsequent steps download all artifacts from all previous steps, which causes unnecessary network and I/O overhead.


Thank you for your product; I hope it will grow over time!

Answer accepted

Hi Alexey, welcome to the community!

It's good to hear you're enjoying our product and thank you for the feedback.

I'm afraid that it is not possible to set up a custom server for artifact storage for files defined as artifacts in the yaml file. You could upload any such files to another server during the step that generates them, and then download them from that server in subsequent steps. However, this approach would not involve defining the files as artifacts in the yaml file.

May I ask why you're looking for this feature? What would be the benefit of having a custom server for artifacts? I'm just trying to understand your use case a bit better.

It is possible to prevent a step from downloading artifacts by adding the following in the yaml file for this step:

download: false

You can see an example on this documentation page. Is this what you are looking for?
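As a minimal sketch of where that option fits, a step that skips artifact download could look like this (the step names and script commands below are placeholders):

```yaml
pipelines:
  default:
    - step:
        name: Build
        script:
          - ./build.sh            # placeholder build command
        artifacts:
          - dist/**               # files passed on to later steps
    - step:
        name: Lint                # does not need the build output
        artifacts:
          download: false         # skip downloading artifacts from previous steps
        script:
          - ./lint.sh             # placeholder
```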

Kind regards,

Hi Theodora!


Alas, the solution you offer is not suitable for me, since it can't be used effectively with parallel steps. I'd like to build several Docker containers with individual services, then push them to a registry. I thought of creating parallel builds for each of these containers, and then publishing each artifact to the registry separately. This could potentially speed up the build.

Such a scenario requires specifying which artifacts should be fetched from storage, to avoid downloading all artifacts from all parallel steps at once.
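To illustrate, here is a rough sketch of the pipeline I have in mind (the service names, registry host, and build contexts are placeholders):

```yaml
pipelines:
  branches:
    master:
      - parallel:
          - step:
              name: Build and push service-a
              services:
                - docker
              script:
                - docker build -t registry.intranet.local/service-a ./service-a
                - docker push registry.intranet.local/service-a
          - step:
              name: Build and push service-b
              services:
                - docker
              script:
                - docker build -t registry.intranet.local/service-b ./service-b
                - docker push registry.intranet.local/service-b
```

Each parallel step here only needs its own files, not the artifacts produced by its siblings.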

Maybe you have already guessed why I am asking about designated artifact storage in the intranet: all this byte-sending activity can be quite time- and traffic-consuming. Actually, I am pretty sure no one needs these artifacts outside of our intranet.


Another downside of parallel builds is that they all start at once and all try to fetch the same data from the internet. This is quite inefficient, since these threads clash with each other and fight for bandwidth. It would be useful to start them not all at once but sequentially, with a 10-30 second delay; that could help with bandwidth and processor utilization.


I thought about manual artifact upload, but I hoped this was an already-solved problem requiring only some minor setup, like a Docker service in the intranet.

Hi Alexey,

I'm afraid that your use case is not very clear to me.

When you say 'artifacts', are you referring to files that are generated during your build, and that you have defined as artifacts in your yaml file as per this documentation?
Or are you referring instead to files you download from a third-party server during your build?

It would be useful to have an example of the yaml file you are using (you can mask any private/sensitive info) and an explanation of which artifacts you are referring to in this yaml file, so we can better help you.

Kind regards,

Good evening Theodora


I do not download any files during my builds, except for NuGet/npm packages.

When I refer to 'artifacts', I mean something that was produced by the pipeline and that may need to be transferred to another build step. The standard way to do this is to mark the build result as an artifact in the 'artifacts' section of the yaml file.

But the tricky thing is that some artifacts are required by one step and others by another. E.g., I may have several pipelines inside one file. The first is a pull-request-triggered build. The second is a master-merge-triggered build. The third is a custom step which performs not only the build, but the publish as well.

To avoid duplicating code, I made step definitions which are used in all three scenarios. The first step is setup, the second builds, the third publishes the containers built in the 2nd step to the registry, and the fourth deploys the containers published in the third step to the environment.

In this scenario I need to transfer some setup (a .bashrc file, to be precise) from the 1st step to all other steps, tar-red containers from the 2nd step to the 3rd, and a deploy script from the 1st step to the 4th.

Currently all files are marked as artifacts and are retained incrementally between steps (everything produced in the 1st step transfers to the 2nd, then everything from the 2nd transfers to the 3rd, and so on), thus increasing the load on artifact storage and the network. And since the tar-red containers are very heavy (I have 5 artifacts with a total size of 500 MiB) and they are transferred to steps where I do not need them (but need other artifacts), I'd like an option to turn off artifact downloading for some files. Or, perhaps, to move artifact storage closer to the build servers.
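Roughly, my setup looks like this (the script names are simplified placeholders); note how the artifact lists accumulate from step to step:

```yaml
definitions:
  steps:
    - step: &setup
        name: Setup
        script:
          - ./generate-env.sh      # placeholder; produces .bashrc and deploy.sh
        artifacts:
          - .bashrc
          - deploy.sh
    - step: &build
        name: Build
        script:
          - source .bashrc
          - ./build-containers.sh  # placeholder; saves images as tar files
        artifacts:
          - images/*.tar           # heavy, ~500 MiB in total

pipelines:
  pull-requests:
    '**':
      - step: *setup
      - step: *build
  branches:
    master:
      - step: *setup
      - step: *build
```

The later publish and deploy steps receive everything listed above, including the heavy tar files, even when they only need deploy.sh.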


That was my intention.

Does that make sense to you now?

Norbert C Atlassian Team Nov 23, 2021

Hi Alexey,

Let me step in, I'm Norbert and I'm also a Bitbucket Cloud Support Engineer, it's nice to meet you!

Thank you for your detailed explanation; now we have a better understanding of what you would like to achieve.

I would like to inform you that, unfortunately, what you would like to achieve is not feasible at this moment, because Bitbucket Pipelines has only two options with regard to artifacts: either download the artifacts or don't. We don't yet have functionality that allows users to decide which artifacts should be downloaded in a step.

Currently we have the following two options:

 download: false # do not download artifacts in this step

 download: true # download ALL the artifacts in this step

In your use case, the only workaround I can think of is to upload the artifacts at the end of the script to your internal/external server and, once those artifacts are needed, download them from your server/storage.
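As a rough sketch of that workaround (the server URL, credential variables, and script names below are placeholders you would replace with your own):

```yaml
- step:
    name: Build
    script:
      - ./build.sh                 # placeholder; produces service.tar
      # upload to your own server instead of relying on Pipelines artifacts
      - curl -u "$ARTIFACT_USER:$ARTIFACT_PASS" -T service.tar "https://artifacts.example.internal/$BITBUCKET_BUILD_NUMBER/service.tar"
- step:
    name: Publish
    artifacts:
      download: false              # skip downloading all Pipelines artifacts
    script:
      # fetch only the file this step actually needs
      - curl -u "$ARTIFACT_USER:$ARTIFACT_PASS" -o service.tar "https://artifacts.example.internal/$BITBUCKET_BUILD_NUMBER/service.tar"
      - ./publish.sh               # placeholder
```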

Please let me know if my explanation was clear.

Have a great day ahead!

Best Regards,

Atlassian Bitbucket Cloud Support

Hi Norbert, glad you joined the discussion!


I am happy that I managed to share my perspective with you. Thank you for confirming that there are neither advanced artifact options nor personal artifact storage.


Well, it seems that I will have to manage this on my own. But I'd like to note that this could be a very valuable option for Pipelines, since the lack of it currently limits the use of your product in controlled environments and in cases of limited bandwidth.


Thank you for your product, and I hope it will grow further!

Norbert C Atlassian Team Nov 24, 2021

Good Morning Alexey,

You're welcome, we're always happy to help :). We would also like to thank you for your compliment; we're constantly working to make Bitbucket Cloud a thriving product.

I agree with you that it would be a useful feature, thus I've opened the following feature request:

BCLOUD-21497 - Ability to prevent a step from downloading a certain artifacts

Our development team will give first-hand updates on that ticket if any progress is made, so I would suggest watching and voting for it.


Do note, however, that there's no ETA on enhancement requests, and all enhancements are implemented with this policy in mind: Implementation of New Features Policy

Have a wonderful week ahead!

Best Regards,
Atlassian Bitbucket Cloud Support

Thank you for the update, Norbert!


It would be great to have this feature implemented.
