
Dynamic Branch-Specific Pipeline Cache and Cache Size Limits

Hi community,

We are currently in the midst of a phase of heavy pipeline performance improvements. During this process we came across several issues regarding caches:

  • Caches are repository-global, not branch-specific. Once you update a cache, say with dependencies from a feature branch, you may introduce bugs and/or issues into production-ready code that uses the same cache(s) and pipeline(s).
  • There is no out-of-the-box cache update on dependency changes. We are aware of this proposed solution for automatically refreshing caches upon dependency changes, however it is still limited by point #1.

There is also this open ticket, dating all the way back to 2018 with engaged discussions up to last month, regarding cache refreshing.

In the discussion thread of said ticket, a fellow community member proposed a workaround: appending unique hash suffixes, derived from checksums of the files that need caching (in his case yarn lock files), to the cache names, thereby allowing individual caching per checksum.

Leaning on this solution, we implemented our own approach: we use the branch's name and some scripting to generate a new bitbucket-pipelines.yml on the fly at commit time, which lets us have branch-specific caches for pnpm and node_modules.

# bitbucket-pipelines template (cache definitions excerpt)

definitions:
  caches:
    pnpm-<branch-name>: $BITBUCKET_CLONE_DIR/.pnpm-store
    node-<branch-name>: node_modules
    # some-other-nested-node_modules-here

# <branch-name> will be replaced by a hash of the branch's name and an internal prefix
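The on-commit templating step described above can be sketched in a few lines of POSIX shell. This is only an illustrative sketch: the `bb-` prefix, the template filename, and the use of `sha1sum` are assumptions, not Bitbucket conventions.

```shell
#!/bin/sh
# Read the current branch (inside a running pipeline, $BITBUCKET_BRANCH could be used instead).
branch="$(git rev-parse --abbrev-ref HEAD)"

# Derive a short, YAML-safe cache suffix: an internal prefix plus a hash of the branch name.
suffix="bb-$(printf '%s' "$branch" | sha1sum | cut -c1-8)"

# Substitute <branch-name> in the template to produce the real pipelines file.
sed "s/<branch-name>/$suffix/g" bitbucket-pipelines.template.yml > bitbucket-pipelines.yml
```

Hashing rather than using the raw branch name avoids characters like `/` (from `feature/...` branches) that are not valid in cache names.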
During our tests this works as intended, and so far we have not faced any problems. However, we are generating about 0.5 GB of branch-specific caches, so we are now facing two lingering questions:
  • Is there a MAXIMUM cache size per repository, and if so, what is it?
  • Is there a way to dynamically clear ALL caches in a repository by script/pipeline run, rather than using the Caches popup and pressing the delete button for each cache?

Input is very welcome, thanks in advance!

Best regards

1 answer

Answer accepted

Hi @Cengiz Deniz, thanks for reaching out to the Atlassian Community!

Only caches under 1 GB once compressed are saved. We have a feature request to increase the cache limit.

More details about caching can be found in the Bitbucket Pipelines caching documentation.

You can use a combination of the list and delete cache API endpoints to list the caches and delete all of them from a pipeline.
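A minimal sketch of that list-then-delete loop, assuming curl and jq are available and using the Bitbucket Cloud REST API's pipelines-config/caches endpoints; the WORKSPACE, REPO_SLUG, BB_USER, and BB_APP_PASSWORD variables are placeholders you must supply:

```shell
#!/bin/sh
# Delete every pipeline cache in a repository via the Bitbucket Cloud REST API.
BASE="https://api.bitbucket.org/2.0/repositories/$WORKSPACE/$REPO_SLUG/pipelines-config/caches"

purge_caches() {
  # List the caches, extract their UUIDs, and issue one DELETE per cache.
  curl -s -u "$BB_USER:$BB_APP_PASSWORD" "$BASE" \
    | jq -r '.values[].uuid' \
    | while read -r uuid; do
        curl -s -X DELETE -u "$BB_USER:$BB_APP_PASSWORD" "$BASE/$uuid"
      done
}

# Only call out to the API when credentials are actually configured.
if [ -n "${BB_USER:-}" ]; then
  purge_caches
fi
```

Note that the list endpoint is paginated, so a production version should also follow the `next` links in the response; running this as a scheduled or manually triggered pipeline step would cover the "clear all caches by pipeline run" use case.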


Let me know if this helps.



Hi @Suhas Sundararaju 

regarding cache sizing: We are aware of the cache size limit applied during build teardown and compression. What we are interested in is whether there is a limit on the overall cache size per repository.

With the branch-specific caching we have now set up via our workaround, we would have about 0.5-0.7 GB of caches (17 in total, ranging from a few hundred kB up to 250 MB) per branch. However, multiple developers work on various branches, and various cache-using pipelines are run (e.g. for testing and before pull requests), so we would quickly accumulate multiples of these 0.5-0.7 GB per-branch cache sets. So is there any limit? There is nothing in the documentation (or we simply didn't find it).

As for the API based approach for deleting caches: thanks for that hint, we'll have a look at that and fiddle around a bit :)

Thanks and best regards

Hi @Cengiz Deniz 

There is no restriction on the overall cache size per repository; you can create any number of branch-specific node_modules caches. But only caches under 1 GB (compressed) are saved.

