Node Cache shouldn't persist when there is a change in dependency

In our Bitbucket pipeline, we are caching the node modules:

- node

We do this to avoid re-downloading the node modules when they are already present in the cache.
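For context, a minimal bitbucket-pipelines.yml using the predefined node cache looks roughly like this (the step name and build command are illustrative assumptions, not from the original config):

```yaml
pipelines:
  default:
    - step:
        name: Build
        caches:
          - node           # predefined cache: stores the node_modules directory
        script:
          - npm install    # skips downloads for packages already in node_modules
          - npm run build  # illustrative build command
```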

But when there is a change in a package.json dependency (e.g. a new npm package added and used in the components), we expect the pipeline to re-fetch the node modules instead of taking them from the cache.

What happens currently: if a node modules cache exists in the pipeline, it is used irrespective of the current state of package.json. Ideally, the node modules should be refreshed in reference to package.json.

So the pipeline fails. When we delete the node cache manually and re-run the same pipeline, it succeeds.


Thanks in advance for your help.

2 answers

Coming in late, but this is what we currently do:

- npm

npm: ~/.npm

This uses npm's internal cache, which will download dependencies if they have been updated since the last run.
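Spelled out as a full config, that approach would look roughly like this (a sketch, assuming npm ci as the install command):

```yaml
definitions:
  caches:
    npm: ~/.npm            # custom cache: npm's own cache dir, not node_modules

pipelines:
  default:
    - step:
        caches:
          - npm
        script:
          # npm ci rebuilds node_modules from package-lock.json on every run,
          # downloading only what ~/.npm doesn't already hold
          - npm ci
```

The design choice here is to cache the package tarballs rather than the installed tree, so a stale node_modules can never be served; installs stay fast because only genuinely new or updated packages hit the network.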


Hi Abhenavh,

I don't know much about node, but my understanding is that if you update a dependency in package.json (eg. add a new package that you haven't used before or update to the latest version of an existing package) then that new dependency won't be present in the cache (and so it will have to be downloaded). Are you saying that the cache is giving you the wrong version of a package (ie. with a different version number from the one you were expecting)?

The node cache in the Bitbucket pipeline is not getting updated. It remains the same even if we update the dependencies. So we have to manually delete the node cache and re-run the pipeline build to make it succeed.

Ok, I think I understand now. You're saying that the build works, but it needs to download the new dependencies every time because the cache doesn't get updated? That's expected behaviour. By design, the cache only gets updated automatically once per week. If you want to update it earlier, you need to manually delete it.

We don't currently support automatic cache updates because it's actually a bit tricky to do reliably: many build tools make minor updates to their caches on every build (such as changing timestamps) even though there are no changes to dependencies, so detecting when a cache has changed generically across all tools is not trivial.

But we do have an open (and highly voted) issue for improving the cache system:

I'd recommend you watch and vote on that issue.

Yes. Whenever there is a change in the "package.json" the node cache has to be updated automatically. If the node cache is not updating automatically, then we have to delete the node cache manually and run the build. Then the build will succeed.

I'm just curious to know these:

  1. Is there any way to delete the node cache automatically?
  2. Is it mandatory to use the node cache while building, or can we skip it and download the node modules every time we build?
  1. You could potentially use the API to delete the cache if you can detect the right conditions within your build. Note that you would need to inject credentials using repository variables in order to authenticate with the API.
  2. No, it's not mandatory. You can download the modules every time if you want.
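To illustrate point 1, a sketch of deleting caches from within a build, assuming credentials injected as secured repository variables; the workspace/repo names are hypothetical and the endpoint path is an assumption from memory, so verify it against the Bitbucket Cloud REST API documentation before relying on it:

```shell
# BB_USER / BB_APP_PASSWORD would be injected as secured repository variables.
WORKSPACE="${WORKSPACE:-my-workspace}"   # hypothetical workspace id
REPO_SLUG="${REPO_SLUG:-my-repo}"        # hypothetical repository slug
BASE="https://api.bitbucket.org/2.0/repositories/$WORKSPACE/$REPO_SLUG/pipelines-config/caches/"
echo "caches endpoint: $BASE"

# Only call the API when credentials are actually present
if [ -n "$BB_USER" ] && [ -n "$BB_APP_PASSWORD" ]; then
  # List caches; each entry in the response carries a uuid
  curl -s -u "$BB_USER:$BB_APP_PASSWORD" "$BASE"
  # Delete a specific cache by uuid (placeholder left unfilled on purpose):
  # curl -s -X DELETE -u "$BB_USER:$BB_APP_PASSWORD" "$BASE{cache_uuid}"
fi
```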

I'm still not certain I understand the problem you're seeing though. If you don't delete the cache, does the build fail? I don't understand why the build would fail. I would expect "npm install" to download the missing dependencies that aren't already in the cache.

The problem is that some node_modules binaries are only compatible with specific versions of node.  Running npm i on a cache from a different node version fails outright.

When we have different node versions targeted by the bitbucket-pipelines.yml file on different branches then only the branches with the same node version as the one the cache was initialized with will succeed.

Our workaround was to go through all our PRs and ensure the node version was consistent, then delete the cache and rebuild with the new targeted version. Luckily for us this works, as we don't change the node version very often.

However, I'm not sure this is a solution that will work for everyone.

It would be nice to tie the node cache type to the node version and create multiple caches for each combination.  This would avoid the problems altogether.
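As a sketch of that idea: since Bitbucket also allows custom cache definitions, one cache per node version could be declared manually today. The cache names, image tags, and branch patterns below are illustrative assumptions:

```yaml
definitions:
  caches:
    node14: node_modules   # hypothetical: cache for branches built on node 14
    node16: node_modules   # hypothetical: cache for branches built on node 16

pipelines:
  branches:
    main:
      - step:
          image: node:16
          caches:
            - node16       # must match the image's node version by convention
          script:
            - npm ci
    legacy/*:
      - step:
          image: node:14
          caches:
            - node14
          script:
            - npm ci
```

The pairing of cache name and image version is by convention only; nothing enforces it automatically, which is why tying the cache key to the node version natively would avoid the problem altogether.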

@Steven Vaccarella the major issue IMO is that the user cannot tell the pipeline when to USE the cache vs when to UPDATE it. I'd like to USE IT and UPDATE it in the same step, but currently it doesn't work like that. If there's a cache for node, for example, it can use it but any changes to it won't update the cache. And therefore we have no control over it.

With this distinction (use vs create/update) we could workaround the issue of updating the cache when there's a change in dependencies. Just a matter of hashing the package[-lock].json and comparing the hashes (the current and the cached one).
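That hash comparison could be sketched roughly like this (a hypothetical helper, not a built-in pipeline feature; the file paths are illustrative):

```shell
# Decide whether a cached node_modules is stale by comparing package-lock.json's
# hash with one recorded at install time.
deps_changed() {
  lockfile="$1"; hash_file="$2"
  current=$(sha256sum "$lockfile" | cut -d' ' -f1)
  if [ -f "$hash_file" ] && [ "$(cat "$hash_file")" = "$current" ]; then
    return 1   # unchanged: the cached node_modules is still valid
  fi
  printf '%s' "$current" > "$hash_file"   # record the hash for the next run
  return 0                                # changed: caller should reinstall
}

# Demo with a throwaway lockfile
tmp=$(mktemp -d)
echo '{"name":"demo"}' > "$tmp/package-lock.json"
deps_changed "$tmp/package-lock.json" "$tmp/.lock.sha256" && echo "first run: changed"
deps_changed "$tmp/package-lock.json" "$tmp/.lock.sha256" || echo "second run: unchanged"
```

In a real step, a "changed" result would trigger `rm -rf node_modules && npm ci` before the step's cache is saved.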
