
Node Cache shouldn't persist when there is a change in dependency

In our Bitbucket pipeline, we are caching the node modules.

caches:
  - node

We are doing this to make sure we don't download the node modules if they are already present in the cache.

But when there is a change in the package.json dependencies (e.g. a new npm package is added and used in the components), we expect the pipeline to re-fetch the node modules instead of taking them from the cache.

What happens currently: if a node modules cache exists in the pipeline, it is used regardless of the current state of package.json. Ideally, node_modules should be refreshed with reference to package.json.

So the pipeline fails. When we delete the node cache manually and re-run the same pipeline, it succeeds.

 

Thanks in advance for your help.

4 answers

Coming in late, but this is what we currently do:

pipelines:
  branches:
    staging:
      - step:
          caches:
            - npm

definitions:
  caches:
    npm: ~/.npm

This uses npm's internal cache directory (rather than caching node_modules directly), so npm will still download any dependencies that have been added or updated since the last run.

I can't see any progress on this issue. This is a system-breaking issue: if a dependency is updated, Bitbucket uses the cache instead of updating all dependencies, which results in a pipeline failure.

@Abhenavh Prakash @Jardel Weyrich @Steven Vaccarella @Mike Hesler 

Are there any updates on this issue? Have you found any third-party pipes or other workarounds that handle it for you?


Hi Abhenavh,

I don't know much about node, but my understanding is that if you update a dependency in package.json (eg. add a new package that you haven't used before or update to the latest version of an existing package) then that new dependency won't be present in the cache (and so it will have to be downloaded). Are you saying that the cache is giving you the wrong version of a package (ie. with a different version number from the one you were expecting)?

The node cache in the Bitbucket pipeline is not getting updated. It remains the same even if we update the dependencies, so we have to manually delete the node cache and re-run the pipeline build to make it succeed.

Ok, I think I understand now. You're saying that the build works but it needs to download the new dependencies every time because the cache doesn't get updated? That's expected behaviour. By design the cache only gets updated automatically once per week. If you want to update it earlier then you need to manually delete it.

We don't currently support automatic cache updates because it's actually a bit tricky to do reliably: many build tools make minor updates to their caches on every build (such as changing timestamps) even when there are no changes to dependencies, so detecting when a cache has meaningfully changed, generically across all tools, is not trivial.

But we do have an open (and highly voted) issue for improving the cache system: https://bitbucket.org/site/master/issues/16314/refresh-caches-when-dependencies-are

I'd recommend you watch and vote on that issue.

Yes. Whenever there is a change in "package.json", the node cache has to be updated automatically. Since it isn't, we have to delete the node cache manually and run the build; then the build succeeds.

I'm just curious to know these:

  1. Is there any way to delete the node cache automatically?
  2. Is it mandatory to use the node cache while building, or can we skip it and download the node modules every time we build?
  1. You could potentially use the API to delete the cache if you can detect the right conditions within your build. Note that you would need to inject credentials using repository variables in order to authenticate with the API.
  2. No, it's not mandatory. You can download the modules every time if you want.

I'm still not certain I understand the problem you're seeing though. If you don't delete the cache, does the build fail? I don't understand why the build would fail. I would expect "npm install" to download the missing dependencies that aren't already in the cache.
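For the API route mentioned in point 1, a rough sketch of listing and deleting pipeline caches with curl. The endpoint path follows the Bitbucket 2.0 REST API as I understand it, and the workspace/repo names and credential variables are placeholders; verify against the current API docs before relying on it:

```shell
#!/bin/sh
# Build the Pipelines cache endpoint URL.
# $1 = workspace, $2 = repo slug, $3 = cache UUID (empty string to list all)
cache_api_url() {
  echo "https://api.bitbucket.org/2.0/repositories/$1/$2/pipelines-config/caches/$3"
}

# List caches for a repo (BB_USER / BB_APP_PASSWORD would be injected
# as secured repository variables, as suggested above):
#   curl -s -u "$BB_USER:$BB_APP_PASSWORD" "$(cache_api_url myworkspace myrepo '')"
#
# Delete a specific cache by its UUID:
#   curl -s -X DELETE -u "$BB_USER:$BB_APP_PASSWORD" \
#     "$(cache_api_url myworkspace myrepo "$CACHE_UUID")"
```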

The problem is that some node_modules binaries are only compatible with specific versions of Node. Running npm i on a cache created under a different Node version fails outright.

When we have different node versions targeted by the bitbucket-pipelines.yml file on different branches then only the branches with the same node version as the one the cache was initialized with will succeed.

Our workaround was to go through all our PRs and ensure the Node version was consistent, then delete the cache and rebuild with the new targeted version. Luckily this works for us because we don't change the Node version very often.

However, I'm not sure this is a solution that will work for everyone.

It would be nice to tie the node cache type to the node version and create multiple caches for each combination.  This would avoid the problems altogether.
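One way to get per-combination caches along these lines is the custom cache key feature Bitbucket Pipelines has since added, which rebuilds the cache whenever the listed files change. A sketch, assuming the key/files syntax from the current Pipelines docs and that the repository tracks its Node version in an .nvmrc file (both worth verifying for your setup):

```yaml
definitions:
  caches:
    node-modules:
      key:
        files:
          - package-lock.json
          - .nvmrc
      path: node_modules
```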

@Steven Vaccarella the major issue IMO is that the user cannot tell the pipeline when to USE the cache vs when to UPDATE it. I'd like to use it and update it in the same step, but currently it doesn't work like that. If there's a cache for node, for example, the pipeline will use it, but any changes made during the build won't be written back to the cache. So we have no control over it.

With this distinction (use vs. create/update) we could work around the issue of updating the cache when dependencies change: just hash package[-lock].json and compare the hashes (the current one and the cached one).
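That hash-and-compare step could be sketched in plain shell. Nothing here is a Bitbucket feature; the stamp file name and helper names are purely illustrative:

```shell
#!/bin/sh
# Returns 0 (true) if the lockfile's hash differs from the recorded one,
# i.e. dependencies changed since the cache was last populated.
deps_changed() {
  lockfile=$1; stamp=$2
  current=$(sha256sum "$lockfile" | cut -d' ' -f1)
  cached=$(cat "$stamp" 2>/dev/null || echo none)
  [ "$current" != "$cached" ]
}

# Record the current lockfile hash after a successful install.
record_deps() {
  sha256sum "$1" | cut -d' ' -f1 > "$2"
}

# Typical use in a build step:
#   if deps_changed package-lock.json .lockhash; then
#     npm ci && record_deps package-lock.json .lockhash
#   fi
```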
