
Trying to understand pipelines caching

I'd really like to use pipelines caching, but I must admit to being confused as to how it works. I have a project I'm testing it on using the following pipelines script. The project uses both composer and node and has both composer.json and package.json files.


```yaml
- step:
    caches:
      - composer
    script:
      - composer install --no-dev
      - composer run-script build
      - zip -FSr ${BITBUCKET_REPO_SLUG} ./ -x@exclude.lst
```

The composer `build` script runs yarn install and then gulp. However, when I run this project through pipelines, it reports back that `Cache "composer": Not found`. What am I doing wrong here? Or am I simply not understanding how this is supposed to work?

FWIW I also tried caching node and that didn't work either.
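For reference, a minimal sketch of the declaration with both predefined caches enabled (`composer` and `node` are Bitbucket's built-in cache names; the build commands are the ones from the question):

```yaml
pipelines:
  default:
    - step:
        caches:
          - composer
          - node
        script:
          - composer install --no-dev
          - composer run-script build   # runs yarn install and then gulp
          - zip -FSr ${BITBUCKET_REPO_SLUG} ./ -x@exclude.lst
```

Note that the built-in `node` cache only covers `node_modules` in the repository root; if dependencies are installed elsewhere, a custom cache path is needed.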

3 answers

1 accepted

3 votes
Answer accepted
devpartisan Atlassian Team Sep 01, 2017

That's pretty much the way it should work. The first time you build with a new cache setting, it's going to say "not found". When you rerun the build, you should see the following in build setup:

Cache "node": Downloading
Cache "node": Extracting
Cache "node": Loaded

Then the cache has been primed for subsequent builds. Remember, the cache is dropped after 1 week.

I don't have a handy PHP project for testing so please do let us know if there's something wrong with the composer one.

I'm following the docs, but my composer cache is always empty. Do I need to create `~/.composer/cache` directory myself?

I have the same issue; the composer dir is always empty.

I never seem to get "extracting" or "loaded"

I rerun the pipeline and I get 

Cache "node": Downloading
Cache "node": Not found


Cache "composer": Downloading
Cache "composer": Not found

It never seems to load them, is there a step missing here?

please do let us know if there's something wrong with the composer one

Yes, there is something wrong: it does not work, as reported just above.

Best regards.

In my case, the problem was that the COMPOSER_HOME env variable was defined. I removed that variable and my cache is now working.

I don't have any environment variables defined myself. There may be environment variables defined in the Docker image I am using, however...

Sorry if I was not clear: I was talking just about an env var defined in my docker image.

@Massimiliano Arione I have tried to use the default docker image, php:7.1.1, and php:7.1-cli.  All of them seem to have the same effect.

How did you unset the COMPOSER_HOME env variable?

Perhaps I need to use a custom docker image in order to do that?

We are also having trouble getting the Composer cache to work; it is always empty. We tried with a custom cache directory and still nothing!

Cache "composer": Downloading
Cache "composer": Not found

We are using `image: composer:2.0`.

How can we unset the COMPOSER_HOME env variable?

UPDATE: do not use this image. Changing to

image: pyguerder/bitbucket-pipelines-php74

fixed this for us. 
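If switching images isn't an option, an alternative (an untested sketch) is to sidestep COMPOSER_HOME entirely by pointing Composer's cache at a custom Pipelines cache. `COMPOSER_CACHE_DIR` is Composer's standard override for its cache location, and custom caches under `definitions` are a documented Pipelines feature; the name `composer-cache` and the `/tmp` path here are arbitrary choices:

```yaml
definitions:
  caches:
    composer-cache: /tmp/composer-cache   # custom cache, independent of COMPOSER_HOME

pipelines:
  default:
    - step:
        image: composer:2.0
        caches:
          - composer-cache
        script:
          - export COMPOSER_CACHE_DIR=/tmp/composer-cache
          - composer install --no-dev
```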

I've got the same problem: whenever I run the build, it never extracts anything and just keeps saying that it cannot download or find the cache, and the cache directory remains empty.

Any solutions yet?

For me the problems were:

The node_modules folder wasn't in the root directory, so the standard cache definition did not work. This was solved by assigning a custom path for the node_modules cache.

The Docker image I was using had the COMPOSER_HOME env variable set, which, as stated in an answer above, broke the caching function of Pipelines. I removed it and created a custom image without that variable.

Now it all works perfectly!

Can you share the minimal directory structure and the cache definition? I am having trouble setting it up
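A minimal sketch of such a setup, assuming a hypothetical layout where the front end lives in a `frontend/` subdirectory (directory and cache names are illustrative):

```yaml
definitions:
  caches:
    frontend-node: frontend/node_modules   # custom path; the built-in "node" cache only covers ./node_modules

pipelines:
  default:
    - step:
        caches:
          - frontend-node
        script:
          - cd frontend
          - npm install
```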

Can we use the cache to reuse files in another step? For example:

```yaml
- step:
    name: abc
    image: node:8.11
    script:
      - npm install
      - tar zcvf abc.tar.gz *
- step:
    name: def
    image: atlassian/pipelines-awscli
    script:
      - aws s3 sync --delete abc.tar.gz s3://def/abc.tar.gz
```

@Money Dinh

Did you find a solution for this?

That would be interesting for me, too:

step one runs composer install, step two runs PHPUnit, step three runs Behat tests.

I think the `artifacts` option is a solution here.

`artifacts` is better suited to passing files between steps: it is always populated with the latest version.

As far as I know cache is only populated when it's empty.

Yes, I'm OK with artifacts at the moment.
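Combining the earlier two-step example with the `artifacts` suggestion, a sketch of how the archive could be passed between steps (step names and the bucket path come from the question above; `aws s3 cp` is used instead of `sync`, since `sync` operates on directories rather than single files):

```yaml
pipelines:
  default:
    - step:
        name: abc
        image: node:8.11
        script:
          - npm install
          - tar zcvf abc.tar.gz *
        artifacts:
          - abc.tar.gz          # made available to subsequent steps
    - step:
        name: def
        image: atlassian/pipelines-awscli
        script:
          - aws s3 cp abc.tar.gz s3://def/abc.tar.gz
```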
