Trying to understand pipelines caching

aberkow July 19, 2017

I'd really like to use pipelines caching, but I must admit to being confused as to how it works. I have a project I'm testing it on using the following pipelines script. The project uses both composer and node and has both composer.json and package.json files.

 

pipelines:
  branches:
    feature/cache-test:
      - step:
          caches:
            - composer
          script:
            - composer install --no-dev
            - composer run-script build
            - zip -FSr ${BITBUCKET_REPO_SLUG}-cache-test.zip ./ -x@exclude.lst
            - curl -u ${BB_AUTH_STRING} -X POST "https://api.bitbucket.org/2.0/repositories/${BITBUCKET_REPO_OWNER}/${BITBUCKET_REPO_SLUG}/downloads" --form files=@"${BITBUCKET_REPO_SLUG}-cache-test.zip"

The composer `build` script runs yarn install and then gulp. However, when I run this project through pipelines, it reports back that `Cache "composer": Not found`. What am I doing wrong here? Or am I simply not understanding how this is supposed to work?

FWIW I also tried caching node and that didn't work either.

3 answers

1 accepted

3 votes
Answer accepted
devpartisan
Atlassian Team
September 1, 2017

That's pretty much the way it should work. The first time you build with a new cache setting, it's going to say "not found". When you rerun the build, you should see the following in build setup:

Cache "node": Downloading
Cache "node": Extracting
Cache "node": Loaded

Then the cache has been primed for subsequent builds. Remember, the cache is dropped after 1 week.

I don't have a handy PHP project for testing so please do let us know if there's something wrong with the composer one.
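A quick way to check why a cache stays empty is to print the directory Composer is actually writing into and compare it with the path Pipelines archives. A minimal sketch, assuming the predefined composer cache maps to ~/.composer/cache; the branch name and the custom cache key composer-dir are illustrative, not from this thread:

definitions:
  caches:
    composer-dir: ~/.composer/cache      # explicit path, as a fallback to the predefined "composer" cache
pipelines:
  branches:
    feature/cache-test:
      - step:
          caches:
            - composer-dir
          script:
            - echo "COMPOSER_HOME=$COMPOSER_HOME"            # if this is set, Composer may be caching elsewhere
            - composer config --global --list | grep cache   # shows the cache-dir Composer actually resolves
            - composer install --no-dev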

Massimiliano Arione September 21, 2017

I'm following the docs, but my composer cache is always empty. Do I need to create `~/.composer/cache` directory myself?

Mike Hancoski October 9, 2017

I have the same issue: the composer dir is always empty.

I never seem to get "Extracting" or "Loaded".

I rerun the pipeline and I get

Cache "node": Downloading
Cache "node": Not found

or

Cache "composer": Downloading
Cache "composer": Not found

It never seems to load them. Is there a step missing here?

Peter17 October 17, 2017

please do let us know if there's something wrong with the composer one

Yes, there is something wrong: it does not work, as reported just above.

Best regards.

Massimiliano Arione October 17, 2017

In my case, the problem was that the COMPOSER_HOME env variable was defined. I removed that variable and my cache is now working.

Peter17 October 17, 2017

I don't have any environment variable defined (in bitbucket.org/XXX/YYY/admin/addon/admin/pipelines/repository-variables).

There may be an environment variable defined in the Docker image I am using, however...

Massimiliano Arione October 17, 2017

Sorry if I was not clear: I was talking just about an env var defined in my docker image.

Mike Hancoski October 17, 2017

@Massimiliano Arione I have tried to use the default docker image, php:7.1.1, and php:7.1-cli.  All of them seem to have the same effect.

How did you unset the COMPOSER_HOME env variable?

Perhaps I need to use a custom docker image in order to do that?

twobyte June 22, 2021

We are also having trouble getting the Composer cache to work; it is always empty. We tried with a custom cache directory and still no luck!

Cache "composer": DownloadingCache "composer": Not found

We are using image 

image: composer:2.0

How can we unset the COMPOSER_HOME env variable?

UPDATE: do not use this image. Changing to

image: pyguerder/bitbucket-pipelines-php74

fixed this for us. 
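To summarise the workaround that emerges in the comments above, here is a sketch of re-pointing COMPOSER_HOME from within the step itself rather than building a custom image. The path assumes the predefined composer cache archives ~/.composer/cache, so verify it against your image:

pipelines:
  default:
    - step:
        caches:
          - composer
        script:
          # Override whatever COMPOSER_HOME the image ships with so Composer
          # writes its cache under ~/.composer (the assumed location the
          # predefined composer cache picks up).
          - export COMPOSER_HOME="$HOME/.composer"
          - composer install --no-dev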

3 votes
Arjan Oskam January 15, 2018

I've got the same problem: whenever I run the build, it never extracts anything; it just keeps saying that it cannot download or find the cache, and the cache directory remains empty.

Any solutions yet?

Arjan Oskam February 15, 2018

For me the problems were:

Node:
The node_modules folder wasn't in the root directory, so the standard cache definition did not work. The problem was solved by assigning a custom path for the node_modules cache (see the sketch after this comment).

Composer:
The Docker image that I was using had a Composer ENV variable in it, "COMPOSER_HOME", as stated in a comment above, which broke the caching function of Pipelines. I removed it and created a custom image without that variable.

Now it all works perfectly!
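A minimal sketch of the custom-path approach for Node, assuming a hypothetical layout where package.json lives in a frontend/ subdirectory rather than the repository root:

definitions:
  caches:
    frontend-node: frontend/node_modules   # custom path, since the predefined "node" cache expects ./node_modules
pipelines:
  default:
    - step:
        caches:
          - frontend-node
        script:
          - cd frontend && npm install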

Yash Lotan April 14, 2021

Can you share the minimal directory structure and the cache definition? I am having trouble setting it up

0 votes
Money Dinh May 23, 2018

Can we use a cache to reuse files in another step? For example:

- step:
    name: abc
    image: node:8.11
    script:
      - npm install
      - tar zcvf abc.tar.gz *
- step:
    name: def
    image: atlassian/pipelines-awscli
    script:
      - aws s3 sync --delete abc.tar.gz s3://def/abc.tar.gz


MarwanSdeek July 31, 2018

@Money Dinh

Did you find a solution for this?

Oskar Stark August 22, 2018

That would be interesting for me, too.

Step one: composer install, step two: phpunit, step three: behat tests.

ps September 3, 2018

I think the `artifacts` option is a solution here.

`artifacts` is better suited to passing files between steps: it is always populated with the latest version.

As far as I know, a cache is only populated when it's empty.
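A sketch of the artifacts approach applied to the example above; the step names, bucket and archive are taken from the question, and `aws s3 cp` stands in for `sync` since the archive is a single file:

pipelines:
  default:
    - step:
        name: abc
        image: node:8.11
        script:
          - npm install
          - tar zcvf abc.tar.gz *
        artifacts:
          - abc.tar.gz                     # carried over to the following step
    - step:
        name: def
        image: atlassian/pipelines-awscli
        script:
          - aws s3 cp abc.tar.gz s3://def/abc.tar.gz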

Money Dinh September 3, 2018

Yes, I'm ok with artifact atm.
