Trying to understand pipelines caching

I'd really like to use pipelines caching, but I must admit to being confused as to how it works. I have a project I'm testing it on using the following pipelines script. The project uses both composer and node and has both composer.json and package.json files.

 

```yaml
pipelines:
  branches:
    feature/cache-test:
      - step:
          caches:
            - composer
          script:
            - composer install --no-dev
            - composer run-script build
            - zip -FSr ${BITBUCKET_REPO_SLUG}-cache-test.zip ./ -x@exclude.lst
            - curl -u ${BB_AUTH_STRING} -X POST "https://api.bitbucket.org/2.0/repositories/${BITBUCKET_REPO_OWNER}/${BITBUCKET_REPO_SLUG}/downloads" --form files=@"${BITBUCKET_REPO_SLUG}-cache-test.zip"
```

The composer `build` script runs yarn install and then gulp. However, when I run this project through pipelines, it reports back that `Cache "composer": Not found`. What am I doing wrong here? Or am I simply not understanding how this is supposed to work?

FWIW I also tried caching node and that didn't work either.
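For context, multiple predefined caches can be listed on the same step. A minimal sketch of what that looks like (branch name and scripts are carried over from the question above):

```yaml
pipelines:
  branches:
    feature/cache-test:
      - step:
          caches:
            - composer
            - node
          script:
            - composer install --no-dev
            - composer run-script build
```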

3 answers

1 accepted

3 votes
Answer accepted
devpartisan Atlassian Team Sep 01, 2017

That's pretty much the way it should work. The first time you build with a new cache setting, it's going to say "not found". When you rerun the build, you should see the following in build setup:

Cache "node": Downloading
Cache "node": Extracting
Cache "node": Loaded

Then the cache has been primed for subsequent builds. Remember, the cache is dropped after 1 week.

I don't have a handy PHP project for testing so please do let us know if there's something wrong with the composer one.

I'm following the docs, but my composer cache is always empty. Do I need to create `~/.composer/cache` directory myself?

I have the same issue; the composer dir is always empty.

I never seem to get "extracting" or "loaded"

I rerun the pipeline and I get

Cache "node": Downloading
Cache "node": Not found

or

Cache "composer": Downloading
Cache "composer": Not found

It never seems to load them, is there a step missing here?

> please do let us know if there's something wrong with the composer one

Yes, there is something wrong: it does not work, as reported just above.

Best regards.

In my case, the problem was a COMPOSER_HOME env variable being defined. I removed that variable and my cache is now working.

I don't have any environment variable defined (in bitbucket.org/XXX/YYY/admin/addon/admin/pipelines/repository-variables).

There may be environment variable defined in the Docker image I am using however...

Sorry if I was not clear: I was talking just about an env var defined in my docker image.

@Massimiliano Arione I have tried to use the default docker image, php:7.1.1, and php:7.1-cli.  All of them seem to have the same effect.

How did you unset the COMPOSER_HOME env variable?

Perhaps I need to use a custom docker image in order to do that?

I have the same problem: whenever I run the build, it never extracts anything; it just keeps saying that it cannot download or find the cache, and the cache directory remains empty.

Any solutions yet?

For me the problems were:

Node:
The node_modules folder wasn't in the root directory, so the standard `node` cache definition did not work. The problem was solved by defining a custom cache with the path to the node_modules folder.

Composer:
The Docker image I was using had a COMPOSER_HOME env variable set in it (as mentioned in a comment above), which broke the caching function of Pipelines. I removed it and created a custom image without that variable.

Now it all works perfectly!
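A minimal sketch of such a custom image (the base image tag here is illustrative; note that Dockerfiles cannot truly unset an inherited ENV, so setting it to an empty string is the closest equivalent):

```dockerfile
# Hypothetical custom image based on the PHP image you were already using
FROM php:7.1-cli

# Override the COMPOSER_HOME variable the base image may define,
# so Pipelines' predefined composer cache path applies again
ENV COMPOSER_HOME=
```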

Can you share the minimal directory structure and the cache definition? I am having trouble setting it up
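A minimal sketch of a custom cache definition, assuming `node_modules` lives in a `frontend/` subdirectory (the directory and cache names are illustrative):

```yaml
definitions:
  caches:
    frontend-node: frontend/node_modules

pipelines:
  default:
    - step:
        caches:
          - frontend-node
        script:
          - cd frontend && npm install
```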

Can we use a cache to reuse files in another step? For example:

```yaml
- step:
    name: abc
    image: node:8.11
    script:
      - npm install
      - tar zcvf abc.tar.gz *
- step:
    name: def
    image: atlassian/pipelines-awscli
    script:
      - aws s3 sync --delete abc.tar.gz s3://def/abc.tar.gz
```


@Money Dinh

Did you find a solution for this?

That would be interesting for me, too: step one runs composer install, step two phpunit, step three Behat tests.

I think the `artifacts` option is a solution here.

`artifacts` is better suited to passing files between steps: it's always populated with the latest version.

As far as I know, a cache is only populated when it's empty.
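For the two-step example above, a sketch of passing the archive between steps with `artifacts` (step names and paths are carried over from the question; the S3 target is illustrative, and `aws s3 cp` is used since a single file is being uploaded):

```yaml
- step:
    name: abc
    image: node:8.11
    script:
      - npm install
      - tar zcvf abc.tar.gz *
    artifacts:
      - abc.tar.gz
- step:
    name: def
    image: atlassian/pipelines-awscli
    script:
      - aws s3 cp abc.tar.gz s3://def/abc.tar.gz
```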

Yes, I'm OK with artifacts at the moment.
