Hi, I'm having trouble reusing artifacts.
Basically, my bitbucket-pipelines.yml looks similar to this:
pipelines:
  custom:
    build:
      - step:
          name: Build # this is the step which builds the artifact
          ... blabla
          artifacts:
            - dist/**
    deploy:
      - step:
          name: Deploy
          deployment: production
          script:
            - ls -al dist # this is where the artifact should get used
Now, obviously, that ls command is just to test things out, but it errors out saying dist is not found. Also, if I run ls -al I just get my regular repo structure, with no dist in sight.
Is there a way to achieve this in separate pipelines, or would I have to make the deploy step part of the first pipeline to effectively use artifacts?
If I understand correctly, you are defining two pipelines ("build" and "deploy") with one step in each, and sharing artifacts through the file system is only possible between steps in the same pipeline.
(Could you test whether moving the two steps into the same pipeline works?)
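Something along these lines (just a minimal sketch; the npm command is only a placeholder for whatever actually produces dist/):

pipelines:
  custom:
    build-and-deploy:
      - step:
          name: Build
          script:
            - npm run build # placeholder for your real build command
          artifacts:
            - dist/**
      - step:
          name: Deploy
          deployment: production
          script:
            - ls -al dist # the artifact from the previous step is available here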
It is still possible to share an artifact between two pipelines, though. You could, for example, create a Bitbucket download from the "build" step's result and access that download from "deploy". You can then periodically clean up downloads that are older than N days. (We do something similar in our rather large integration test suite.)
Thank you @Aron Gombas [Midori], I will try this. Could you maybe share a snippet from your yml, just so I don't waste another half a day figuring out all the required parameters?
We post the file to the standard Bitbucket Cloud REST API with all variables and secrets managed in Bitbucket repo variables.
It looks like this in our yml:
- curl -X POST --user "${BB_AUTH_STRING}" "https://api.bitbucket.org/2.0/repositories/${BITBUCKET_REPO_OWNER}/${BITBUCKET_REPO_SLUG}/downloads" --form files=@"$BITBUCKET_CLONE_DIR/${BITBUCKET_BUILD_NUMBER}-results.zip"
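The zip itself is created earlier in the same script, roughly like this (the dist folder name is only an example here, and it assumes the zip tool is available in your build image):

- zip -r "${BITBUCKET_CLONE_DIR}/${BITBUCKET_BUILD_NUMBER}-results.zip" dist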
Ok, cool. And how does the download part work?
Btw, is BITBUCKET_REPO_OWNER a global variable? I couldn't find it on: https://support.atlassian.com/bitbucket-cloud/docs/variables-and-secrets/
Download: we don't use that; we only need to download things manually from time to time. But I would guess it is just a GET request to the same endpoint.
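If you wanted to fetch it from the "deploy" pipeline, I'd expect something like this to work (untested on our side; -L is needed because the endpoint redirects to the actual file, and the file name is only a placeholder, since the deploy pipeline has its own build number):

- curl -L --user "${BB_AUTH_STRING}" -o results.zip "https://api.bitbucket.org/2.0/repositories/${BITBUCKET_REPO_OWNER}/${BITBUCKET_REPO_SLUG}/downloads/123-results.zip"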
BITBUCKET_REPO_OWNER: I can't recall, but I checked now and it is not listed among our variables, so it should be something populated by default. Why don't you "echo" it and see if that works?
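For example, just add this to any step's script and check the build log (purely a quick diagnostic):

- echo "BITBUCKET_REPO_OWNER is: ${BITBUCKET_REPO_OWNER}"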
In addition to @Aron Gombas [Midori]'s answer, you can also upload your artifacts to standard repositories, for example Docker Hub for Docker images, and then use the standard commands to pull/install your artifacts: "docker pull", "npm install", "pip install", etc.
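A rough sketch of that approach with Docker Hub could look like this (DOCKERHUB_USER and DOCKERHUB_PASSWORD are assumed to be repository variables you define yourself, the image name "myapp" is made up, and a fixed tag such as "latest" avoids having to pass the build number between the two pipelines):

pipelines:
  custom:
    build:
      - step:
          name: Build and push image
          services:
            - docker
          script:
            - docker build -t "${DOCKERHUB_USER}/myapp:latest" .
            - echo "${DOCKERHUB_PASSWORD}" | docker login -u "${DOCKERHUB_USER}" --password-stdin
            - docker push "${DOCKERHUB_USER}/myapp:latest"
    deploy:
      - step:
          name: Deploy
          deployment: production
          services:
            - docker
          script:
            - docker pull "${DOCKERHUB_USER}/myapp:latest"
            # deploy the pulled image with your usual tooling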
Thank you for the addition. I think I will go with the Bitbucket Downloads solution, although at this point all options are still on the table.
Cheers!