The following script worked perfectly fine until a few days ago, when it suddenly started executing only the most recent anchored script.
pipelines:
  custom:
    my-custom-pipeline:
      - step:
          script:
            - echo "FIRST"
          script:
            - echo "SECOND"
In the above example, when my-custom-pipeline is run, only "SECOND" is echoed. If I remove the second script key, then "FIRST" is echoed.
This is a fairly new issue, as my pipeline of multiple scripts (anchored scripts, in fact) used to run flawlessly. Did someone put a bug in the YAML parser without documenting it? ;)
Hi Mostafa,
I'm not aware of any recent change to the parsing logic that would explain this change in behaviour. However I can say that we never intended to support duplicate map keys in a deterministic way, so I'd highly recommend updating your pipeline definitions to avoid duplicate keys.
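For the example at the top of this thread, a minimal de-duplicated version (a sketch of Steven's advice, not quoted from his reply) would fold both commands into a single script key:

pipelines:
  custom:
    my-custom-pipeline:
      - step:
          script:
            - echo "FIRST"
            - echo "SECOND"

With a single script key there is no duplicate-key ambiguity, and both commands run in order.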
Kind Regards,
Steven Vaccarella
Thanks Steven. What about supporting multiple anchored scripts like below?
definitions:
  script: &My-Script-A
    - ...
  script: &My-Script-B
    - ...
pipelines:
  custom:
    my-custom-pipeline:
      - step:
          script: *My-Script-A
          script: *My-Script-B
The above pipeline used to run just fine, but now only the last script gets executed.
If this is non-deterministic is there a way to make the keys unique and achieve the same result?
To my knowledge there is no robust way to do what you're trying to do directly in the YAML file.
Our recommendation for reusing scripts is to move them into separate script files in your repo and then invoke the required scripts from a single "script" section in the YAML, i.e.:
pipelines:
  custom:
    my-custom-pipeline:
      - step:
          script:
            - 'pipeline-scripts/my-script-a.sh'
            - 'pipeline-scripts/my-script-b.sh'
Posting this here so that the next person can see it directly. The above didn't work for me as written; I had to use the following:
pipelines:
  default:
    - step:
        script:
          - ./pipeline-scripts/common-build-test.sh
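One likely reason the bare-path form fails is that the script file is not marked executable in the repository (an assumption here, not confirmed in the thread). A common workaround is to set the executable bit in the step before invoking the script:

pipelines:
  default:
    - step:
        script:
          # chmod is needed only if the executable bit was not committed to the repo
          - chmod +x ./pipeline-scripts/common-build-test.sh
          - ./pipeline-scripts/common-build-test.sh

Alternatively, committing the file with its executable bit set (git update-index --chmod=+x) avoids the extra step.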
Another solution more closely resembling the initial request for anchored scripts would be the following:
definitions:
  .install-gcloud-sdk:
    - &install-gcloud-sdk |
        wget https://dl.google.com/dl/cloudsdk/release/google-cloud-sdk.tar.gz
        tar zxvf google-cloud-sdk.tar.gz && ./google-cloud-sdk/install.sh --quiet --usage-reporting=false --path-update=true
        PATH="google-cloud-sdk/bin:${PATH}"
        gcloud --quiet components update
pipelines:
  default:
    - step:
        name: Do Something with the Google Cloud SDK
        script:
          - *install-gcloud-sdk
          - ...the rest of your script
That will run all the commands in the `install-gcloud-sdk` snippet wherever you reference it. You can define as many snippets as you want and use them anywhere throughout your pipelines.
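For example, two snippets (names hypothetical, purely illustrative) could be defined side by side and mixed freely across steps:

definitions:
  .snippets:
    - &install-deps |
        apt-get update
        apt-get install -y curl
    - &run-tests |
        ./run-tests.sh
pipelines:
  default:
    - step:
        name: Build and test
        script:
          - *install-deps
          - *run-tests

Because each snippet has its own anchor name, there are no duplicate map keys, so the parser's behaviour is well defined.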