Pipeline with multiple scripts runs only the last one (broken)

A360 Bitbucket Admin November 28, 2019

The following configuration used to work perfectly fine until a few days ago, when it suddenly started executing only the most recently defined script.

pipelines:
  custom:
    my-custom-pipeline:
      - step:
          script:
            - echo "FIRST"
          script:
            - echo "SECOND"

In the above example, when my-custom-pipeline is run, only "SECOND" is echoed. If I remove the second script, then "FIRST" is echoed.

This is a fairly new issue as my pipeline of multiple scripts (anchored scripts, in fact) used to run flawlessly. Did someone put a bug in the YAML parser without documenting it? ;)

1 answer

Answer accepted
Steven Vaccarella
Atlassian Team
November 28, 2019

Hi Mostafa,

I'm not aware of any recent change to the parsing logic that would explain this change in behaviour. However, I can say that we never intended to support duplicate map keys in a deterministic way, so I'd highly recommend updating your pipeline definitions to avoid duplicate keys.
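
For example (a sketch based on the echo commands in your question), the two commands could live under a single script key:

pipelines:
  custom:
    my-custom-pipeline:
      - step:
          script:
            - echo "FIRST"
            - echo "SECOND"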

Kind Regards,
Steven Vaccarella

A360 Bitbucket Admin November 28, 2019

Thanks Steven. What about supporting multiple anchored scripts like below?

definitions:
  script: &My-Script-A
    - ...
  script: &My-Script-B
    - ...
pipelines:
  custom:
    my-custom-pipeline:
      - step:
          script: *My-Script-A
          script: *My-Script-B

The above pipeline used to run just fine, but now only the last script gets executed.

If this is non-deterministic is there a way to make the keys unique and achieve the same result? 

Steven Vaccarella
Atlassian Team
November 28, 2019

To my knowledge, there is no robust way to do what you're trying to do directly in the yaml file.

Our recommendation for reusing scripts is to move them to separate script files in your repo and then invoke the required scripts from a single "script" section in the yaml.

i.e.:

pipelines:
  custom:
    my-custom-pipeline:
      - step:
          script:
            - 'pipeline-scripts/my-script-a.sh'
            - 'pipeline-scripts/my-script-b.sh'
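
Note that each referenced file has to exist in the repo and be executable (otherwise invoke it via bash). A minimal, hypothetical pipeline-scripts/my-script-a.sh (filename taken from the example above) could be as simple as:

#!/bin/bash
set -e          # exit on the first failing command so the step fails
echo "FIRST"    # the command from the original step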

 

[OLD PREKARILABS] Timothy Blumberg December 16, 2019

Posting this here so that the next person can see it. The above didn't work for me directly; I had to use the following:

pipelines:
  default:
    - step:
        script:
          - ./pipeline-scripts/common-build-test.sh
Jake Middleton December 30, 2020

Another solution more closely resembling the initial request for anchored scripts would be the following:

definitions:
  .install-gcloud-sdk:
    - &install-gcloud-sdk |
      wget https://dl.google.com/dl/cloudsdk/release/google-cloud-sdk.tar.gz
      tar zxvf google-cloud-sdk.tar.gz && ./google-cloud-sdk/install.sh --quiet --usage-reporting=false --path-update=true
      PATH="google-cloud-sdk/bin:${PATH}"
      gcloud --quiet components update

pipelines:
  default:
    - step:
        name: Do Something with the Google Cloud SDK
        script:
          - *install-gcloud-sdk
          - ...the rest of your script

That will run all the commands in the `install-gcloud-sdk` snippet wherever you reference it. You can define as many snippets as you want and use them anywhere throughout your pipelines.
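
A sketch applying the same pattern to the two scripts from the original question (anchor and snippet names are illustrative):

definitions:
  .my-scripts:
    - &my-script-a |
      echo "FIRST"
    - &my-script-b |
      echo "SECOND"

pipelines:
  custom:
    my-custom-pipeline:
      - step:
          script:
            - *my-script-a
            - *my-script-b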

