Pipeline with multiple scripts runs only the last one (broken)


The following pipeline used to work perfectly fine until a few days ago, when it suddenly started executing only the last "script" key in the step.

pipelines:
  custom:
    my-custom-pipeline:
      - step:
          script:
            - echo "FIRST"
          script:
            - echo "SECOND"

In the above example, when my-custom-pipeline is run, only "SECOND" is echoed. If I remove the second script key, "FIRST" is echoed.

This is a fairly new issue as my pipeline of multiple scripts (anchored scripts, in fact) used to run flawlessly. Did someone put a bug in the YAML parser without documenting it? ;)

Answer accepted

Hi Mostafa,

I'm not aware of any recent change to the parsing logic that would explain this change in behaviour. However, I can say that we never intended to support duplicate map keys in a deterministic way, so I'd highly recommend updating your pipeline definitions to avoid duplicate keys.
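For example, a minimal sketch of that change (assuming the intent is simply to run both commands in the same step) merges the two duplicate keys into a single "script" list:

pipelines:
  custom:
    my-custom-pipeline:
      - step:
          script:
            - echo "FIRST"
            - echo "SECOND"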

Kind Regards,
Steven Vaccarella

Thanks, Steven. What about supporting multiple anchored scripts, like below?

definitions:
  script: &My-Script-A
    - ...
  script: &My-Script-B
    - ...

pipelines:
  custom:
    my-custom-pipeline:
      - step:
          script: *My-Script-A
          script: *My-Script-B

The above pipeline used to run just fine, but now only the last script gets executed.

If this is non-deterministic, is there a way to make the keys unique and achieve the same result?

To my knowledge, there is no robust way to do what you're trying to do directly in the YAML file.

Our recommendation for reusing scripts is to move them to separate script files in your repo and then invoke the required scripts from a single "script" section in the YAML, i.e.:

pipelines:
  custom:
    my-custom-pipeline:
      - step:
          script:
            - 'pipeline-scripts/my-script-a.sh'
            - 'pipeline-scripts/my-script-b.sh'
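The file names above are placeholders. A minimal sketch of what one such script file might contain (the path and commands are illustrative, not from the original answer), committed to the repo with its executable bit set:

#!/bin/bash
# pipeline-scripts/my-script-a.sh  (hypothetical example)
set -e            # abort the step if any command fails
echo "FIRST"      # put the commands you want to reuse here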

 


Posting this here so that the next person can find it right away. The above didn't work for me as written; I had to use the following:

pipelines:
  default:
    - step:
        script:
          - ./pipeline-scripts/common-build-test.sh
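One likely reason for this: the "./" prefix is needed because the scripts directory is not on the PATH. If the file was also committed without its executable bit, the step can make it executable first, or call the interpreter explicitly with `bash pipeline-scripts/common-build-test.sh`. A sketch, assuming a typical Linux build image:

pipelines:
  default:
    - step:
        script:
          # ensure the script is executable, then run it by path
          - chmod +x pipeline-scripts/common-build-test.sh
          - ./pipeline-scripts/common-build-test.sh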

Another solution more closely resembling the initial request for anchored scripts would be the following:

definitions:
  .install-gcloud-sdk:
    - &install-gcloud-sdk |
      wget https://dl.google.com/dl/cloudsdk/release/google-cloud-sdk.tar.gz
      tar zxvf google-cloud-sdk.tar.gz && ./google-cloud-sdk/install.sh --quiet --usage-reporting=false --path-update=true
      PATH="google-cloud-sdk/bin:${PATH}"
      gcloud --quiet components update

pipelines:
  default:
    - step:
        name: Do Something with the Google Cloud SDK
        script:
          - *install-gcloud-sdk
          - ...the rest of your script

That will run all the commands in the `install-gcloud-sdk` snippet wherever you reference it. You can define as many snippets as you like and reuse them anywhere throughout your pipelines.
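Following that pattern, the original My-Script-A / My-Script-B request could be written with two anchored snippets referenced from a single "script" list. A sketch along the same lines (the snippet names and commands are illustrative):

definitions:
  .reusable-scripts:
    - &My-Script-A |
      echo "FIRST"
    - &My-Script-B |
      echo "SECOND"

pipelines:
  custom:
    my-custom-pipeline:
      - step:
          script:
            - *My-Script-A
            - *My-Script-B

Because both aliases sit under a single "script" key, there are no duplicate map keys and both snippets run in order.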

