I've got 200+ repositories that need to be built and tested. These repos don't contain software but configuration files, which are also under version management.
The software to build and test them lives in a separate repository, because it's the same for all 200 repos.
I really don't want to create 200 identical bitbucket-pipelines.yml files, because that's not DRY.
I want to be able to run one pipeline, but for each of the 200 repos. So if one repo changes, I want to trigger the one pipeline, but I guess I need to be able to tell it which repository has changed.
Is this possible?
Hi @Hans Pikkemaat,
We have a feature request tracking this here and the work on the spec has started.
I recommend following that and giving any feedback there if you wish!
Cheers,
Davina
Hi,
I think that is a good way to reuse components, but I'm not sure it will prevent creating 200 bitbucket-pipelines.yml files in my use case.
I need one pipeline job that I can trigger when one of the 200 repos changes, and somehow pass the job env vars containing the repo that was changed.
The job will then know the repo that changed, clone it, and run the tests on it.
If you have advice on how to do this, I would be very happy to hear it.
Kind regards,
Hans
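One approach that fits what Hans describes (not confirmed anywhere in this thread) is to trigger a custom pipeline in the central repo through the Bitbucket Cloud REST API, passing the changed repo as a pipeline variable. A minimal sketch; the workspace, repo slug, `process-repo` pipeline name, and `CHANGED_REPO` variable are all hypothetical placeholders:

```shell
# Hypothetical: trigger a custom pipeline named "process-repo" in a central
# repo, passing the changed repo's name as a pipeline variable.
CHANGED_REPO="config-repo-042"

# Build the request body for POST /2.0/repositories/{workspace}/{repo_slug}/pipelines/
PAYLOAD=$(cat <<EOF
{
  "target": {
    "type": "pipeline_ref_target",
    "ref_type": "branch",
    "ref_name": "main",
    "selector": { "type": "custom", "pattern": "process-repo" }
  },
  "variables": [
    { "key": "CHANGED_REPO", "value": "${CHANGED_REPO}" }
  ]
}
EOF
)
echo "$PAYLOAD"

# The actual call (WORKSPACE, central-pipeline-repo, and the app password
# are placeholders); left commented out since it needs real credentials:
# curl -X POST -u "user:APP_PASSWORD" \
#   -H "Content-Type: application/json" \
#   -d "$PAYLOAD" \
#   "https://api.bitbucket.org/2.0/repositories/WORKSPACE/central-pipeline-repo/pipelines/"
```

The central pipeline can then read `$CHANGED_REPO` in its script, clone that repo, and run the shared tests against it.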
Hi @davina ,
I know it has been a while since this post was created, but by any chance, is there a feature in the works that would spare us from creating 200 identical bitbucket-pipelines.yml files, one per repo, as @Hans Pikkemaat mentioned above (i.e., one pipeline to handle multiple repositories)?
Regards,
Abdul
Not sure this would work for you, but you can create a custom pipeline.
You would create one repo for the custom pipeline that would contain all the logic, and then the other repos would each just include a single step, the custom pipeline.
Hello @Swav Swiac,
Suppose a custom pipeline is sufficient. How would one include the custom pipeline from that one repo across multiple other repos?
AFAIK, the custom pipeline is just a public Docker Hub image that you could create. I think they are also introducing other options for the source of pipelines.
Example usage for Nancy Check pipeline from one of our repos below:
- step:
    name: Nancy check
    artifacts:
      download: false
    script:
      - pipe: sonatype-community/nancy-scan:0.1.23
        variables:
          NANCY_MODFILE: go.sum
If you have some secret logic in your build, you could use a private Docker image as the build image to encapsulate it all:
https://support.atlassian.com/bitbucket-cloud/docs/use-docker-images-as-build-environments/
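Following that docs page, a minimal sketch of what this could look like; the image name, credential variable names, and the `run-shared-tests.sh` script are hypothetical placeholders, with the shared build logic assumed to be baked into the image:

```yaml
# Hypothetical bitbucket-pipelines.yml using a private Docker image as the
# build environment; credentials come from repository/workspace variables.
image:
  name: mycompany/shared-build-tools:1.0
  username: $DOCKER_HUB_USERNAME
  password: $DOCKER_HUB_PASSWORD

pipelines:
  default:
    - step:
        name: Run shared tests
        script:
          - run-shared-tests.sh   # tooling shipped inside the private image
```

Each of the 200 repos would then carry only this small file, while the actual build and test logic lives in the image.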