I've got 200+ repositories which should be built and tested. These repos don't contain software but configuration files, which are also under version management.
The software to build and test them is in a separate repository because it's the same for all 200 repos.
I really don't want to create 200 identical bitbucket-pipelines.yml files because that's not DRY.
I want to be able to run one pipeline, but for each of the 200 repos. So if one repo changes I want to trigger the one pipeline, but I guess I need to be able to tell it which repository has changed.
Is this possible?
If someone comes here via internet research like me: if you are on the Premium plan, shared pipelines are a supported feature now, see https://support.atlassian.com/bitbucket-cloud/docs/share-pipelines-configurations/
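For anyone landing here, the linked feature works roughly like this (a sketch based on the docs above; the repo name `build-tools`, the branch `master`, and the pipeline name `build-and-test` are placeholders):

```yaml
# bitbucket-pipelines.yml in the shared "build-tools" repository
export: true   # allows other repos in the workspace to import these pipelines

definitions:
  pipelines:
    build-and-test:
      - step:
          name: Build and test
          script:
            - ./run-tests.sh
```

```yaml
# bitbucket-pipelines.yml in each consuming repository
pipelines:
  default:
    import: build-tools:master:build-and-test
```

Each of the 200 repos still needs a file, but it's a three-line stub, and all the real logic lives in one place.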
Hi @Hans Pikkemaat,
We have a feature request tracking this here and the work on the spec has started.
I recommend following that and giving any feedback there if you wish!
Cheers,
Davina
Hi,
I think that is a good way to reuse components, but I'm not sure this will prevent creating 200 bitbucket-pipelines.yml files in my use case.
I need one pipeline job which I can trigger when one of the 200 repos changes, and somehow pass the job env vars containing the repo that was changed.
The job will then know the repo that changed, clone it, and run the tests on it.
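Something like the following is what I have in mind, as a rough sketch (the workspace name `myworkspace`, the pipeline name `test-config-repo`, and `run-tests.sh` are placeholders):

```yaml
# bitbucket-pipelines.yml in the central build/test repository (sketch)
pipelines:
  custom:
    test-config-repo:
      - variables:            # supplied by whoever triggers the pipeline
          - name: TARGET_REPO # slug of the config repo that changed
      - step:
          name: Clone the changed repo and test it
          script:
            - git clone "git@bitbucket.org:myworkspace/${TARGET_REPO}.git"
            - cd "${TARGET_REPO}"
            - ../run-tests.sh
```

The pipeline could then be triggered for a specific repo with a POST to the Bitbucket Pipelines REST API (`/2.0/repositories/{workspace}/{repo_slug}/pipelines/`), passing TARGET_REPO in the request's variables list.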
If you have advice on how to do this I would be very happy to hear about it.
kind regards,
Hans
Hi @davina ,
I know it has been a while since this post was created, but by any chance, is there a feature in the works that would save us from creating 200 identical bitbucket-pipelines.yml files, one for each repo, as @Hans Pikkemaat mentioned above (i.e., one pipeline to handle multiple repositories)?
Regards,
Abdul
Not sure this would work for you, but you can create a custom pipeline.
You would create one repo for the custom pipeline that would contain all the logic, and then the other repos would just include a single step calling that custom pipeline.
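For example, each of the 200 repos could contain just a minimal file that calls the shared pipe (a sketch; `myworkspace/config-test-pipe` is a placeholder image name):

```yaml
# bitbucket-pipelines.yml in each config repository (sketch)
pipelines:
  default:
    - step:
        name: Run shared tests
        script:
          - pipe: docker://myworkspace/config-test-pipe:1.0.0
```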
Hello @Swav Swiac
Say a custom pipeline is sufficient. How would one include the custom pipeline from one repo across multiple other repos?
AFAIK, a custom pipe is just a public Docker Hub image that you can create. I think they are also introducing other options for the source of pipes.
Example usage of the Nancy check pipe from one of our repos below:
- step:
    name: Nancy check
    artifacts:
      download: false
    script:
      - pipe: sonatype-community/nancy-scan:0.1.23
        variables:
          NANCY_MODFILE: go.sum
If you have some secret logic in your build, you could use a private Docker image as the build image to encapsulate it all:
https://support.atlassian.com/bitbucket-cloud/docs/use-docker-images-as-build-environments/
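From the linked docs, using a private image as the build environment looks roughly like this (a sketch; the image name and the two repository variables are placeholders you'd define yourself):

```yaml
# bitbucket-pipelines.yml using a private Docker Hub image as the build environment
image:
  name: myaccount/secret-build-tools:latest
  username: $DOCKER_HUB_USERNAME   # set as secured repository variables
  password: $DOCKER_HUB_PASSWORD

pipelines:
  default:
    - step:
        script:
          - build-and-test.sh   # logic baked into the private image
```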
Thanks for the help. This seems to be what I was looking for.