The documentation for "shared pipelines" revolves around an "include"-style mechanism for YAML. What I am trying to work out is the following.
I have a LOT of `definitions.scripts` as anchors (AWS Login, AWS Assume Role, Docker Login, Syft as just a couple of examples ... there are a LOT more!!!).
I have a LOT of `definitions.steps` as anchors which call upon the `definitions.scripts` anchors.
I have many `definitions.pipelines` that define the workflow.
And then I have the actual `pipelines`. Most are just calling those in `definitions.pipelines`.
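For context, the layering I'm describing looks roughly like this (a minimal sketch; the script contents and names are illustrative, not my actual definitions):

```yaml
definitions:
  scripts:
    # reusable script fragments, anchored for reuse
    aws-login: &aws-login |
      aws sts get-caller-identity
    docker-login: &docker-login |
      docker login -u "$DOCKER_USER" -p "$DOCKER_PASS"
  steps:
    # steps that call upon the script anchors
    - step: &docker-build
        name: Docker build
        script:
          - *aws-login
          - *docker-login
          - docker build -t "$IMAGE" .
  pipelines:
    # workflows assembled from the step anchors
    docker-release: &docker-release
      - step: *docker-build

pipelines:
  branches:
    # the actual pipelines mostly just alias definitions.pipelines
    main: *docker-release
```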
My goal is to NOT have all of this copied across multiple repositories. Instead, I'd like a simpler setup: set a few variables within the repo's own `bitbucket-pipelines.yml` file and then include the relevant library (php, golang, docker, terraform, etc.) which has the relevant pipelines defined in it.
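If I've read the shared-pipelines docs correctly, the consuming side would look something like this (a sketch; the repository name, ref, and pipeline name are placeholders for whatever the exporting repo actually defines):

```yaml
# consuming repo's bitbucket-pipelines.yml
pipelines:
  custom:
    release:
      # import: {repo}:{branch-or-tag}:{exported-pipeline-name}
      # the source repo must set "export: true" and define
      # the named pipeline under definitions.pipelines
      import: shared-pipelines:main:docker-release
```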
The way the documentation for "shared pipelines" is set up, it is ONLY the `definitions.pipelines` that get exported/imported/shared; there is, seemingly, no mention of `definitions.scripts` / `definitions.steps`.
It is probably me not understanding things, so I'm REALLY sorry about this.
If the answer is that imported elements are effectively "namespaced" by the name they were given on import, then that's great! I could then import `docker-pipeline.yml` and, internally, all the anchors there would stay "local", but whenever called from the importer, the appropriate "namespace" would be used.
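One thing I believe is relevant here (please correct me if I'm wrong): per the YAML spec, anchors and aliases are resolved within a single file at parse time, so by the time a pipeline is exported or imported, the aliases have already been expanded. In that sense the anchors really are "local" to the file that defines them:

```yaml
# anchors/aliases never cross file boundaries; the alias
# below is expanded into the step before any import happens
definitions:
  scripts:
    greet: &greet echo "hello"

pipelines:
  default:
    - step:
        script:
          - *greet   # parsed as: echo "hello"
```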
What about nested includes?
So, for example, AWS Login is used pretty much everywhere, but it is also a dependency of the next layer of pipeline files (both docker and terraform use AWS Login).
Considering the whole point is about reusability and, hopefully/ideally, 1 file = 1 responsibility, nested includes would be a REALLY nice feature.
Again, it is probably me not understanding this all.
`definitions: pipelines`
So ... initially ... I thought of having separate "types" of pipelines (so those dealing with docker have nothing to do with Terraform, for example).
But if the point is to "hide away" the giant pipeline files and have them as potentially versioned pipelines ... pipes would be one alternative, but that's a LOT of setup for what is just a script.
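For comparison, this is what the pipe route looks like from the consumer side (a sketch; `atlassian/aws-ecr-push-image` is a real Atlassian pipe, but the version tag and variable values here are illustrative):

```yaml
pipelines:
  default:
    - step:
        script:
          # a pipe is a reusable, versioned step published as a Docker image;
          # nice for consumers, but each one needs its own repo, Dockerfile,
          # entrypoint script, and pipe.yml metadata on the producer side
          - pipe: atlassian/aws-ecr-push-image:2.4.2
            variables:
              IMAGE_NAME: my-app
```

That packaging overhead is exactly the "LOT of setup" I mean when the underlying logic is a five-line shell script.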
... OK ... more to consider and setup and test out.
Thank you!