What can and cannot be imported using the concept of "shared pipelines"?

Richard Quadling
Contributor
April 13, 2026

The documentation for "shared pipelines" is built around the idea of an "include" in YAML.

What I am trying to work out is the following.

I have a LOT of `definitions.scripts` as anchors (AWS Login, AWS Assume Role, Docker Login, Syft as just a couple of examples ... there are a LOT more!!!).

I have a LOT of `definitions.steps` as anchors which call upon the `definitions.scripts` anchors.

I have many `definitions.pipelines` that define the workflow.

And then I have the actual `pipelines`. Most are just calling those in `definitions.pipelines`.
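To make the layers concrete, here is a minimal sketch of the kind of structure described above. All the names (`aws-login`, `docker-build`, `docker-release`, `$IMAGE`) are illustrative, not taken from my actual files:

```yaml
definitions:
  scripts:
    # anchor holding a reusable script fragment (contents hypothetical)
    - &aws-login aws ecr get-login-password | docker login --username AWS --password-stdin "$ECR_REGISTRY"
  steps:
    # anchored step that reuses the script anchor
    - step: &docker-build
        name: Build image
        script:
          - *aws-login
          - docker build -t "$IMAGE" .
  pipelines:
    # anchored workflow built from the anchored steps
    docker-release: &docker-release
      - step: *docker-build

pipelines:
  branches:
    # the actual pipeline just references the defined workflow
    main: *docker-release
```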

My goal is to NOT have this copied across multiple repositories. Instead, I want a simpler setup: each repo sets some variables in its own `bitbucket-pipelines.yml` file and then includes the relevant library (php, golang, docker, terraform, etc.) which has the relevant pipelines defined in it.
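In other words, the consuming repo's `bitbucket-pipelines.yml` would shrink to something like the sketch below. The repo name, ref, and pipeline name are assumptions for illustration, and the `import` syntax is my reading of the shared-pipelines docs, so please verify against them:

```yaml
# Hypothetical consuming repo: pull in a pipeline shared from another repo.
# Format (per the docs, as I understand it): {repo-slug}:{ref}:{pipeline-name}
pipelines:
  default:
    import: shared-build-configs:main:docker-release
```

Repo-specific values would then come from repository or deployment variables rather than copied YAML.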

The way the documentation for "shared pipelines" is set up, it is ONLY the `definitions.pipelines` that get exported/imported/shared; there is, seemingly, no mention of `definitions.scripts` / `definitions.steps`.


It is probably me not understanding things, so I'm REALLY sorry about this.

If the issue is that imported elements are effectively "namespaced" by the name that they were given on the import, then that's great!

I could then import docker-pipeline.yml and, internally, all the anchors there would be "local"; but if ever called from the importer, the appropriate "namespace" would be used.

What about nested includes?

So, for example, the AWS login is used sort of everywhere, but it is a dependency of the next layer of pipeline files (so AWS Login is used by docker and terraform).

Considering the whole point is about reusability and, hopefully/ideally, 1 file = 1 responsibility, nested includes would be a REALLY nice feature.

Again, it is probably me not understanding this all.

1 answer

0 votes
Ajay _view26_
Community Champion
April 13, 2026

Hi @Richard Quadling

For the structure you described, the usual pattern would be:
1. Put reusable end-to-end pipeline definitions in the shared file under `definitions.pipelines`
2. Keep script fragments as YAML anchors or wrap them inside reusable steps/pipelines
3. Import the pipeline you want in each repo, and feed repo-specific values through variables/secrets
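A rough sketch of what the shared file might look like under that pattern. This is my understanding of the feature, not a verified config: the `export: true` key and the pipeline name `docker-release` are assumptions, so check the shared-pipelines documentation for the exact syntax:

```yaml
# Shared repo's bitbucket-pipelines.yml (sketch)
export: true   # marks the definitions.pipelines in this file as shareable

definitions:
  pipelines:
    docker-release:
      - step:
          name: Build and push
          script:
            # anchors/script fragments stay internal to this file;
            # only the named pipeline is what consumers import
            - echo "Building $IMAGE for $BITBUCKET_REPO_SLUG"
```

The anchors remain a private implementation detail of the shared file; consumers only ever reference the exported pipeline names.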

Richard Quadling
Contributor
April 14, 2026

So ... initially ... I thought of having separate "types" of pipelines (so those dealing with Docker have nothing to do with Terraform, for example).

But if the point is to "hide away" the giant pipeline files and have them as potentially versioned pipelines ... pipes would be one alternative, but that's a LOT of setup for what is just a script.

... OK ... more to consider and setup and test out.

Thank you!
