Thoughts on shared pipelines and their future

Aurimas N_
Contributor
June 25, 2024

It seems this is the only documentation regarding shared pipelines:

  1. https://support.atlassian.com/bitbucket-cloud/docs/share-pipelines-configurations/
  2. https://confluence.atlassian.com/bbkb/how-to-set-up-a-shared-bitbucket-pipeline-for-the-main-branch-1295385940.html

And they are basically the same.

Browsing the community questions, I am slowly coming to the conclusion that the documentation is scarce because these articles cover basically everything shared pipelines can do, which is very limited indeed:

- no way to reference anchors from an imported pipeline

- no way to mix and match (merge) stages/steps from an imported pipeline with your local pipeline segments

- no way to parameterize the exporting pipeline

- no way to add scripts (or any other dependencies) to the exporting pipeline so they are available to importing repos
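For reference, the whole mechanism those two articles describe amounts to this: the exporting repository marks its configuration as shareable, and an importing repository pulls in an entire pipeline definition by reference. A minimal sketch (the repo and pipeline names here are hypothetical):

```yaml
# bitbucket-pipelines.yml in the exporting repo (e.g. "pipelines-templates")
export: true

definitions:
  pipelines:
    node-testing:
      - step:
          name: Run tests
          script:
            - npm ci
            - npm test
```

```yaml
# bitbucket-pipelines.yml in an importing repo:
# the whole pipeline is imported as one opaque unit
# (format is <repo>:<branch-or-tag>:<pipeline-name>)
pipelines:
  default:
    import: pipelines-templates:main:node-testing
```

Note that the import replaces the whole pipeline definition; there is no way to combine imported steps with local ones, which is exactly the limitation listed above.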


Most of the questions regarding shared pipelines have dynamic pipelines offered as a solution.
Is there a future for shared pipelines, or is this their final form, with all improvements and new features to be released for dynamic pipelines? In other words, is it worth bothering with shared pipelines?

Answer accepted
Edmund Munday
Atlassian Team
July 7, 2024

Hi @Aurimas N_ - thanks for the question, you raise a bunch of really excellent points.

To put it plainly, we're very aware of the gaps that exist right now in terms of Pipeline composability and reusability, and it's going to be one of the top focus-areas for us over the next 12 months.

---

Most of the questions regarding shared pipelines have dynamic pipelines offered as a solution.

Yes, you are 100% correct, but there is some method to the madness.

Dynamic Pipelines were effectively built to be your "big red button" or "escape hatch" when it comes to Pipeline composition & reuse, so that you at least have the option of never being constrained by the off-the-shelf native features that are available at a particular point in time. They are a little bit more involved than a purely native solution (although a LOT less involved than most people think on first impressions), but they are almost infinitely flexible and allow you to do... close to anything you can possibly think of.

The logic here is that we knew we had a lot of ground to cover, and something like Dynamic Pipelines allowed us to provide a single "imperfect solution for every problem" in the immediate term while we build out a set of more targeted, "single-problem" focussed capabilities that will fill the gaps between .yaml imports at one end of the scale, and Dynamic Pipelines at the other end of the scale.

---

The next major change coming in this space is going to be what we're internally referring to as "Pipelines of Pipelines", or "Parent/Child Pipelines" depending on how you look at it. This will basically be an evolution of the existing .yaml import functionality, designed to allow you to break your Pipelines up into smaller reusable modules, then import them in chunks to "compose" together larger more complex Pipeline workflows.

For example, you might have a standardised "Node.js Testing" workflow that is defined in a .yaml file in a repo somewhere. That workflow may be a single step, or it may be multiple steps, it's totally up to you.

The plan is for you to be able to import that Workflow into another Pipeline and execute it as if it were a single step. This will also include a bunch of other changes, like support for multiple .yaml files in a given repo, as well as other composition capabilities.

---

Now, just to loop back to what I was saying before, this is a great example of something you can actually already achieve using Dynamic Pipelines plus something like Labels in your Pipelines .yaml file, but obviously it's a bit more involved than something simple like:

step:
  import: pipelines-templates:main:node-testing

which is a rough approximation of what we're ultimately aiming for. 

---

Would love to hear your thoughts, understand some more details on your use-cases, and get your feedback.

I'd also love to know if what I'm saying re: "this already being possible with Dynamic Pipelines" makes sense. E.g. are we doing a good enough job of showing people what's possible, do we need to publish more examples, etc.

Aurimas N_
Contributor
July 25, 2024

Great to hear about these plans; the mentioned capabilities can't come soon enough.

Regarding dynamic pipelines, I try to avoid them if I can, mostly because they add another layer of complexity, and also because they don't always solve my problem. For example, no matter how dynamic it is, in the end I still can't have two stages in a pipeline with the same deployment environment (can I?).

Sometimes it just feels like a poor workaround for a given problem, for example not being able to specify `runs-on` for a whole stage or pipeline. I could dynamically add `runs-on` to every step, but that really doesn't feel like a great solution.
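To illustrate the `runs-on` point: this is what the repetition looks like today in plain YAML, whether written by hand or injected by a dynamic pipeline (step names and scripts here are just placeholders):

```yaml
pipelines:
  default:
    # runs-on must currently be repeated on every step;
    # it cannot be declared once for a whole stage or pipeline
    - step:
        name: Build
        runs-on:
          - self.hosted
          - linux
        script:
          - ./build.sh
    - step:
        name: Test
        runs-on:
          - self.hosted
          - linux
        script:
          - ./test.sh
```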

And then, if you try to achieve everything via dynamic pipelines, it feels a bit counterproductive, as if I were writing my own tooling for capabilities that are generally expected in a CI/CD product.

 

As for the use cases, they are pretty much what you mention: having a step or a stage imported from a "shared" pipeline, and being able to parameterize it. For example, I might have a "docker build" step I share across repositories, but I want to be able to provide different arguments to the `docker build` command in each repo that uses that step.
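The closest thing to this available today is referencing a variable in the shared step and defining a different value as a repository variable in each consuming repo. This assumes the imported pipeline resolves variables in the importing repo's context, and `DOCKER_BUILD_ARGS` is a hypothetical variable name; it is a workaround rather than true step parameterization:

```yaml
# In the shared definition: the step reads a variable
# that each consuming repo defines for itself under
# Repository settings > Repository variables
- step:
    name: Docker build
    services:
      - docker
    script:
      - docker build $DOCKER_BUILD_ARGS -t my-image .
```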

