Is there a way to have the build and deployment run separately in Bitbucket Pipelines?

Kevin Chu August 6, 2020

Hello all, 

I am quite new to Bitbucket Pipelines, and while exploring it I came across a few questions.

The bitbucket-pipelines.yml file I have at the moment does the build, pushes the image, and deploys the image into our GCP account. That is all done in one configuration file.

Is there a way for me to segregate the build and the deployment? I mean one YAML config that only does the build and push, while another YAML runs the deployment. Also, I can see that we need to keep the build YAML file in the repo with the code. Can we have the pipeline YAML files in a different repo? I do not want anyone who can pull the code down to their PC or server to know how the build and deployment process works.

1 answer

0 votes
ktomk (Rising Star)
August 10, 2020

There are many options, actually, and it mostly depends on how you configure it. In general, to get things running it's fine to have everything in one pipeline at first, but as you suggest, it often becomes useful quite quickly to separate things.

As every project is different, there are no "just do this and it works" kinds of answers. However, what is most often sane to suggest is to separate packaging from deploying. And depending on which kind of packaging this is about, I normally also suggest to build first, then package, and then, at last, do the deployment (whatever that is).

So how would this work in/with Bitbucket Pipelines?

For example, if you have a build step, the outcome of the build can be an artifact. The artifact is carried over to the next step. See Use artifacts in steps (Bitbucket Support).
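To sketch what that can look like: below is a minimal bitbucket-pipelines.yml with two steps, where the first declares its build output as an artifact. The Node image, the npm commands and the dist/ path are only assumptions for illustration.

```yaml
# Minimal sketch (build tool and paths assumed): the Build step declares
# dist/** as an artifact, which Pipelines carries into the following step.
image: node:18

pipelines:
  default:
    - step:
        name: Build
        script:
          - npm ci
          - npm run build          # assumed build command
        artifacts:
          - dist/**                # files handed over to later steps
    - step:
        name: Use build output
        script:
          - ls dist/               # the artifact from the Build step is available here
```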

For packaging it's similar: you can then make the package an artifact. If you package with Docker containers directly (e.g. building a Docker image), this normally means running docker build and pushing the (tagged) result to the Docker registry you're using (most cloud providers have one; I'm not entirely fluent with GCP, but I'm pretty sure they have one).
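As a rough sketch of such a packaging step (continuing the default pipeline above), assuming Google Container Registry as the target and a service-account key stored as a repository variable; all names here are placeholders:

```yaml
    # Further step in the same default pipeline (sketch; registry and variables assumed)
    - step:
        name: Package (docker image)
        services:
          - docker                 # enables the Docker daemon for this step
        script:
          # Assumed repository variables: GCR_JSON_KEY (service-account key), GCP_PROJECT
          - echo "$GCR_JSON_KEY" | docker login -u _json_key --password-stdin https://gcr.io
          - docker build -t gcr.io/$GCP_PROJECT/my-app:$BITBUCKET_COMMIT .
          - docker push gcr.io/$GCP_PROJECT/my-app:$BITBUCKET_COMMIT
```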

Then deployment is the last step. As you have the container image tagged already, this is normally straightforward.
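The deployment step then only needs the tag that was pushed above. The gcloud commands here are just one example, assuming Cloud Run; replace them with whatever deploys in your setup:

```yaml
    # Last step in the same default pipeline (sketch; deployment target assumed)
    - step:
        name: Deploy
        deployment: production     # ties the step to a Pipelines deployment environment
        image: google/cloud-sdk:slim
        script:
          - echo "$GCR_JSON_KEY" > /tmp/key.json
          - gcloud auth activate-service-account --key-file /tmp/key.json
          # Example only: deploy the image tagged with the current commit
          - gcloud run deploy my-app --image gcr.io/$GCP_PROJECT/my-app:$BITBUCKET_COMMIT --region europe-west1 --project $GCP_PROJECT
```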

However, don't get me wrong: if doing everything in one step works perfectly for you, there is no technical requirement to split it up. When does splitting up make the most sense, then?

If your build is deterministic for a given revision and you break things up into smaller steps, you can cache these intermediate results and benefit from that.
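In bitbucket-pipelines.yml this typically shows up as caches on a step; besides the built-in caches you can define custom ones. A small sketch (the build/ path and the npm commands are assumptions):

```yaml
definitions:
  caches:
    build-output: build/           # custom cache; the path is an assumption

pipelines:
  default:
    - step:
        name: Build
        caches:
          - node                   # built-in cache for ~/.npm
          - build-output           # reuse previous build output across runs
        script:
          - npm ci
          - npm run build
```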

From my perspective, having the application build as the first, separate step normally gives the greatest benefit. Having per-revision packages that can then be further processed with container building and the rest of the deployment makes building, packaging and deploying more manageable and flexible.

Decide for yourself whether or not you need that kind of flexibility. Address one issue after the other in your build pipeline, starting with the one that bothers you most. Divide and conquer instead of just asking for the latest, greatest, most awesome thing there is; stay in control and keep things as simple as possible.

Kevin Chu August 10, 2020

Hello Ktomk,

 

Regarding the starting point, I have already managed that, and adding steps to the build pipeline is not an issue here.

 

I want to separate the build and deploy pipeline from the code. What I mean is that there is a repo for the code and another repo where the build/deploy pipeline configs live. When there is a pull request on the code repo, it triggers the build and deploy repo to run the pipeline.

ktomk (Rising Star)
August 11, 2020

Hello @Kevin Chu

This still sounds like artifacts, but you need a way to share them, so that one repository can upload them and the other repository can download them.

If the revision is part of the naming scheme of these artifacts, it is easy to trigger the deployment pipeline, pass along the revision (you can pass variables with the trigger), and have the deployment pipeline fetch the matching artifact.
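On the deployment repository's side this could be a custom pipeline that accepts the revision as a variable; the names below are assumptions, only the custom/variables mechanism itself is Pipelines syntax:

```yaml
# bitbucket-pipelines.yml in the deployment repository (sketch)
pipelines:
  custom:
    deploy-revision:
      - variables:
          - name: REVISION         # passed in by whoever triggers this pipeline
      - step:
          name: Deploy given revision
          script:
            - echo "Deploying revision $REVISION"
            # fetch the artifact/image tagged with $REVISION and deploy it here
```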

Would that fulfill your needs or at least give some inspiration?

ktomk (Rising Star)
September 8, 2020

If you want to do this with multiple repositories, you may be interested in the trigger-pipeline pipe.
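Roughly, the code repository could use the pipe to kick off the custom pipeline in the deployment repository and pass the revision along. Check the pipe's README for the current version and exact variable names; the values below are placeholders:

```yaml
    - step:
        name: Trigger deployment repository
        script:
          - pipe: atlassian/trigger-pipeline:5.0.0    # version: see the pipe's README
            variables:
              BITBUCKET_USERNAME: $TRIGGER_USERNAME         # app-password credentials, stored as repository variables
              BITBUCKET_APP_PASSWORD: $TRIGGER_APP_PASSWORD
              REPOSITORY: 'deployment-repo'                 # assumed slug of the repo holding the deploy pipeline
              CUSTOM_PIPELINE_NAME: 'deploy-revision'
              PIPELINE_VARIABLES: >
                [{ "key": "REVISION", "value": "$BITBUCKET_COMMIT" }]
```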
