Dynamic bitbucket-pipelines.yml for all branches

techguru101, July 29, 2018

Hi Tech Team,


I am trying to set up automated deployment through the Bitbucket pipeline, but I have not succeeded so far; it may be that my business requirement is not covered by Bitbucket Pipelines.

Current Setup
1 - The dev team pushes the code to the default branch from their local machines, and the team lead reviews their code and updates the UAT and production servers manually by running these commands directly on the server CLI:

# hg branch

# hg pull

# hg update

Automated deployment we want:

1 - We have 3 environments: DEV, UAT/Staging, and Production.
2 - Based on these environments, I have created 3 release branches: DEV-Release, UAT-Release, and PROD-Release respectively.
3 - The dev team pushes code directly to the default branch. The dev lead checks the changes and then creates a pull request from default to the UAT-Release branch; after a successful deployment on the UAT server, they create another pull request from default to the production branch. The pipeline should be executed on the pull request, copy bundle.zip to AWS S3, and then deploy it to the AWS EC2 instance (a rough sketch of such a step follows this list).
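For illustration only, a single deployment step along these lines might look roughly like the sketch below. The bucket, application, and deployment-group names are placeholders (not from this thread), the build image is assumed to have the AWS CLI available, and AWS credentials are assumed to be configured as secured repository variables (AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY):

- step:
    name: Deploy to UAT
    script:
      # Package the build output into the bundle described above.
      - zip -r bundle.zip .
      # Copy the bundle to S3 (requires the AWS CLI in the build image).
      - aws s3 cp bundle.zip s3://my-deploy-bucket/bundle.zip
      # Hand the bundle to CodeDeploy, which rolls it out to the EC2 instances.
      - aws deploy create-deployment --application-name MyApp --deployment-group-name UAT --s3-location bucket=my-deploy-bucket,key=bundle.zip,bundleType=zip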

Issues:

The issue I am facing is that bitbucket-pipelines.yml is not the same on all release branches, because the branch names differ; as a result, whenever we create a pull request for a release branch we get a conflict on that file.
Is there any way I can use the same bitbucket-pipelines.yml file for all branches, with the deployment going to whichever environment the pull request was created for?
Can we make that file dynamic for all branches with environment variables? (A rough sketch of that idea follows these questions.)
If the Bitbucket pipeline cannot fulfill my business requirement, what is the alternative?
If you think my business requirement is not good or justifiable, just let me know what steps I have to change to achieve the final result of automated deployments.
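To make the environment-variable idea concrete, here is a rough sketch only (the script names are placeholders): Bitbucket Pipelines exposes the branch being built in the default variable BITBUCKET_BRANCH, so a single default pipeline could pick the deployment target inside the script:

pipelines:
  default:
    - step:
        name: Build, test and deploy
        script:
          - ./run-builds
          - ./run-tests
          # BITBUCKET_BRANCH is a built-in Pipelines variable holding the branch being built.
          - case "$BITBUCKET_BRANCH" in UAT-Release) ./run-deployment-to-uat;; PROD-Release) ./run-deployment-to-production;; DEV-Release) ./run-deployment-to-dev;; *) echo "No deployment for $BITBUCKET_BRANCH";; esac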

Flow: developer machine pushes to --> Bitbucket default branch ---> lead reviews the code, then creates a pull request to a release branch (UAT, PROD) ---> pipeline is executed and pushes the code to the S3 bucket ----> AWS CodeDeploy ---> EC2 application server.

Waiting for a prompt response.

1 answer

2 votes
Philip Hodder
Atlassian Team
August 1, 2018

 

Hello,

I've got two different ways you can configure Pipelines that should fit your use-case.

The multi-branch use case you've explained seems to follow Git-flow. You can configure deployments across multiple branches as follows:

image: <whatever Docker build image you need>

pipelines:
  branches:
    UAT-Release:
      - step:
          name: Deploy to UAT/Staging
          deployment: staging
          script:
            - ./run-builds
            - ./run-tests
            - ./run-deployment-to-uat
    PROD-Release:
      - step:
          name: Deploy to Production
          deployment: production
          script:
            - ./run-builds
            - ./run-tests
            - ./run-deployment-to-production
    DEV-Release:
      - step: # No deployment tag should be added for dev environments that aren't in a deployment pipeline.
          name: Deploy to DEV
          script:
            - ./run-builds
            - ./run-tests
            - ./run-deployment-to-dev
  default: # Run this if no other rules are matched
    - step:
        name: Build and test
        script:
          - ./run-builds
          - ./run-tests

You need to make sure this bitbucket-pipelines.yml configuration is on all branches. You should avoid having different bitbucket-pipelines.yml files on different branches (unless you're updating your configuration).

Alternatively, you can try a flow that only requires a single PR and is more idiomatic for Bitbucket Pipelines. However, it does require a few more features.

image: <Whatever Docker build image you need>

pipelines:
  branches:
    release: # This pipeline has 3 steps.
      - step:
          name: Build & test
          script:
            - ./run-build
            - ./run-tests
          artifacts: # This will carry across the files you want to deploy to all subsequent steps
            - ~/path/to/files/to/deploy/*
      - step:
          name: Deploy to UAT/Staging
          deployment: staging
          trigger: manual # This step needs to be started on the UI. Remove this line to have it run automatically.
          script:
            - ./deploy-to-uat
      - step:
          name: Deploy to Production
          deployment: production
          trigger: manual
          script:
            - ./deploy-to-production
  default: # If anyone can deploy to DEV, then you can add it to the default pipeline.
    - step:
        name: Build & test
        script:
          - ./run-build
          - ./run-tests
        artifacts:
          - ~/path/to/files/to/deploy
    - step:
        name: Deploy to DEV
        trigger: manual
        script:
          - ./deploy-to-dev

This configuration follows a promotion workflow: you merge all code onto a single branch and then manually promote it through each environment (and you can deploy automatically by removing the `trigger: manual` option).
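For example, dropping the manual gate from the staging step in the configuration above leaves that step to run automatically as soon as the build step finishes:

- step:
    name: Deploy to UAT/Staging
    deployment: staging
    script:
      - ./deploy-to-uat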

Either of those should satisfy your use case. I'd recommend the second solution if you're able to do so, although it is obviously a bit more work to configure. Otherwise, the first will also suffice for your use case.


Thanks,

Phil
