Dynamic bitbucket-pipeline.yml for all branches

techguru101 Jul 29, 2018

Hi Tech Team,


I am trying to set up automated deployment through Bitbucket Pipelines, but I have not succeeded so far; it may be that my business requirement cannot be met by Bitbucket Pipelines.

Current Setup
1. The dev team pushes code to the default branch from their local machines. The team lead reviews their code and then updates the UAT and production servers manually by running commands directly on each server's CLI:

# hg branch
# hg pull
# hg update

Automated deployment we want:

1. We have 3 environments: DEV, UAT/Staging, and Production.
2. Based on those environments I have created 3 release branches: DEV-Release, UAT-Release, and PROD-Release respectively.
3. The dev team pushes code directly to the default branch. The dev lead reviews the changes and creates a pull request from default to the UAT-Release branch. After a successful deployment to the UAT server, they create another pull request from default to the production branch. The pipeline should run on the pull request, copy bundle.zip to AWS S3, and then deploy it to the AWS EC2 instance.

Issues:

The issue I am facing is that bitbucket-pipelines.yml is not the same on all release branches, because the branch names differ; as a result, whenever we create a pull request to any release branch we get a merge conflict on that file.
Is there any way I can use the same bitbucket-pipelines.yml file on all branches, with the deployment going to whichever environment the pull request targets?
Can we make that file dynamic for all branches with environment variables?
If Bitbucket Pipelines cannot meet this business requirement, what other solution is there?
If you think my business requirement is not sound or justifiable, just let me know what I should change to achieve the final result of automated deployments.

Flow: developer machine pushes to → Bitbucket default branch → lead reviews the code, then opens a pull request to a release branch (UAT, PROD) → pipeline runs and pushes the bundle to the S3 bucket → AWS CodeDeploy → EC2 application server.

Waiting for your prompt response.

1 answer

2 votes
Philip Hodder Atlassian Team Aug 01, 2018

 

Hello,

I've got two different ways you can configure Pipelines that should fit your use-case.

The multi-branch use-case you've explained seems to follow Git-flow. You can configure deployments across multiple branches as follows:

image: <whatever Docker build image you need>
pipelines:
  branches:
    UAT-Release:
      - step:
          name: Deploy to UAT/Staging
          deployment: staging
          script:
            - ./run-builds
            - ./run-tests
            - ./run-deployment-to-uat
    PROD-Release:
      - step:
          name: Deploy to Production
          deployment: production
          script:
            - ./run-builds
            - ./run-tests
            - ./run-deployment-to-production
    DEV-Release:
      - step: # No deployment tag should be added for dev environments that aren't in a deployment pipeline.
          name: Deploy to DEV
          script:
            - ./run-builds
            - ./run-tests
            - ./run-deployment-to-dev
  default: # Run this if no other rules are matched
    - step:
        name: Build and test
        script:
          - ./run-builds
          - ./run-tests

You need to make sure this bitbucket-pipelines.yml configuration is on all branches. You should avoid having different bitbucket-pipelines.yml files on different branches (unless you're updating your configuration).
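Since the file is plain YAML, standard YAML anchors (which Pipelines accepts) can also cut down the duplication between those branch pipelines, so the identical file you carry across branches stays small. A sketch using the same placeholder script names as above:

```yaml
# Sketch: define the shared build/test step once with a YAML anchor,
# then merge it into each branch pipeline with the <<: merge key.
definitions:
  steps:
    - step: &build-test
        name: Build and test
        script:
          - ./run-builds
          - ./run-tests

pipelines:
  default:
    - step: *build-test
  branches:
    UAT-Release:
      - step:
          <<: *build-test
          name: Deploy to UAT/Staging
          deployment: staging
          script: # Overrides the anchored script list entirely
            - ./run-builds
            - ./run-tests
            - ./run-deployment-to-uat
```

Note that merging with `<<:` replaces whole keys rather than appending to lists, so the deploy step still repeats the build/test commands in its own `script`.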

Alternatively, you can try a flow that only requires a single PR and is more idiomatic for Bitbucket Pipelines. However, it relies on a few more features.

image: <whatever Docker build image you need>
pipelines:
  branches:
    release: # This pipeline has 3 steps.
      - step:
          name: Build & test
          script:
            - ./run-build
            - ./run-tests
          artifacts: # This will carry across the files you want to deploy to all subsequent steps
            - ~/path/to/files/to/deploy/*
      - step:
          name: Deploy to UAT/Staging
          deployment: staging
          trigger: manual # This step needs to be started from the UI. Remove this line to have it run automatically.
          script:
            - ./deploy-to-uat
      - step:
          name: Deploy to Production
          deployment: production
          trigger: manual
          script:
            - ./deploy-to-production
  default: # If anyone can deploy to DEV, then you can add it to the default pipeline.
    - step:
        name: Build & test
        script:
          - ./run-build
          - ./run-tests
        artifacts:
          - ~/path/to/files/to/deploy
    - step:
        name: Deploy to DEV
        trigger: manual
        script:
          - ./deploy-to-dev

This configuration follows a promotion workflow: you merge all code onto a single branch and then manually promote it through each environment in turn (or deploy automatically by removing the `trigger: manual` option).

Either of those should satisfy your use-case. I'd recommend the second solution if you're able to adopt it, though it is a bit more work to configure. Otherwise, the first will also suffice.
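On your question about making a single file dynamic with environment variables: every pipeline run exposes a built-in BITBUCKET_BRANCH variable, so the per-branch deploy scripts above could also collapse into one shared script that switches on it. A sketch, where the bucket and application names are hypothetical placeholders and the AWS calls are left commented out:

```shell
#!/bin/sh
# Sketch: one deploy script shared by every branch, keyed off the
# built-in $BITBUCKET_BRANCH variable Pipelines sets on each run.
set -e

# Map a release branch name to its deployment environment.
target_env() {
  case "$1" in
    DEV-Release)  echo dev ;;
    UAT-Release)  echo staging ;;
    PROD-Release) echo production ;;
    *)            echo none ;;
  esac
}

deploy() {
  env="$(target_env "${BITBUCKET_BRANCH}")"
  if [ "$env" = none ]; then
    echo "No deployment configured for ${BITBUCKET_BRANCH}"
    return 0
  fi
  echo "Deploying bundle.zip to ${env}"
  # Hypothetical AWS steps, assuming credentials are set as repository variables:
  # aws s3 cp bundle.zip "s3://my-deploy-bucket/${env}/bundle.zip"
  # aws deploy create-deployment --application-name "my-app-${env}" ...
}
```

With this, the same bitbucket-pipelines.yml and the same script can live on every branch, and only the branch name decides where the bundle goes.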


Thanks,

Phil
