How to keep DRY with Bitbucket Pipelines?

Illia Ananich October 11, 2016

I want to make some parts of the scripts declared in `bitbucket-pipelines.yml` reusable, so they can be executed multiple times.

Example:

The project has `master` and `develop` branches. When there is a new commit in the `develop` branch, I need to execute some commands and run tests. When there is a new commit in the `master` branch, I need to run the same commands as for `develop`, but also execute additional deployment scripts.

Is there an elegant solution for this problem?
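For illustration, the duplicated configuration might look like this (a sketch; the image and the `npm`/`./deploy.sh` commands are placeholders, not taken from the question):

```yaml
image: node:4.6.0

pipelines:
  branches:
    develop:
      - step:
          script:
            - npm install   # same commands duplicated here...
            - npm test
    master:
      - step:
          script:
            - npm install   # ...and here again,
            - npm test
            - ./deploy.sh   # plus the extra deployment step
```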

1 answer

Answer accepted
Sten Pittet
Rising Star
October 11, 2016

Hi @Illia Ananich,

I'd recommend moving your script into a separate file that you can call from your pipeline. In your case, I think it would look like this:

# This is a sample build configuration for Javascript.
# Only use spaces to indent your .yml configuration.
# -----
# You can specify a custom docker image from Docker Hub as your build environment.
image: node:4.6.0

pipelines:
  branches:
    develop:
      - step:
          script:
            - ./run_tests.sh
    master:
      - step:
          script:
            - ./run_tests.sh
            - export VAR=$VAR # export your variables if you need to access them in your script.
            - ./deploy.sh

Let me know if that makes sense.

Illia Ananich October 29, 2016

Yes, it works. But first you need to run `git update-index --chmod=+x <file>` (which works on any system with Git installed) to mark the script as executable in the repository.
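For example, run this once locally and push (`run_tests.sh` stands in for whatever script you created):

```shell
# Git stores the executable bit in the index, so this only needs to be done once.
git update-index --chmod=+x run_tests.sh
git ls-files --stage run_tests.sh      # first column now reads 100755
git commit -m "Make run_tests.sh executable"
```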

dr75 July 23, 2018

Just to clarify: you run this only once locally and then commit the file to the git repo as an executable. No need to run this as part of the yml.

dr75 July 23, 2018

In general, I would like to point out that while it is possible to avoid some duplication by pulling commands together into scripts, the solution is not perfect. I see two main drawbacks:

1. In a more advanced scenario, there is still a certain amount of duplication violating DRY: the image, caches, and name declarations.

2. What worries me more is that all the commands in the script are now reported under one top-level node, the script name, so we lose timing information about the individual (sub)steps. For example, with two lines `test projA; test projB`, I could previously see immediately how long testing projA took. If these commands are pulled together into one script, only the total duration of the script is reported in the Pipelines UI.

I believe it would be nice to have some sort of "code sharing" built into Pipelines. It actually already has something like that with its "definitions". Could they be extended to allow defining a number of sub-steps that can be reused across branches?
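As it happens, reuse along these lines can be expressed with standard YAML anchors, which Bitbucket Pipelines accepts in the `definitions` section. A sketch (the step names are illustrative):

```yaml
definitions:
  steps:
    - step: &test-step
        name: Run tests
        script:
          - ./run_tests.sh

pipelines:
  branches:
    develop:
      - step: *test-step
    master:
      - step: *test-step
      - step:
          name: Deploy
          script:
            - ./deploy.sh
```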
