How can I debug my Bitbucket Pipelines build locally?

I have encountered an error while running my build, and would like to debug it locally so that I can play around with environment settings and not use up build minutes trying to get the environment working.

3 answers

1 accepted

10 votes

You can debug a Bitbucket Pipelines build locally by using Docker.

First, install Docker. Setup varies depending on which OS you are using; see this page for install instructions: https://docs.docker.com/engine/installation/
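If you're on a Debian or Ubuntu machine, one possible route (an assumption on my part; check the page above for your OS) is Docker's convenience install script:

$ curl -fsSL https://get.docker.com -o get-docker.sh
$ sudo sh get-docker.sh
$ docker --version   # verify the install worked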

Have your bitbucket-pipelines.yml handy; we'll need to grab a few values and commands from it. For this answer, my bitbucket-pipelines.yml file looks as follows:

# You can use a Docker image from Docker Hub or your own container
# registry for your build environment.
image: python:2.7

pipelines:
  default:
    - step:
        script: # Modify the commands below to build your repository.
          - python --version
          - python myScript.py

Make sure you have a local copy of your Bitbucket repository. If not, clone it, for example:

$ cd /Users/myUserName/code
$ git clone git@bitbucket.org:myBBUserName/localDebugRepo.git

If you run into issues cloning, check out this resource: https://confluence.atlassian.com/display/BITBUCKET/Clone+a+repository

Now, assuming you have Docker installed, you can log into the Docker container using the following command:

$ docker run -it --volume=/Users/myUserName/code/localDebugRepo:/localDebugRepo --workdir="/localDebugRepo" --memory=2048m python:2.7 /bin/bash

Let's deconstruct this command to understand, at a very basic level, what we are doing here:

docker run -it

Run a Docker container with a TTY and with STDIN open.

--volume=/Users/myUserName/code/localDebugRepo:/localDebugRepo

Mount the directory /Users/myUserName/code/localDebugRepo inside the container as the directory /localDebugRepo.

--workdir="/localDebugRepo"

Set the directory we start in inside the container.

--memory=2048m

Optional: this runs the container with 2GB of memory. It isn't required for local debugging, but Bitbucket Pipelines restricts your container to 2GB of memory, so some issues may be related to running out of memory.

python:2.7

The Docker image we are going to run.

/bin/bash

Start a bash prompt inside the container.

After running this command, we are now inside the python:2.7 container in our repo directory. We can now run each individual command from our bitbucket-pipelines.yml:

$ python --version
Python 2.7.11
$ python myScript.py
Hello world!

We can configure things like normal inside our container, for example:

$ pip install scipy
Collecting scipy
  Downloading scipy-0.17.1-cp27-cp27mu-manylinux1_x86_64.whl (39.5MB)
    100% |████████████████████████████████| 39.5MB 34kB/s
Installing collected packages: scipy
Successfully installed scipy-0.17.1

Once you get things working as needed in your local Docker container, note the changes, add them to your bitbucket-pipelines.yml file, and push them back up to Bitbucket to build in Pipelines!
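For example, if the scipy install above turned out to be the missing piece, the updated bitbucket-pipelines.yml might look something like this (just a sketch based on the example file above; adapt it to your own build):

image: python:2.7

pipelines:
  default:
    - step:
        script:
          - pip install scipy   # the change we verified locally
          - python --version
          - python myScript.py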

If you want to try building your own Docker container, check out this answer: https://answers.atlassian.com/questions/39140980


This should definitely be in the Pipelines documentation

Just a note: since the Pipelines machines that run the Docker containers do not have swap enabled, use these parameters to simulate that locally:

--memory=2048m --memory-swap=2048m

This way, if the run fails in Pipelines due to memory limit, it will fail locally too.
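Putting that together with the command from the accepted answer, the full invocation would look something like:

$ docker run -it --volume=/Users/myUserName/code/localDebugRepo:/localDebugRepo --workdir="/localDebugRepo" --memory=2048m --memory-swap=2048m python:2.7 /bin/bash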

What if my project depends on some external files (e.g. an Android SDK directory)? How do I make the Bitbucket Pipelines Docker container recognize these paths?

If you're dependent on external files, you will need to download them once the Docker container starts running. This means in your bitbucket-pipelines.yml you will need to download the external files at the start of your build. 
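For example, a rough sketch of what that could look like (the URL, archive name, and MY_SDK_HOME variable below are placeholders for illustration, not a real SDK location):

image: python:2.7

pipelines:
  default:
    - step:
        script:
          # Placeholder URL and paths - substitute the real location of your external files.
          - curl -o /tmp/external-files.tar.gz https://example.com/external-files.tar.gz
          - tar -xzf /tmp/external-files.tar.gz -C /opt
          - export MY_SDK_HOME=/opt/external-files   # placeholder for whatever path your build expects
          - python myScript.py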

Here are some resources that may help you with this:

Alternatively, you can create a Docker image which has the external files baked in.

Does that help?

local-pipelines isn't being maintained anymore.  However, I wrote a new tool called convey that can run your bitbucket-pipelines.yml via

convey -l bitbucket

Support for services is in the works, but it's going to take a little while, as I have to restructure some things to deal with the way the network is set up between services.

This woefully falls short of debugging our `bitbucket-pipelines.yml`

 

A real solution should simulate all of the following (a rough partial approximation is sketched after this list):

- all the environment variables

- the default location of cloned files

- availability of artifacts from previous steps

- ability to run through and be prompted when manual steps are encountered.
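In the meantime, you can roughly approximate the first two points with plain docker run flags. A sketch, run from inside your local repo (the /opt/atlassian/pipelines/agent/build clone path and the example BITBUCKET_* values are assumptions based on what Pipelines typically uses; check your own build log for the real values):

$ docker run -it \
    --volume=/Users/myUserName/code/localDebugRepo:/opt/atlassian/pipelines/agent/build \
    --workdir=/opt/atlassian/pipelines/agent/build \
    --memory=2048m --memory-swap=2048m \
    -e BITBUCKET_BRANCH=master \
    -e BITBUCKET_COMMIT="$(git rev-parse HEAD)" \
    -e BITBUCKET_REPO_SLUG=localDebugRepo \
    -e BITBUCKET_BUILD_NUMBER=1 \
    python:2.7 /bin/bash

Artifacts from previous steps and prompts for manual steps still aren't covered by this, so it only gets you part of the way there.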
