How can I debug my Bitbucket Pipelines build locally?

I have encountered an error while running my build, and would like to debug it locally so that I can play around with environment settings and not use up build minutes trying to get the environment working.

2 answers

1 accepted

10 votes

You can debug a Bitbucket Pipelines build locally by using Docker.

First, install Docker. Setup varies depending on which OS you are using; see the Docker installation documentation for instructions.

Have your bitbucket-pipelines.yml handy; we'll need to grab a few values and commands from it. For this answer, my bitbucket-pipelines.yml file looks as follows:

# You can use a Docker image from Docker Hub or your own container
# registry for your build environment.
image: python:2.7

pipelines:
  default:
    - step:
        script: # Modify the commands below to build your repository.
          - python --version
          - python
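The script: commands in this file are what you'll replay inside the container. If you want to pull them out automatically, here's a rough sketch, assuming a single step laid out like the file above (the sample file is written inline just to make the sketch self-contained):

```shell
# Write a sample bitbucket-pipelines.yml so this sketch is self-contained;
# in practice you'd run the extraction against your real file.
cat > bitbucket-pipelines.yml <<'EOF'
image: python:2.7

pipelines:
  default:
    - step:
        script:
          - python --version
          - python
EOF

# Naive extraction: print everything from "script:" onward, keep the
# "- command" lines, and strip the list markers.
sed -n '/script:/,$p' bitbucket-pipelines.yml | grep '^ *- ' | sed 's/^ *- //'
```

Each printed line is a command you can paste into the container's bash prompt one at a time.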

Make sure you have a local copy of your Bitbucket repository. If not, clone it, for example:

$ cd /Users/myUserName/code
$ git clone

If you run into issues cloning, check the Bitbucket documentation on cloning a repository.

Now, assuming you have Docker installed, you can log into the Docker container using the following command:

$ docker run -it --volume=/Users/myUserName/code/localDebugRepo:/localDebugRepo --workdir="/localDebugRepo" --memory=2048m python:2.7 /bin/bash

Let's deconstruct this command to understand, at a basic level, what each part does:

docker run -it
Runs a Docker container with a TTY and with STDIN open.

--volume=/Users/myUserName/code/localDebugRepo:/localDebugRepo
Mounts the directory /Users/myUserName/code/localDebugRepo inside the container as the directory /localDebugRepo.

--workdir="/localDebugRepo"
Sets the directory we start in.

--memory=2048m
Optional: runs the container with 2GB of memory. This isn't required for local debugging, but Bitbucket Pipelines restricts your container to 2GB of memory, so some issues may be caused by running out of memory.

python:2.7
The Docker image we are going to run.

/bin/bash
Starts a bash prompt.

After running:

$ docker run -it --volume=/Users/myUserName/code/localDebugRepo:/localDebugRepo --workdir="/localDebugRepo" --memory=2048m python:2.7 /bin/bash

We are now inside the python:2.7 container, in our repository directory. We can now run each command of our bitbucket-pipelines.yml individually.

$ python --version
Python 2.7.11
$ python
Hello world!

We can configure things like normal inside our container, for example:

$ pip install scipy
Collecting scipy
  Downloading scipy-0.17.1-cp27-cp27mu-manylinux1_x86_64.whl (39.5MB)
    100% |████████████████████████████████| 39.5MB 34kB/s
Installing collected packages: scipy
Successfully installed scipy-0.17.1
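If you end up installing packages by hand like this, it's worth capturing them before you exit the container, so the Pipelines build can reproduce the environment. One way to do that, assuming pip is available as it is in the python:2.7 image:

```shell
# Record the exact versions of everything installed in the container.
pip freeze > requirements.txt
```

Commit requirements.txt to your repository, then add a line such as `- pip install -r requirements.txt` to the script: section of your bitbucket-pipelines.yml so the Pipelines build installs the same versions.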

Once you get things working as needed in your local Docker container, note the changes, add them to your bitbucket-pipelines.yml file, and push them back up to Bitbucket to build in Pipelines!

If you want to try building your own Docker container for your build environment, check out this answer.

This should definitely be in the Pipelines documentation

Just a note: since the Pipelines machines that run the Docker containers do not have swap enabled, use these parameters to simulate that locally:

--memory=2048m --memory-swap=2048m

This way, if the run fails in Pipelines due to memory limit, it will fail locally too.
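Putting those flags together with the earlier command, a local run that mirrors the Pipelines memory limits would look like this (the repository path is the same placeholder as above; this requires a running Docker daemon):

```shell
docker run -it \
  --volume=/Users/myUserName/code/localDebugRepo:/localDebugRepo \
  --workdir="/localDebugRepo" \
  --memory=2048m --memory-swap=2048m \
  python:2.7 /bin/bash
```

With --memory and --memory-swap set to the same value, the container gets no swap at all, matching the Pipelines environment.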

What if my project depends on some external files (e.g. an Android SDK directory)? How do I make the Docker container recognize these paths?

If you're dependent on external files, you will need to download them once the Docker container starts running. This means your bitbucket-pipelines.yml will need to download the external files at the start of your build.

Here are some resources that may be able to help you with this:

Alternatively, you can create a Docker image with the external files baked in.
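A minimal sketch of that approach; the image name and SDK path below are hypothetical placeholders, not anything from your project:

```dockerfile
# Hypothetical Dockerfile: start from the same base image and bake the
# external files into the image so every build starts with them in place.
FROM python:2.7
COPY my-sdk /opt/my-sdk
ENV SDK_HOME=/opt/my-sdk
```

Build it with `docker build -t myaccount/custom-build-image .`, push it to a registry, and reference it from the image: line of your bitbucket-pipelines.yml.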

Does that help?

local-pipelines isn't being maintained anymore. However, I wrote a new tool called convey that can run your bitbucket-pipelines.yml via:

convey -l bitbucket

Support for services is in the works, but it will take a little while, as I have to restructure some things to deal with the way the network is set up between services.
