How can I debug my Bitbucket Pipelines build locally?

I have encountered an error while running my build, and would like to debug it locally so that I can play around with environment settings and not use up build minutes trying to get the environment working.

6 answers

1 accepted

14 votes
Answer accepted

You can debug a Bitbucket Pipelines build locally by using Docker.

First, install Docker. Setup varies depending on which OS you are using; see this page for install instructions:

Have your bitbucket-pipelines.yml handy; we'll need to grab a few values and commands from it. For this answer, my bitbucket-pipelines.yml file looks as follows:

# You can use a Docker image from Docker Hub or your own container
# registry for your build environment.
image: python:2.7

pipelines:
  default:
    - step:
        script: # Modify the commands below to build your repository.
          - python --version
          - python

Make sure you have a local copy of your Bitbucket repository. If not, clone it, for example:

$ cd /Users/myUserName/code
$ git clone

If you run into issues cloning, check out this resource:

Now, assuming you have Docker installed, you can log into the Docker container using the following command:

$ docker run -it --volume=/Users/myUserName/code/localDebugRepo:/localDebugRepo --workdir="/localDebugRepo" --memory=2048m python:2.7 /bin/bash

Let's deconstruct this command to understand, at a very basic level, what you are doing here:

docker run -it
Run a Docker container with a TTY and with STDIN open.

--volume=/Users/myUserName/code/localDebugRepo:/localDebugRepo
Mount the directory /Users/myUserName/code/localDebugRepo inside the container as the directory /localDebugRepo.

--workdir="/localDebugRepo"
Set the directory we start in.

--memory=2048m
Optional: this runs the container with 2GB of memory. Not required for local debugging, but Bitbucket Pipelines restricts your container to 2GB of memory, so some issues may be related to running out of memory.

python:2.7
The Docker image we are going to run.

/bin/bash
Starts a bash prompt.

After running:

$ docker run -it --volume=/Users/myUserName/code/localDebugRepo:/localDebugRepo --workdir="/localDebugRepo" --memory=2048m python:2.7 /bin/bash

We are now inside the python:2.7 container, in our repo directory. We can now run each individual command from our bitbucket-pipelines.yml.

$ python --version
Python 2.7.11
$ python
Hello world!

We can configure things like normal inside our container, for example:

$ pip install scipy
Collecting scipy
  Downloading scipy-0.17.1-cp27-cp27mu-manylinux1_x86_64.whl (39.5MB)
    100% |████████████████████████████████| 39.5MB 34kB/s
Installing collected packages: scipy
Successfully installed scipy-0.17.1

Once you get things working as needed in your local Docker container, note the changes, add them to your bitbucket-pipelines.yml file, and push them back up to Bitbucket to build in Pipelines!

If you want to try building your own Docker container, check out this answer:

This should definitely be in the Pipelines documentation


Just a note: since the Pipelines machines that run the Docker containers do not have swap enabled, use these parameters to simulate that locally:

--memory=2048m --memory-swap=2048m

This way, if the run fails in Pipelines due to memory limit, it will fail locally too.
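Putting those flags together with the earlier command, a full invocation might look like this (a sketch; adjust the host path for your machine):

```shell
# Setting --memory-swap equal to --memory gives the container no swap,
# matching the Bitbucket Pipelines environment.
docker run -it \
  --volume=/Users/myUserName/code/localDebugRepo:/localDebugRepo \
  --workdir="/localDebugRepo" \
  --memory=2048m \
  --memory-swap=2048m \
  python:2.7 /bin/bash
```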

What if my project is dependent on some external files (e.g. an Android SDK directory)? How do I make the Docker container recognize these paths?

If you're dependent on external files, you will need to download them once the Docker container starts running. This means in your bitbucket-pipelines.yml you will need to download the external files at the start of your build. 
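For example (a hypothetical sketch; the URL and paths are placeholders for your real dependency), the first commands of your step's script might fetch and unpack the files:

```shell
# Hypothetical: download external files (e.g. Android SDK tools) at the
# start of the build. Replace the URL and directory with your real source.
curl -sLo /tmp/tools.zip "https://example.com/android-tools.zip"
unzip -q /tmp/tools.zip -d /opt/android-sdk
export ANDROID_HOME=/opt/android-sdk
# ...the rest of your build commands follow
```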

Here are some resources that may help you with this:

Alternatively, you can create a Docker image which has the external files baked in.
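A minimal sketch of that approach (the base image, URL, and image name are all placeholders, not the asker's real setup):

```shell
# Hypothetical: bake external files into a custom build image.
# ADD with a URL downloads at image-build time, not at pipeline run time.
cat > Dockerfile <<'EOF'
FROM python:2.7
ADD https://example.com/external-files.tar.gz /opt/external/
EOF
docker build -t myregistry/my-pipelines-image:latest .
docker push myregistry/my-pipelines-image:latest
```

You would then reference the pushed image in the `image:` key of your bitbucket-pipelines.yml.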

Does that help?

What if I have a dependency on the database and I want to debug that thing?

This woefully falls short of debugging our `bitbucket-pipelines.yml`


A real solution should simulate:

- all the environment variables

- the default location of cloned files

- availability of artifacts from previous steps

- ability to run through and be prompted when manual steps are encountered.

This answer is missing a couple of elements:

1. How to handle the "docker" abilities of Bitbucket (i.e. you can invoke Docker from node:10 in Bitbucket, but not locally)
2. Memory definitions per dependency (definitions section at the end)
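On the first point, one common workaround for approximating Pipelines' Docker service locally is to mount the host's Docker socket into the container (an assumption about your setup, and note it behaves differently from Pipelines' real docker-in-docker service):

```shell
# Docker commands run inside the container will talk to the host's daemon
# via the mounted socket. Adjust the repo path for your machine.
docker run -it \
  --volume=/var/run/docker.sock:/var/run/docker.sock \
  --volume=/Users/myUserName/code/localDebugRepo:/localDebugRepo \
  --workdir="/localDebugRepo" \
  node:10 /bin/bash
```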

In case you have a special entrypoint defined in your Dockerfile (e.g. a startup script), you might also need to override the entrypoint, or you might not be able to run the bash command. Furthermore, the --rm flag comes in handy (it removes the container after you run it) in case you do not want to reuse the created container afterwards:

docker run -it --rm --entrypoint "" .......

local-pipelines isn't being maintained anymore. However, I wrote a new tool called convey that can run your bitbucket-pipelines.yml via

convey -l bitbucket

Support for services is in the works, but it is going to take a little while, as I have to restructure some things to deal with the way the network is set up between services.
