I have encountered an error while running my build, and would like to debug it locally so that I can play around with environment settings and not use up build minutes trying to get the environment working.
You can debug a Bitbucket Pipelines build locally by using Docker.
First, install Docker. Setup varies depending on which OS you are using; see this page for install instructions: https://docs.docker.com/engine/installation/
Have your bitbucket-pipelines.yml handy; we'll need to grab a few values and commands from it. For this answer, my bitbucket-pipelines.yml file looks as follows:
```yaml
# You can use a Docker image from Docker Hub or your own container
# registry for your build environment.
image: python:2.7

pipelines:
  default:
    - step:
        script:
          # Modify the commands below to build your repository.
          - python --version
          - python myScript.py
```
Make sure you have a local copy of your Bitbucket repository. If not, clone it, for example:
```
$ cd /Users/myUserName/code
$ git clone git@bitbucket.org:myBBUserName/localDebugRepo.git
```
If you run into issues cloning, check out this resource: https://confluence.atlassian.com/display/BITBUCKET/Clone+a+repository
Now, assuming you have Docker installed, you can log into the Docker container using the following command:
```
$ docker run -it \
    --volume=/Users/myUserName/code/localDebugRepo:/localDebugRepo \
    --workdir="/localDebugRepo" \
    --memory=2048m \
    python:2.7 /bin/bash
```
Let's deconstruct this command to understand, at a very basic level, what each part does:
- `docker run -it`: run a Docker container with a TTY and with STDIN open.
- `--volume=/Users/myUserName/code/localDebugRepo:/localDebugRepo`: mount the local directory `/Users/myUserName/code/localDebugRepo` inside the container as `/localDebugRepo`.
- `--workdir="/localDebugRepo"`: set the directory the shell starts in.
- `--memory=2048m`: optional. Runs the container with 2 GB of memory. Not required for local debugging, but Bitbucket Pipelines restricts your container to 2 GB of memory, so some issues may be related to running out of memory.
- `python:2.7`: the Docker image we are going to run.
- `/bin/bash`: starts a bash prompt.
After running this command, we are inside the python:2.7 container in our repo directory, and we can run each individual command of our bitbucket-pipelines.yml:
```
$ python --version
Python 2.7.11
$ python myScript.py
Hello world!
```
We can configure things like normal inside our container, for example:
```
$ pip install scipy
Collecting scipy
  Downloading scipy-0.17.1-cp27-cp27mu-manylinux1_x86_64.whl (39.5MB)
    100% |████████████████████████████████| 39.5MB 34kB/s
Installing collected packages: scipy
Successfully installed scipy-0.17.1
```
Once you get things working as needed in your local Docker container, note the changes, add them to your bitbucket-pipelines.yml file, and push them back up to Bitbucket to build in Pipelines!
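For example, after the scipy experiment above, the updated bitbucket-pipelines.yml might look like the following sketch (the added pip install line is the only change from the original file):

```yaml
image: python:2.7

pipelines:
  default:
    - step:
        script:
          - python --version
          # Dependency discovered while debugging in the local container.
          - pip install scipy
          - python myScript.py
```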
If you want to try building your own Docker container, check out this answer: https://answers.atlassian.com/questions/39140980
This should definitely be in the Pipelines documentation
Just a note: since the Pipelines machines that run the Docker containers do not have swap enabled, use these parameters to simulate that locally:
```
--memory=2048m --memory-swap=2048m
```
This way, if the run fails in Pipelines due to the memory limit, it will fail locally too.
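Putting the pieces together, a full local invocation that mirrors the Pipelines memory behaviour might look like this sketch (the repository path is the hypothetical one from the answer above; setting `--memory-swap` equal to `--memory` means no swap beyond the RAM limit):

```shell
#!/bin/sh
# Hypothetical local checkout path from the example above.
REPO=/Users/myUserName/code/localDebugRepo

# --memory-swap equal to --memory disables extra swap inside the
# container, matching the Pipelines build environment.
CMD="docker run -it --rm \
  --volume=${REPO}:/localDebugRepo \
  --workdir=/localDebugRepo \
  --memory=2048m --memory-swap=2048m \
  python:2.7 /bin/bash"

# Print the command for review; run it with: eval "$CMD"
echo "$CMD"
```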
What if my project is dependent on some external files (e.g. the Android SDK directory)? How do I make the Docker container recognize these paths?
If you're dependent on external files, you will need to download them once the Docker container starts running. This means your bitbucket-pipelines.yml will need to download the external files at the start of your build.
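As a rough sketch, the download could be the first script step (the URL and target directory here are placeholders, not real values):

```yaml
image: python:2.7

pipelines:
  default:
    - step:
        script:
          # Placeholder URL -- substitute the real location of your files.
          - curl -sSL -o external.zip https://example.com/external-files.zip
          - unzip external.zip -d /opt/external
          - python myScript.py
```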
Here are some resources that may help you with this:
Alternatively, you can create a Docker image with the external files baked in.
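A minimal Dockerfile sketch for that approach (the `external-files/` source directory is hypothetical; you would build this image and reference it via `image:` in your bitbucket-pipelines.yml):

```dockerfile
# Start from the same image the pipeline uses.
FROM python:2.7

# Bake the external files into the image at build time.
# "external-files/" is a hypothetical directory next to this Dockerfile.
COPY external-files/ /opt/external/
```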
Does that help?
What if I have a dependency on a database and I want to debug that?
This woefully falls short of debugging our `bitbucket-pipelines.yml`
A real solution should simulate:
- all the environment variables
- the default location of cloned files
- availability of artifacts from previous steps
- ability to run through and be prompted when manual steps are encountered.
This answer is missing some elements:
1. How to handle the "docker" abilities of Bitbucket (e.g. you can invoke docker from node:10 while in Bitbucket, but not locally)
2. Memory definitions per dependency (definitions section at the end)
well needed!
In case you have a special entrypoint defined in your Dockerfile (e.g. a startup script), you might also need to override the entrypoint, or you may not be able to run the bash command. Furthermore, `--rm` comes in handy (it removes the container after you run it) in case you do not want to reuse the created container afterwards:
```
$ docker run -it --rm --entrypoint "" ...
```
You can also try using the local-pipelines tool:
local-pipelines isn't being maintained anymore. However, I wrote a new tool called convey that can run your bitbucket-pipelines.yml via:

```
$ convey -l bitbucket
```

Support for services is in the works, but it is going to take a little while, as I have to restructure some things to deal with the way the network is set up between services.