
Run Bitbucket Pipelines locally for testing

I understand that the official suggestion is to run the same Docker image in my local environment and execute there all the scripts from my bitbucket-pipelines.yml.

But I would like to actually use my bitbucket-pipelines.yml and test it "as is" before committing it to the repo.

This would give me the ability to tweak it as much as possible before committing, saving me a lot of debugging commits.

Does anyone have a way to do that?
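For context, a minimal bitbucket-pipelines.yml is the kind of file in question; the image, step name, and script lines below are illustrative, not taken from the thread:

```yaml
# Illustrative minimal pipeline definition; the question is about
# validating a file like this locally before pushing it.
image: atlassian/default-image:4

pipelines:
  default:
    - step:
        name: Build and test
        script:
          - echo "Running build..."
          - ./run-tests.sh
```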

1 answer

Erez Maadani
Rising Star
May 16, 2023 • edited

Hi @TimorKalerman 

I haven't tried it, but the general direction would be to set up a self-hosted runner, configure it, and then trigger builds locally.

For setting up the self-hosted runner, have a look here:

Take a special look at section #9, which covers the pre-configured scripts that:

"automatically downloads and starts the necessary software to run build steps on your host"

That should explain how the triggering commands reach your runner and how they are converted into commands that run your build steps.
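For reference, the pre-configured runner start command that Bitbucket generates is roughly of this shape; every UUID, secret, and flag below is a placeholder sketch, not the exact command (the real values come from the runner setup dialog):

```shell
#!/bin/sh
# Rough sketch of the Docker-based runner start command; all IDs and
# secrets below are placeholders to be replaced with generated values.
ACCOUNT_UUID="{00000000-0000-0000-0000-000000000000}"
RUNNER_UUID="{11111111-1111-1111-1111-111111111111}"
OAUTH_CLIENT_ID="placeholder-client-id"
OAUTH_CLIENT_SECRET="placeholder-client-secret"

# Printed as a dry run so the values can be inspected first;
# execute the printed command once they are filled in.
echo docker container run -it \
  -e ACCOUNT_UUID="$ACCOUNT_UUID" \
  -e RUNNER_UUID="$RUNNER_UUID" \
  -e OAUTH_CLIENT_ID="$OAUTH_CLIENT_ID" \
  -e OAUTH_CLIENT_SECRET="$OAUTH_CLIENT_SECRET" \
  -e WORKING_DIRECTORY=/tmp \
  -v /tmp:/tmp \
  -v /var/run/docker.sock:/var/run/docker.sock \
  --name runner-test \
  docker-public.packages.atlassian.com/sox/atlassian/bitbucket-pipelines-runner:1
```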

The last part is to understand what is included in the triggering command, so I guess you would have to do some commits to understand what info is actually sent to your runner.

Hope that helps
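Once a runner is registered, steps are routed to it with `runs-on` labels in bitbucket-pipelines.yml. A sketch (`self.hosted` and `linux` are the default labels Bitbucket assigns; the step content is illustrative):

```yaml
pipelines:
  default:
    - step:
        name: Build on my self-hosted runner
        runs-on:
          - self.hosted
          - linux
        script:
          - echo "Running on the local runner"
```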

Thank you @Erez Maadani, I will test it, but it will still require me to make a commit for each fix instead of testing locally on my PC before I commit.

Anyway, I guess there is currently no way to do it, but hopefully Atlassian will release such a tool.

I know Google Cloud Build, for example, has such a tool for local debugging, and it is very helpful.

Theodora Boudale
Atlassian Team
May 17, 2023

Hi Timor,

If you are referring to debugging pipelines locally as per this guide, then there is no way to use the bitbucket-pipelines.yml file "as is". With that approach, you run inside the container the commands that you intend to use in your bitbucket-pipelines.yml file.
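The approach in that guide amounts to roughly the following: start the same image your pipeline step uses with your working copy mounted, then run your script lines by hand inside the shell. A sketch (the image name and paths are placeholders, not from the thread):

```shell
#!/bin/sh
# Sketch of local pipeline debugging with Docker: mount the repo into
# the pipeline's image and get an interactive shell inside it.
IMAGE="atlassian/default-image:4"   # placeholder: use your step's image
REPO_DIR="$PWD"                     # placeholder: run from your repo root

# Printed as a dry run so the values can be checked first;
# execute the printed command once they look right.
echo docker run -it --rm \
  --volume="$REPO_DIR:/localDebugRepo" \
  --workdir=/localDebugRepo \
  --memory=4g \
  "$IMAGE" /bin/bash
```

Once inside the container, paste the `script` lines from your bitbucket-pipelines.yml one at a time to see where they fail.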

You can always create a fork of the repo for testing purposes and use it with a Linux Docker runner as Erez suggested (so that you don't consume build minutes), and eventually delete the fork. Please keep in mind that if you use any user-defined variables or deployment environments in the fork's builds, you will need to create them also in the repo where you want to run the builds eventually.

Kind regards,
