

Automating repository imports


We are looking at migrating a large number (600+) of git repositories from GitLab to Bitbucket Cloud. While the Bitbucket import tool does a good job on the repos we’ve tried it on, it’s not really practical to import that many manually. Is there an API or HTTP request that we can use to automate this process?

The REST API documentation doesn’t list anything related to importing repositories; only endpoints for creating new ones.

1 answer

Aron Gombas
Rising Star
Jan 09, 2023

I don't think that there is an API dedicated to imports.

But, I think if you can create repos through the API, then you can easily develop a script which:

  1. iterates over the projects/repos in GitLab
  2. creates the corresponding project/repo in Bitbucket
  3. "git pulls" from GitLab
  4. "git pushes" to Bitbucket

Thanks Aron, that is the option we're looking at, but I was hoping there was an API that would make things slightly easier and reduce the number of possible failure points.

@Steve Morris 

Actually, this should not be so difficult to implement with the REST API. I did something similar last year. The only difference is that in my case the basis for the newly created repositories was a local folder structure with around 100 top-level folders.

For each of those folders I created a Bitbucket repository (repo-name = folder-name) and set all (non-standard & non-inheritable) properties according to a JSON config file. If you want to do the same thing, I would recommend structuring the JSON exactly the same way as the bodies/payloads you need later when invoking the POST/PUT API calls. Any common settings that can be inherited from the parent workspace/project(s) should be set there; then you don't need to deal with the API for those...
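A small sketch of that config idea: each repo entry mirrors the JSON bodies of the API calls it will feed, so the script can pass sections through unchanged. The repo name, field values, and the branch-restriction shape shown here are illustrative examples only:

```python
# Config sketch: each section is a ready-to-send API payload (example values).
import json

CONFIG = {
    "my-service": {
        # body for the repository create/update call
        "repository": {
            "description": "Backend service",
            "is_private": True,
        },
        # bodies for per-repo branch-restriction calls (shape assumed)
        "branch_restrictions": [
            {"kind": "push", "branch_match_kind": "glob", "pattern": "main"},
        ],
    },
}


def payload_for(repo: str, section: str):
    """Look up the body for one API call; no reshaping needed."""
    return CONFIG[repo][section]


print(json.dumps(payload_for("my-service", "repository")))
```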

My repo settings included things like basic repo info/description, branch restrictions, repository variables, user and group permissions, and surely some more things I just can't remember right now...

Well, since we also used build/deploy pipelines, I had to create deployment environments plus their variables, and additionally handle any pipeline/deployment-specific repo settings.

Then I copied a ".gitignore" and "bitbucket-pipelines.yml" file to each directory (based on templates, plus some minor replacements according to the processed folder). Next came the whole git part: I initialized git locally in each folder and added/committed/pushed the contained files to the newly created repos (after configuring the git remotes, of course...).
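The per-folder git steps could look something like this (a sketch only; the branch name and the SSH remote-URL pattern are assumptions, and the usage loop is commented out):

```python
# Per-folder git bootstrap: init, add, commit, set remote, push.
import subprocess
from pathlib import Path


def remote_url(workspace: str, name: str) -> str:
    """SSH push URL for a Bitbucket Cloud repo (pattern assumed)."""
    return f"git@bitbucket.org:{workspace}/{name}.git"


def push_folder(folder: Path, url: str) -> None:
    """Turn one local folder into a pushed git repo."""
    def git(*args: str) -> None:
        subprocess.run(["git", "-C", str(folder), *args], check=True)

    git("init", "-b", "main")
    git("add", ".")
    git("commit", "-m", "Initial import")
    git("remote", "add", "origin", url)
    git("push", "-u", "origin", "main")

# for folder in Path("repos").iterdir():
#     push_folder(folder, remote_url("my-workspace", folder.name))
```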

Since it seemed very likely that some settings would change later, I built the script to (conditionally) create-or-update at all levels. This turned out to be a very wise decision a few weeks later ;)
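One way to sketch that create-or-update pattern: probe the resource with a GET, then create (POST) on 404 or update (PUT) otherwise. The URL and headers are whatever the concrete endpoint needs; this is a pattern sketch, not a specific Bitbucket call:

```python
# Generic create-or-update ("upsert") against a REST resource.
import json
import urllib.error
import urllib.request


def choose_method(exists: bool) -> str:
    """POST creates a missing resource; PUT updates an existing one."""
    return "PUT" if exists else "POST"


def upsert(url: str, body: dict, headers: dict) -> str:
    """Probe with GET, then create or update; returns the method used."""
    try:
        urllib.request.urlopen(urllib.request.Request(url, headers=headers))
        exists = True
    except urllib.error.HTTPError as err:
        if err.code != 404:
            raise
        exists = False
    method = choose_method(exists)
    req = urllib.request.Request(url, data=json.dumps(body).encode(),
                                 method=method, headers=headers)
    urllib.request.urlopen(req)
    return method
```

Running the script repeatedly then converges settings instead of failing on already-existing resources, which is exactly what makes later config changes cheap.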

In your case the process should be simpler, since you already have the git repos initialized. I have no clue about GitLab, but I'm sure you will find a way to automatically clone/pull all repos to your local machine without too much pain.

