
Automating repository imports

Steve Morris
I'm New Here
January 8, 2023

Hi,

We are looking at migrating a large number (600+) of git repositories from GitLab to Bitbucket Cloud. While the Bitbucket import tool does a good job on the repos we've tried it on, it's not really practical to import that many manually. Is there an API or HTTP request that we can use to automate this process?

The REST API documentation doesn't list anything related to importing repositories, only to creating new ones.

1 answer

0 votes
Aron Gombas
Rising Star
January 9, 2023

I don't think that there is an API dedicated to imports.

But I think if you can create repos through the API, then you can easily develop a script which:

  1. iterates over the projects/repos in GitLab and, for each one:
    1. creates the corresponding project/repo in Bitbucket
    2. "git pulls" from GitLab
    3. "git pushes" to Bitbucket
    4. ..
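
Those steps can be sketched against the two REST APIs with stdlib Python. This is a minimal sketch, assuming a GitLab personal access token and a Bitbucket app password; the host, workspace, and credential values are placeholders, not anything from the thread:

```python
import base64
import json
import subprocess
import urllib.request

GITLAB = "https://gitlab.example.com"        # placeholder GitLab host
GL_TOKEN = "YOUR_GITLAB_TOKEN"               # placeholder credentials
BB_WORKSPACE = "my-workspace"
BB_USER, BB_APP_PASSWORD = "bb-user", "app-password"

def to_slug(name):
    """Bitbucket repo slugs are lowercase and contain no spaces."""
    return name.strip().lower().replace(" ", "-")

def gitlab_projects():
    """Page through GET /api/v4/projects for everything the token can see."""
    page, projects = 1, []
    while True:
        url = f"{GITLAB}/api/v4/projects?membership=true&per_page=100&page={page}"
        req = urllib.request.Request(url, headers={"PRIVATE-TOKEN": GL_TOKEN})
        with urllib.request.urlopen(req) as resp:
            batch = json.load(resp)
        if not batch:
            return projects
        projects.extend(batch)
        page += 1

def create_bitbucket_repo(slug):
    """POST /2.0/repositories/{workspace}/{repo_slug} creates the target repo."""
    auth = base64.b64encode(f"{BB_USER}:{BB_APP_PASSWORD}".encode()).decode()
    req = urllib.request.Request(
        f"https://api.bitbucket.org/2.0/repositories/{BB_WORKSPACE}/{slug}",
        data=json.dumps({"scm": "git", "is_private": True}).encode(),
        method="POST",
        headers={"Content-Type": "application/json",
                 "Authorization": f"Basic {auth}"})
    urllib.request.urlopen(req).close()

def migrate_all():
    """The loop from the steps above: create each repo, then mirror its refs."""
    for proj in gitlab_projects():
        slug = to_slug(proj["path"])
        create_bitbucket_repo(slug)
        # --mirror carries every branch and tag across in one clone/push pair
        subprocess.run(["git", "clone", "--mirror",
                        proj["http_url_to_repo"], slug], check=True)
        subprocess.run(["git", "-C", slug, "push", "--mirror",
                        f"https://bitbucket.org/{BB_WORKSPACE}/{slug}.git"],
                       check=True)
```

With 600+ repos it's worth wrapping the body of the loop in a try/except and logging failures per repo, so one bad push doesn't abort the rest of the batch.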
Steve Morris
January 10, 2023

Thanks Aron, that is the option we're looking at, but I was hoping there was an API that would make things slightly easier and reduce the number of possible failure points.

Thomas Totter January 10, 2023

@Steve Morris 

Actually, this should not be so difficult to implement with the REST API. I did something similar last year. The only difference was that in my case the basis for the newly created repositories was a local folder structure with around 100 top-level folders.

For each of those folders I created a Bitbucket repository (repo name = folder name) and set all (non-standard and non-inheritable) properties according to a JSON config file. If you want to do the same, I would recommend structuring the JSON exactly the same way as the bodies/payloads that you need later when invoking the POST/PUT API calls. Any common settings that can be inherited from the parent workspace/project(s) should be set there; then you don't need to deal with the API for those....
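
As an illustration of structuring the config like the API payloads: a hypothetical fragment for one repo, where the `branch_restrictions` entries are shaped like the body of Bitbucket's `POST /2.0/repositories/{workspace}/{repo_slug}/branch-restrictions` call (the surrounding keys are just one possible layout, not part of any API):

```json
{
  "my-repo": {
    "repository": {
      "description": "Imported from the my-repo folder",
      "is_private": true
    },
    "branch_restrictions": [
      {"kind": "push", "branch_match_kind": "glob", "pattern": "main",
       "users": [], "groups": []},
      {"kind": "force", "branch_match_kind": "glob", "pattern": "main"}
    ]
  }
}
```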

My repo settings included things like basic repo info/descriptions, branch restrictions, repository variables, user and group permissions, and for sure some more stuff that I just can't remember right now...

Well, since we also used build/deploy pipelines, I had to create deployment environments plus their variables, and additionally handle any pipeline/deployment-specific repo settings.

Then I copied a ".gitignore" and "bitbucket-pipelines.yml" file to each directory (based on templates, plus some minor replacements according to the processed folder). Next came the whole git part: I locally initialized git in each folder and added/committed/pushed the contained files to the newly created repos (after configuring the git remotes, of course...).

Since it seemed very likely that some settings would be changed, I built the script to (conditionally) create-or-update on all levels. This turned out to be a very wise decision a few weeks later ;)
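
That conditional create-or-update can be sketched as a small helper: probe with GET, then POST to create or PUT to update. The workspace URL and the injectable `opener` parameter are illustrative, not from the thread, and authentication headers are omitted for brevity:

```python
import json
import urllib.error
import urllib.request

# placeholder workspace; auth headers omitted for brevity
API = "https://api.bitbucket.org/2.0/repositories/my-workspace"

def ensure_repo(slug, settings, opener=urllib.request.urlopen):
    """Probe with GET, then create (POST) or update (PUT) the repo, so the
    whole script can be re-run safely. `opener` is injectable for testing."""
    url = f"{API}/{slug}"
    try:
        opener(urllib.request.Request(url)).close()
        method = "PUT"    # repo exists -> update its settings in place
    except urllib.error.HTTPError as err:
        if err.code != 404:
            raise
        method = "POST"   # repo missing -> create it
    req = urllib.request.Request(url, data=json.dumps(settings).encode(),
                                 method=method,
                                 headers={"Content-Type": "application/json"})
    opener(req).close()
    return method
```

Because every run converges on the same end state, re-running after a settings change (or after a half-failed run) is safe.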

In your case the process should be simpler, since you already have the git repos initialized. I have no clue about GitLab, but I'm sure you will find a way to automatically clone/pull all the repos to your local machine without too much pain.


DEPLOYMENT TYPE
CLOUD
PERMISSIONS LEVEL
Product Admin