Hi,
We are looking at migrating a large number (600+) of Git repositories from GitLab to Bitbucket Cloud. While the Bitbucket import tool does a good job on the repos we've tried it with, manually importing that many isn't practical. Is there an API or HTTP request we can use to automate the process?
The REST API documentation doesn't list anything related to importing repositories, only to creating new ones.
I don't think that there is an API dedicated to imports.
But if you can create repos through the API, then you can easily develop a script which creates each repository via the API, clones it from GitLab, and pushes it to the new Bitbucket remote; see the sketch below.
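A minimal sketch of that idea in Python, assuming the requests library, a Bitbucket app password for basic auth, and git credentials for bitbucket.org already configured (e.g. via a credential helper); the workspace, user, and repo names are placeholders:

```python
import subprocess
import requests

BB_API = "https://api.bitbucket.org/2.0/repositories"
WORKSPACE = "my-workspace"          # placeholder workspace slug
AUTH = ("my-user", "app-password")  # placeholder Bitbucket app password

def migrate(repo_slug: str, gitlab_url: str) -> None:
    # 1. Create an empty private repo on Bitbucket Cloud.
    resp = requests.post(
        f"{BB_API}/{WORKSPACE}/{repo_slug}",
        auth=AUTH,
        json={"scm": "git", "is_private": True},
    )
    resp.raise_for_status()

    # 2. Mirror-clone from GitLab, then push branches and tags to Bitbucket.
    #    Pushing explicit refspecs instead of --mirror avoids GitLab's
    #    internal refs (e.g. refs/merge-requests/*), which Bitbucket may reject.
    subprocess.run(["git", "clone", "--mirror", gitlab_url, repo_slug], check=True)
    bb_url = f"https://bitbucket.org/{WORKSPACE}/{repo_slug}.git"
    subprocess.run(
        ["git", "push", bb_url,
         "refs/heads/*:refs/heads/*", "refs/tags/*:refs/tags/*"],
        cwd=repo_slug, check=True,
    )

migrate("example-repo", "https://gitlab.com/my-group/example-repo.git")
```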
Thanks Aron, that is the option we're looking at, but I was hoping there was an API which would make things slightly easier and reduce the number of possible failure points.
Actually, this should not be so difficult to implement with the REST API. I did something similar last year; the only difference was that in my case the basis for the newly created repositories was a local folder structure with around 100 top-level folders.
For each of those folders I created a Bitbucket repository (repo name = folder name) and set all non-standard, non-inheritable properties according to a JSON config file. If you want to do the same thing, I would recommend structuring the JSON exactly the same way as the bodies/payloads that you need later when invoking the POST/PUT API calls. Any common settings that can be inherited from the parent workspace/project should be set there, so you don't need to deal with the API for those; see the sketch below.
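A minimal sketch of that config idea, assuming Python and requests; the endpoint paths are standard Bitbucket Cloud 2.0 ones, but the config contents and all names are hypothetical:

```python
import requests

AUTH = ("my-user", "app-password")  # placeholder app password
BASE = "https://api.bitbucket.org/2.0/repositories/my-workspace"

# Config entries mirror the API payloads one-to-one, so applying them
# is just a matter of sending each body to its matching endpoint.
REPO_CONFIG = {
    "repository": {"scm": "git", "is_private": True, "project": {"key": "PROJ"}},
    "branch-restrictions": [
        {"kind": "push", "branch_match_kind": "glob", "pattern": "main"},
    ],
}

def apply_config(repo_slug: str, cfg: dict) -> None:
    # Create the repository itself from the payload-shaped config.
    requests.post(f"{BASE}/{repo_slug}", auth=AUTH,
                  json=cfg["repository"]).raise_for_status()
    # Each remaining section posts straight through to its endpoint.
    for restriction in cfg["branch-restrictions"]:
        requests.post(f"{BASE}/{repo_slug}/branch-restrictions",
                      auth=AUTH, json=restriction).raise_for_status()

apply_config("example-repo", REPO_CONFIG)
```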
My repo settings included things like basic repo info/descriptions, branch restrictions, repository variables, user and group permissions, and for sure some more stuff that I just can't remember right now; a couple of those calls are sketched below.
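For reference, a hedged sketch of two of those settings calls: pipelines_config/variables is the usual endpoint for repository variables, and the permissions-config endpoints are, as far as I know, the current way to set explicit user/group permissions; verify against the docs, and treat all names as placeholders:

```python
import requests

AUTH = ("my-user", "app-password")  # placeholder
REPO = "https://api.bitbucket.org/2.0/repositories/my-workspace/example-repo"

# Repository (pipeline) variable.
requests.post(
    f"{REPO}/pipelines_config/variables/",
    auth=AUTH,
    json={"key": "DEPLOY_REGION", "value": "eu-west-1", "secured": False},
).raise_for_status()

# Explicit group permission on the repository.
requests.put(
    f"{REPO}/permissions-config/groups/developers",  # placeholder group slug
    auth=AUTH,
    json={"permission": "write"},
).raise_for_status()
```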
Well, since we also used build/deploy pipelines, I additionally had to create deployment environments plus their variables and handle any pipeline/deployment-specific repo settings (roughly as in the sketch below).
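A rough sketch of that part, with the caveat that I'm less sure of the exact environments payload shape, so treat the body below as an assumption to check against the current API docs:

```python
import requests

AUTH = ("my-user", "app-password")  # placeholder
REPO = "https://api.bitbucket.org/2.0/repositories/my-workspace/example-repo"

# Make sure Pipelines is enabled before touching deployment settings.
requests.put(f"{REPO}/pipelines_config", auth=AUTH,
             json={"enabled": True}).raise_for_status()

# Create a deployment environment (payload shape is a best guess).
resp = requests.post(
    f"{REPO}/environments/",
    auth=AUTH,
    json={"name": "Staging", "environment_type": {"name": "Staging"}},
)
resp.raise_for_status()
env_uuid = resp.json()["uuid"]

# Attach a deployment variable to that environment.
requests.post(
    f"{REPO}/deployments_config/environments/{env_uuid}/variables",
    auth=AUTH,
    json={"key": "API_URL", "value": "https://staging.example.com",
          "secured": False},
).raise_for_status()
```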
Then I copied a ".gitignore" and "bitbucket-pipelines.yml" file into each directory (based on templates, plus some minor replacements depending on the folder being processed). The next thing was the whole git part: I initialized git locally in each folder and added/committed/pushed the contained files to the newly created repos (after configuring the git remotes, of course); see the sketch below.
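That git part could look roughly like this (a sketch, assuming git 2.28+ for init -b and git credentials for bitbucket.org already configured; paths and names are placeholders):

```python
import shutil
import subprocess
from pathlib import Path

TEMPLATES = Path("templates")  # holds .gitignore and bitbucket-pipelines.yml

def init_and_push(folder: Path, remote_url: str) -> None:
    # Copy the template files into the folder.
    shutil.copy(TEMPLATES / ".gitignore", folder / ".gitignore")
    shutil.copy(TEMPLATES / "bitbucket-pipelines.yml",
                folder / "bitbucket-pipelines.yml")

    # Initialize git, commit everything, and push to the new repo.
    def run(*args: str) -> None:
        subprocess.run(args, cwd=folder, check=True)

    run("git", "init", "-b", "main")
    run("git", "add", "-A")
    run("git", "commit", "-m", "Initial import")
    run("git", "remote", "add", "origin", remote_url)
    run("git", "push", "-u", "origin", "main")

init_and_push(Path("repos/example-repo"),
              "https://bitbucket.org/my-workspace/example-repo.git")
```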
Since it seemed very likely that some settings would be changed later, I built the script to conditionally create-or-update on all levels (roughly the pattern sketched below). This turned out to be a very wise decision a few weeks later ;)
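The create-or-update pattern is simple enough to sketch, assuming the repositories endpoint where a GET probe tells you whether to POST (create) or PUT (update); names are placeholders:

```python
import requests

AUTH = ("my-user", "app-password")  # placeholder
BASE = "https://api.bitbucket.org/2.0/repositories/my-workspace"

def create_or_update_repo(repo_slug: str, payload: dict) -> None:
    url = f"{BASE}/{repo_slug}"
    # Probe first: a 404 means the repo doesn't exist yet.
    probe = requests.get(url, auth=AUTH)
    if probe.status_code == 404:
        requests.post(url, auth=AUTH, json=payload).raise_for_status()
    else:
        probe.raise_for_status()
        requests.put(url, auth=AUTH, json=payload).raise_for_status()

create_or_update_repo("example-repo", {"scm": "git", "is_private": True})
```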
In your case the process should be simpler, since you already have the git repos initialized. I have no clue about GitLab, but I'm sure you will find a way to automatically clone/pull all repos to your local machine without too much pain; something like the sketch below should get you close.
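For completeness, a hedged sketch of that GitLab side, using GitLab's documented group projects API with a personal access token; the group path and token are placeholders:

```python
import subprocess
import requests

GITLAB = "https://gitlab.com/api/v4"
HEADERS = {"PRIVATE-TOKEN": "glpat-..."}  # placeholder personal access token

def clone_all(group_id: str) -> None:
    page = 1
    while True:
        # List the group's projects, including subgroups, 100 per page.
        resp = requests.get(
            f"{GITLAB}/groups/{group_id}/projects",
            headers=HEADERS,
            params={"include_subgroups": True, "per_page": 100, "page": page},
        )
        resp.raise_for_status()
        projects = resp.json()
        if not projects:
            break
        for project in projects:
            # Mirror-clone so all branches and tags come along.
            subprocess.run(
                ["git", "clone", "--mirror",
                 project["http_url_to_repo"], project["path"] + ".git"],
                check=True,
            )
        page += 1

clone_all("my-group")  # group ID or URL-encoded path
```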