I am looking for a standardized way, or something close to one (maybe using Bitbucket's API), to make periodic backups of our Git repos.
What are sane alternatives for this?
Assume that we already have a machine and storage in a safe location that can poll for changes and clone repos.
We do this (and I know a few others do the same). We developed a dead-simple script that "git pulls" all our repositories from Bitbucket Cloud and then "git pushes" them to GitHub. We put this in a regular Linux cron job and run it every 5 minutes.
Basically, we have a perfect copy of all repos with a maximum latency of 5 minutes. It works flawlessly. Bonus: if Bitbucket Cloud were down, we could continue collaborating on GitHub.
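To make the idea concrete, here is a minimal sketch of that kind of mirror loop in Python (the actual script is just shell, so treat the paths and remote names here as illustrative). It assumes each repository already exists as a bare mirror clone under a backup directory, with `origin` pointing at Bitbucket Cloud and a second remote named `github` pointing at the GitHub copy, and it uses `git fetch` plus `git push --mirror` rather than a literal pull/push, which keeps an exact replica of all refs:

```python
#!/usr/bin/env python3
"""Sketch: sync every bare mirror clone from Bitbucket Cloud to GitHub.
Paths and remote names ("origin", "github") are assumptions, not fixed."""

import subprocess
from pathlib import Path

MIRROR_ROOT = Path("/srv/git-backup")  # hypothetical directory of bare mirror clones


def sync_mirror(repo_dir: Path) -> None:
    # Fetch all refs (pruning deleted ones) from the Bitbucket remote...
    subprocess.run(["git", "fetch", "--prune", "origin"], cwd=repo_dir, check=True)
    # ...then push every ref as-is to the GitHub backup remote.
    subprocess.run(["git", "push", "--mirror", "github"], cwd=repo_dir, check=True)


if __name__ == "__main__":
    for repo_dir in sorted(MIRROR_ROOT.glob("*.git")):
        sync_mirror(repo_dir)
```

A crontab entry along the lines of `*/5 * * * * /usr/local/bin/mirror_repos.py` gives the 5-minute maximum latency described above.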
Great to hear - this is a generic solution not using the Bitbucket API.
If you think this solution is optimal and have perhaps already checked the API a bit for "better flows", that would save me some research time before implementation.
To be honest, I haven't checked whether there is a better, Bitbucket Cloud-specific solution to the problem.
Why? I was looking for a solution I can use to back up our Bitbucket Cloud, Bitbucket Data Center, and other_potential_Git_hosting_service_here repositories with a single script. I also didn't want to touch the script if the API introduced breaking changes. In fact, low cost, simplicity, and robustness are the key advantages of my approach. (If I ever have to throw it out, we waste a few hours at most.)
Makes sense.
Either way, thanks for the quick answer.
My approach will probably be generic, although I will use some programming language with a Prometheus client library so I can monitor the backup process and the available server storage (see the sketch below).
Many thanks,
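For what it's worth, one way to do that monitoring (a sketch only, with assumed metric names and output path) is to write the figures where Prometheus can scrape them, for example via the node_exporter textfile collector using the Python prometheus_client library:

```python
#!/usr/bin/env python3
"""Sketch: expose backup metrics to Prometheus via the node_exporter
textfile collector. Metric names and the output path are assumptions."""

import shutil
from prometheus_client import CollectorRegistry, Gauge, write_to_textfile

registry = CollectorRegistry()
last_success = Gauge("git_backup_last_success_timestamp_seconds",
                     "Unix time of the last successful backup run",
                     registry=registry)
free_bytes = Gauge("git_backup_storage_free_bytes",
                   "Free space remaining on the backup volume",
                   registry=registry)


def record_run(backup_path: str = "/srv/git-backup") -> None:
    # Record when the backup finished and how much disk space is left.
    last_success.set_to_current_time()
    free_bytes.set(shutil.disk_usage(backup_path).free)
    # node_exporter picks up *.prom files from its textfile collector directory.
    write_to_textfile("/var/lib/node_exporter/textfile/git_backup.prom", registry)


if __name__ == "__main__":
    record_run()
```

Alerting on the last-success timestamp going stale, or on free bytes dropping below a threshold, then covers both the backup process and the server storage.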
Question: in the script you wrote, do you have to hard-code your repo names? We have over 100 repos, and a backup strategy that relies on remembering to hard-code every new repo name doesn't sound like a good one.
Well, both Bitbucket Cloud and GitHub offer REST APIs to list all repositories, so you don't have to enumerate them by hand if you want to back up everything.
If, for example, you can match the repos on the two sites by their repository slug, then a "back up all repos" logic is easy to implement (see the sketch below).
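As a rough illustration of that enumeration step (workspace name and credentials are placeholders), the Bitbucket Cloud 2.0 API can page through every repository in a workspace; GitHub has an equivalent listing endpoint for the other side:

```python
#!/usr/bin/env python3
"""Sketch: list every repository slug in a Bitbucket Cloud workspace via the
2.0 REST API, so the backup script never needs a hard-coded repo list.
Workspace name and credentials are placeholders."""

import requests

WORKSPACE = "your-workspace"                   # placeholder
AUTH = ("your-username", "your-app-password")  # placeholder app password


def list_repo_slugs(workspace: str) -> list[str]:
    slugs = []
    # The API is paginated; follow the "next" link until it disappears.
    url = f"https://api.bitbucket.org/2.0/repositories/{workspace}?pagelen=100"
    while url:
        resp = requests.get(url, auth=AUTH)
        resp.raise_for_status()
        data = resp.json()
        slugs.extend(repo["slug"] for repo in data["values"])
        url = data.get("next")
    return slugs


if __name__ == "__main__":
    for slug in list_repo_slugs(WORKSPACE):
        print(slug)
```

Each returned slug can then double as the name of the local mirror directory and of the target repository on GitHub, which is the slug-based matching mentioned above.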
Hello Aron,
I was wondering: if I want to set up this procedure of backing up all my repos from Bitbucket and pushing them to GitHub, how can I go about it?
Thanks.
Hi @Roy Nard,
You can take a look at GitProtect.io, an automated backup & DR solution for Bitbucket. With it, you can set a scheduler so that your backups run automatically, and it also lets you bring your own storage.
You can find out more about GitProtect on the Atlassian Marketplace: https://marketplace.atlassian.com/apps/1225728/gitprotect-io-backup-for-bitbucket?hosting=cloud&tab=overview