I have a repo that's over the size limit at 2.1 GB. I want to back up the repo before I remove large files, because I'm going to have to rewrite a lot of history, but I'm prevented from cloning the repo while it's over 2 GB. How do we make a backup before cleaning up the repo?
Can we have garbage collection run to trim the repo down and then remove additional large files?
I saw that you created a ticket with our support team about this same question, so to avoid any misunderstanding or miscommunication that could arise from discussing the same matter on two different platforms, let's continue the conversation in the ticket instead.
Just in case someone else would like to create a backup: since Git is a distributed version control system, any clone or pull of a repository is essentially a full copy of your code.
So, to back up your Bitbucket contents, you can create local clones from Bitbucket. More specifically, use a mirror clone, since it copies all of your repository's references locally. You can do this by running the following command:
git clone --mirror <repoURL>
The --mirror flag clones all refs of the specified repository, including every branch and tag, along with their full commit history.
Please note that metadata such as repository settings and pull requests would not be part of this backup; you would need to re-create that data from scratch if you lost the repository.
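Purely as an illustration, here is a runnable sketch of the backup step. A throwaway local repository stands in for your Bitbucket clone URL, and all names (origin-repo, repo-backup.git) are made up for the example:

```shell
set -e
WORKDIR=$(mktemp -d)
cd "$WORKDIR"

# Stand-in for the remote repository on Bitbucket; in practice you would
# use your real clone URL, e.g. git@bitbucket.org:workspace/repo.git.
git init -q -b main origin-repo      # -b needs git 2.28+
git -C origin-repo -c user.email=you@example.com -c user.name="Your Name" \
    commit -q --allow-empty -m "initial commit"

# The actual backup step: --mirror copies every ref, not just the
# currently checked-out branch.
git clone -q --mirror origin-repo repo-backup.git

# List the refs captured in the backup.
git -C repo-backup.git show-ref
```

The resulting repo-backup.git directory is a bare repository: it holds all refs and objects but has no working tree, which is exactly what you want for a backup.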
Following our suggestion to clone the repository locally as a mirror, you might wonder how to turn a mirror repository (bare) back into a normal repository (non-bare). To do that, push the mirror-cloned repository to Bitbucket and then clone it again without the --mirror flag; the fresh clone will be a normal working copy.
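As a sketch of that restore path, the steps below build a tiny mirror clone to stand in for the backup made earlier, push it into a local bare repository that stands in for a new, empty Bitbucket repository, and then clone that normally (all names here are illustrative):

```shell
set -e
WORKDIR=$(mktemp -d)
cd "$WORKDIR"

# Build a small mirror clone to stand in for your earlier backup.
git init -q -b main source           # -b needs git 2.28+
git -C source -c user.email=you@example.com -c user.name="Your Name" \
    commit -q --allow-empty -m "initial commit"
git clone -q --mirror source repo-backup.git

# Stand-in for the new, empty repository you would create on Bitbucket.
git init -q --bare -b main new-repo.git

# Push every ref from the mirror into the new repository...
git -C repo-backup.git push -q --mirror "$WORKDIR/new-repo.git"

# ...then clone it normally (no --mirror) to get a non-bare working copy.
git clone -q "$WORKDIR/new-repo.git" restored
```

With a real Bitbucket repository, the push target would be its clone URL rather than a local path; the sequence is otherwise the same.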
Beginning on April 4th, we will be implementing push limits: any push over 3.5 GB cannot be completed and will fail.
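To gauge whether a push (for example, a mirror push of a full backup) might approach that limit, you can check the packed size of your local object store with git's own accounting. A minimal runnable sketch, using a throwaway repository:

```shell
set -e
WORKDIR=$(mktemp -d)
cd "$WORKDIR"

git init -q -b main demo             # -b needs git 2.28+
git -C demo -c user.email=you@example.com -c user.name="Your Name" \
    commit -q --allow-empty -m "initial commit"

# Pack loose objects so size-pack reflects roughly what a full push
# of this repository would transfer.
git -C demo gc --quiet
git -C demo count-objects -vH
```

The size-pack line in the output is a reasonable approximation of the data a fresh mirror push would send; if it is near the limit, trim large files from history before pushing.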