It was brought to our attention that one of our repositories at [REDACTED] is becoming too large (2.5 GB) because we were tracking binary files. We rewrote history and added Git LFS to the repo to manage the binaries, then ran git gc locally, which reduced the size to 600 MB.
But the remote still reports 2.5 GB, even after a force push. After reading https://support.atlassian.com/bitbucket-cloud/docs/reduce-repository-size/, we realized that garbage collection also needs to be run on the remote side.
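To illustrate why the local gc helped (a throwaway sketch, not our exact commands — the temp repo, blob size, and the aggressive --prune=now flag are just for demonstration):

```shell
#!/bin/sh
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
# Write a 1 MiB blob straight into the object database without
# referencing it from any commit -- this stands in for the binaries
# left unreachable after the LFS history rewrite.
head -c 1048576 /dev/urandom | git hash-object -w --stdin > /dev/null
before=$(du -sk .git | cut -f1)
# Prune unreachable objects immediately instead of waiting for gc's
# default two-week grace period.
git gc --quiet --prune=now
after=$(du -sk .git | cut -f1)
echo "before=${before}K after=${after}K"
```

The same unreachable objects still exist in Bitbucket's copy of the repo, which is why the remote size hasn't moved.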
Can we get someone to run gc on the repo?
Hi Arsh,
I've successfully run garbage collection against your remote repository in the back-end to clean up any dangling commits.
The repository size has been reduced from 2.5 GB and is now 574 MB.
Whilst it is not possible to run a gc manually from your side (we do have automated gc processes that run when certain garbage thresholds are met), we want to let you know that there is a feature request, https://jira.atlassian.com/browse/BCLOUD-11610, to provide users with the option to limit the file size that can be pushed to Git, to prevent such issues from occurring again.
Our Dev team is working on this feature request, but we don't have an ETA for its implementation. I would suggest you "Watch" the ticket for future updates from our Dev team and "Vote" for it to improve its visibility with regard to customer demand.
NOTE: Please ensure that you and your colleagues perform a fresh clone of the repository to ensure that you do not accidentally push old local refs back to the repository, as this could inflate the size again.
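The re-inflation this note warns about is easy to reproduce locally (a minimal sketch with made-up paths — a bare origin.git plays the role of the Bitbucket remote, and a clone taken before the cleanup plays the stale checkout):

```shell
#!/bin/sh
set -e
work=$(mktemp -d)
cd "$work"
git init -q --bare origin.git                  # stands in for the Bitbucket remote
git clone -q "$work/origin.git" stale 2>/dev/null
git -C stale config user.email demo@example.com
git -C stale config user.name demo
head -c 1048576 /dev/urandom > stale/big.bin   # the tracked "binary"
git -C stale add big.bin
git -C stale commit -qm "track binary"
git -C stale push -q origin HEAD:main
# Server-side cleanup: drop the fat branch and prune, the way the
# remote-side garbage collection did.
git -C origin.git update-ref -d refs/heads/main
git -C origin.git gc --quiet --prune=now
slim=$(du -sk origin.git | cut -f1)
# A stale clone still holds the pre-rewrite objects; one push puts
# the whole fat history straight back on the remote.
git -C stale push -q origin HEAD:main
fat=$(du -sk origin.git | cut -f1)
echo "after gc: ${slim}K; after stale push: ${fat}K"
```

Which is exactly why everyone on the team should re-clone after the rewrite rather than pushing from an existing checkout.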
Cheers!
- Ben (Bitbucket Cloud Support)
Thanks for the help!