Hi all,
Yesterday, I reset the remote to an earlier commit, as the commits after it pushed us over the 4 GB limit. After a hard reset and force push, the size went back under the limit (thanks to the Labs automatic garbage collection).
However, today we have somehow hit that limit again, and I have no idea why. Presumably someone pushed a large commit, but I cannot work out which one it was.
I have turned on the 'Labs' automatic garbage collection, but despite resetting to the same good commit as yesterday (everyone backed up their work locally), it is still not clearing the bloat. Any help would be greatly appreciated.
Please note that we are still very green when it comes to Git, so please bear with me.
Thank you for any and all assistance.
Regards,
Mike
Hi Michael,
On macOS or Linux, the following command lists the largest files in the repository, along with the object (blob) hash of each. It requires GNU coreutils (see this article if you receive an error):
git rev-list --objects --all \
| git cat-file --batch-check='%(objecttype) %(objectname) %(objectsize) %(rest)' \
| sed -n 's/^blob //p' \
| sort --numeric-sort --key=2 \
| tail -n 10 \
| cut -c 1-12,41- \
| $(command -v gnumfmt || echo numfmt) --field=2 --to=iec-i --suffix=B --padding=7 --round=nearest
This Stack Overflow answer explains the approach in more detail: https://stackoverflow.com/questions/9456550/how-to-find-the-n-largest-files-in-a-git-repository
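Once you have a blob hash from the pipeline above, you can trace it back to the commit(s) that introduced it with `git log --find-object` (available in Git 2.16 and later). Below is a minimal, self-contained sketch: it builds a throwaway demo repository (the file names and sizes are invented for illustration), finds the largest blob with a trimmed version of the pipeline above, and then asks which commit added it. It assumes `git` and a POSIX shell are available.

```shell
# Demo: find the largest blob in a repo, then trace it to the commit that
# introduced it. Uses a scratch repo so the example is self-contained.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email demo@example.com
git config user.name demo

printf 'small\n' > small.txt
head -c 100000 /dev/zero > big.bin   # a deliberately large file
git add . && git commit -qm 'add files'

# Largest blob: "<hash> <size> <path>" (trimmed version of the pipeline above)
largest=$(git rev-list --objects --all \
  | git cat-file --batch-check='%(objecttype) %(objectname) %(objectsize) %(rest)' \
  | sed -n 's/^blob //p' \
  | sort -k2 -n \
  | tail -n 1)
hash=$(echo "$largest" | cut -d' ' -f1)
echo "largest blob: $largest"

# Which commit(s) introduced this blob?
git log --all --oneline --find-object="$hash"
```

In your real repository you would skip the scratch-repo setup, take a hash from the output of the full pipeline, and run only the final `git log --all --find-object=<hash>` step.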
If you require a garbage collection to be performed in the backend to decrease the repo size, please let me know and I will raise a formal support ticket with you.
Cheers!
- Ben (Bitbucket Cloud Support)