Yesterday, I reset the remote to an earlier commit, because the commits after it pushed us over the 4 GB limit. After resetting (hard) and force-pushing, the size went back under the 4 GB limit (thanks to the Labs auto garbage collection).
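For reference, the commands I ran were roughly the following (the branch name and commit hash here are placeholders, not our real ones):

git checkout main
git reset --hard abc1234    # abc1234 = hash of the last known good commit
git push --force origin main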
However, today we have somehow hit that limit again, and I have no idea why. Obviously someone pushed a large commit, but I cannot work out which commit it was.
I have turned on the 'labs' auto garbage collection, but despite going back to the same good commit as I did yesterday (everyone backed up their work locally), it's still not clearing the bloat. Any help would be greatly appreciated.
Please note, we are still very green with regard to Git, so please bear with me.
Thank you for any and all assistance.
For macOS/Linux, the following command can be executed to retrieve a list of the largest files in the repository, along with their object hashes and paths. This requires that GNU coreutils is installed (see this article if you receive an error):
git rev-list --objects --all \
| git cat-file --batch-check='%(objecttype) %(objectname) %(objectsize) %(rest)' \
| sed -n 's/^blob //p' \
| sort --numeric-sort --key=2 \
| tail -n 10 \
| cut -c 1-12,41- \
| $(command -v gnumfmt || echo numfmt) --field=2 --to=iec-i --suffix=B --padding=7 --round=nearest
There is an article here that explains this further: https://stackoverflow.com/questions/9456550/how-to-find-the-n-largest-files-in-a-git-repository
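Once you have identified a large blob from the output above, you can also find which commits introduced it using git log's --find-object option (available since Git 2.16). The hash below is just a placeholder for the blob hash from the command's output:

git log --all --oneline --find-object=<blob-sha>

This lists every commit, across all branches, whose diff adds or removes that object, which should point you at the push that reintroduced the bloat.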
If you require a garbage collection to be performed in the backend to decrease the repo size, please let me know and I will raise a formal support ticket for you.
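In the meantime, if you want to sanity-check the size of the object store in a fresh local clone (note this won't exactly match the remote size until a server-side garbage collection runs), you can use:

git count-objects -vH    # the size-pack line shows the packed object size in human-readable units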
- Ben (Bitbucket Cloud Support)