Hello, one of our repositories recently exceeded the size limit of 4 GB.
We followed the official guides, but we have reached a point where we can no longer push: the repository size is not changing, apparently because the automatic GC is not working properly.
Can you help? Maybe you could run the GC manually against our repo?
Thank you very much in advance!
G'day!
I have run the git gc operation manually against your repository, and the size decreased by only 100 MB. This is likely because you still have large binary files in your repository that are not stored in Git LFS.
I would recommend looking through your repository for the file extensions that account for the large binary files, and then using the BFG tool to migrate the existing large files to Git LFS.
A guide can be found here:
https://support.atlassian.com/bitbucket-cloud/docs/use-bfg-to-migrate-a-repo-to-git-lfs/
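The first step Ben suggests, finding which extensions hide the large binaries, can be sketched with plain git plumbing. Everything below is illustrative: the throwaway demo repository exists only to make the snippet self-contained, and in practice you would run just the rev-list pipeline inside a clone of your own repository.

```shell
set -e

# Throwaway demo repo (illustration only; use a clone of your own repo instead).
demo=$(mktemp -d) && cd "$demo"
git init -q
dd if=/dev/zero of=big.bin bs=1024 count=512 2>/dev/null   # fake 512 KiB binary
echo 'readme' > README.md
git add . && git -c user.email=demo@example.com -c user.name=demo commit -qm 'demo'

# The actual check: every object reachable anywhere in history,
# filtered to blobs, sorted by size, largest first.
git rev-list --objects --all |
  git cat-file --batch-check='%(objecttype) %(objectname) %(objectsize) %(rest)' |
  awk '$1 == "blob" {print $3, $4}' |
  sort -rn | head -10
```

The largest blobs print first; their extensions are what you would then hand to the BFG's conversion step, as described in the linked guide.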
Cheers!
- Ben (Bitbucket Cloud Support)
Hi Ben, thanks for your answer!
I'm working on Giovanni's repo. After you ran the GC manually on it, I performed the git filter-branch operation as described in https://support.atlassian.com/bitbucket-cloud/docs/maintain-a-git-repository
In detail, the operations performed were the following:
git filter-branch --index-filter 'git rm -r --cached --ignore-unmatch our_biggest_file_path' --prune-empty
git for-each-ref --format='delete %(refname)' refs/original | git update-ref --stdin
git reflog expire --expire=now --all
git gc --aggressive --prune=now
git push --all --force
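One way to check locally whether a rewrite like the one above actually shrank the history, before force-pushing, is git count-objects; note that the size Bitbucket reports can lag behind the local size until its server-side GC runs. The demo repository below is only there to make the sketch self-contained.

```shell
set -e

# Throwaway demo repo (illustration only; run the last three commands
# in your own rewritten clone instead).
demo=$(mktemp -d) && cd "$demo"
git init -q
dd if=/dev/zero of=asset.bin bs=1024 count=256 2>/dev/null
git add . && git -c user.email=demo@example.com -c user.name=demo commit -qm 'demo'

# Same cleanup steps as in the thread, then inspect the local size:
git reflog expire --expire=now --all
git gc --aggressive --prune=now --quiet
git count-objects -v -H        # the "size-pack" line is the repacked history on disk
```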
After that, instead of going down, our repo size has gone up (it's now 4.15 GB), and we are stuck again.
What did I do wrong? Could you help us?
Our team chose not to use LFS and to back up our biggest binary file on external cloud servers: we are therefore happy to lose all the history information about that file, hence the use of git filter-branch.
Thank you very much,
Michele
Hey Michele,
Apologies for the delay in response - I was OOO for a few days.
I have run a git gc, and the repository size is still 3.9 GB. Are there any other files that can be removed, given that you do not wish to use Git LFS?
If you need further assistance, please let me know and I will formally raise a ticket so I can look into the backend of your repository and identify any problematic large files in your commit history.
Cheers!
- Ben (Bitbucket Cloud Support)
Could you please run git gc again on our repo? I have removed another big file since.
If it still doesn't solve our size problem, then yes, we could really use further assistance.
Thank you in advance,
Michele
I have performed a subsequent gc, and the size of the repo has been reduced by only 200 MB (it is now 3.7 GB).
I have opened a support ticket on your behalf; you should receive an email with a link to the request, and we will continue the conversation there.
Cheers!
- Ben (Bitbucket Cloud Support)