I used the git_find_big.sh script to find the biggest files.
I identified one file that is very old and can safely be deleted, and ran the command below:
git filter-branch --index-filter 'git rm --cached --ignore-unmatch GradeBrains/Pods/HockeySDK/HockeySDK-iOS/HockeySDK.embeddedframework/HockeySDK.framework/HockeySDK' HEAD
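A note for anyone following the same route: filter-branch only rewrites the commits; the old objects stay reachable through refs/original/* and the reflog until those are cleared and the repository is repacked, so the pack does not shrink on its own. A self-contained sketch of the full local sequence, using a throwaway repo and a dummy big.bin in place of the HockeySDK binary:

```shell
# Throwaway repo with a large file buried in history: committed once,
# then deleted, so it survives only in older commits.
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email you@example.com
git config user.name "You"
dd if=/dev/urandom of=big.bin bs=1024 count=4096 2>/dev/null
git add big.bin && git commit -qm "add big file"
git rm -q big.bin && git commit -qm "remove big file"

# Rewrite every commit, dropping the file from each tree.
FILTER_BRANCH_SQUELCH_WARNING=1 git filter-branch -f \
  --index-filter 'git rm --cached --ignore-unmatch big.bin' HEAD

# The pre-rewrite commits are still reachable via refs/original/* and
# the reflog; drop both, then repack so the objects are really pruned.
git for-each-ref --format='%(refname)' refs/original/ |
  xargs -n 1 git update-ref -d
git reflog expire --expire=now --all
git gc --prune=now --aggressive

git count-objects -v   # size-pack should now be back to a few KiB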
After I rewrote the history, I tried to push; SourceTree showed a lot of items to pull, and the push failed.
When I pulled and then pushed, the repo size came out at 1.89 GB.
Locally, the file is still showing as not deleted.
Can you suggest a possible way out?
Hi Ram,
You are right that when you rewrite history and push to the Bitbucket repo, a git gc is needed on the remote repo for the old references to be removed and the repo's size to be reduced.
I see that you opened a support ticket as well and got assistance for this issue.
Just wanted to double check with you and see if you need anything further on this?
I also wanted to mention, for any other users who may come across your question: to request a git gc on a repo, you can create a support ticket via https://support.atlassian.com/contact/#/. In "What can we help you with?" select "Technical issues and bugs", then Bitbucket Cloud as the product.
Kind regards,
Theodora
Based on various threads, I realized that I should force the push after using the index filter.
I tried that as well, and it left the repo at 2.05 GB, so I have run out of options here.
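For reference, a rewritten branch no longer fast-forwards, so the push does indeed have to be forced for all branches and tags. A minimal sketch, with a local bare repository standing in for the Bitbucket remote:

```shell
# A local bare repo stands in for the Bitbucket remote.
work=$(mktemp -d)
git init -q --bare "$work/remote.git"
git clone -q "$work/remote.git" "$work/clone" 2>/dev/null
cd "$work/clone"
git config user.email you@example.com
git config user.name "You"
echo one > file.txt
git add file.txt && git commit -qm "first"
git push -q origin HEAD

# Rewrite history (an amend is enough to break fast-forward),
# then force-push every branch and tag.
git commit -q --amend -m "first (rewritten)"
git push -q --force origin --all
git push -q --force origin --tags
```

Note that force-pushing uploads the rewritten objects, but the remote keeps the old ones until its own gc runs, which is why the hosted size can temporarily grow rather than shrink.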
Is there a way to delete the files and bring the repo below 1 GB?
Locally it is only 940 MB:
du -hs .git/objects
940M .git/objects
git count-objects -Hv
count: 0
size: 0 bytes
in-pack: 151371
packs: 2
size-pack: 936.56 MiB
prune-packable: 0
garbage: 0
size-garbage: 0 bytes
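The check that git_find_big.sh performs can be reproduced directly with git verify-pack, which lists every object in a pack with its size; sorting that output shows what is occupying those 936 MiB. A self-contained sketch (throwaway repo, illustrative names):

```shell
# Small repo with one deliberately large blob, packed so that
# verify-pack has a pack index to read.
repo2=$(mktemp -d)
cd "$repo2"
git init -q
git config user.email you@example.com
git config user.name "You"
dd if=/dev/urandom of=large.dat bs=1024 count=512 2>/dev/null
echo small > small.txt
git add . && git commit -qm "files"
git gc --quiet

# Largest three objects in the pack, with each SHA mapped back to a path.
git verify-pack -v .git/objects/pack/pack-*.idx |
  sort -k3 -n -r | head -3 |
while read -r sha type size _; do
  path=$(git rev-list --objects --all | grep "^$sha" | cut -d' ' -f2-)
  printf '%-6s %8s %s\n' "$type" "$size" "$path"
done
```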
Will running GC on server side fix the issue?