

Repository size exceeds limit but GC is not working

Hello, one of our repositories recently exceeded the size limit of 4 GB.

We followed the official guides and reached a point where we cannot push anymore: the repository size is not changing, apparently because the automatic GC is not working properly.

Can you help? Maybe you could run the GC manually against our repo?

Thank you very much in advance!
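For context, the size that counts is computed server-side, but you can check what your local clone weighs before and after a cleanup; a quick sketch:

```shell
# Inspect the local object database; "size-pack" (the packed history)
# is what dominates a repository's hosted size.
git count-objects -vH
```

If size-pack is already small locally but the hosted size is not, the leftover objects live on the server, and only a server-side GC (as requested here) can reclaim them.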

1 answer

Atlassian Team
Atlassian Team members are employees working across the company in a wide variety of roles.
Jun 09, 2022


I have run the git gc operation manually against your repository, and the size has only decreased by 100 MB. This is likely because you still have large binary files in your repository which are not stored in Git LFS.

I would recommend looking through your repository for file extensions that include large binary files, and then using the BFG tool to convert the existing large files to Git LFS.
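A generic way to find those extensions (not Bitbucket-specific; run it inside a clone) is to list every blob in the repository's history, largest first:

```shell
# List the 20 largest blobs anywhere in history with their paths,
# so the file extensions worth moving to LFS stand out.
git rev-list --objects --all |
  git cat-file --batch-check='%(objecttype) %(objectname) %(objectsize) %(rest)' |
  awk '/^blob/ {print $3, $4}' |
  sort -rn |
  head -20
```

If memory serves, BFG can then rewrite those files into LFS with its --convert-to-git-lfs option; check the BFG documentation for the exact invocation, as the flag name here is from memory.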

A guide can be found here:


- Ben (Bitbucket Cloud Support)

Hi Ben, thanks for your answer!

I'm working on Giovanni's repo, and after you ran GC manually on it I performed the git filter-branch operation described in the guide.

In detail, the operations performed were the following:

git filter-branch --index-filter 'git rm -r --cached --ignore-unmatch our_biggest_file_path' --prune-empty

git for-each-ref --format='delete %(refname)' refs/original | git update-ref --stdin

git reflog expire --expire=now --all

git gc --aggressive --prune=now

git push --all --force

After that, instead of going down, our repo size has gone up (it is now 4.15 GB), and we are stuck again.

What did I do wrong? Could you help us?

Our team chose not to use LFS and to back up our biggest binary file on external cloud servers: we are therefore happy to lose all the history for that file, hence the use of git filter-branch.
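For what it's worth, the behaviour described above can be reproduced locally: a history rewrite plus force-push does not shrink the remote by itself, because the old objects stay on the remote until garbage collection runs there. A sketch using a local bare repository as a stand-in for the hosted one (file names and sizes are made up for illustration):

```shell
set -e
tmp=$(mktemp -d); cd "$tmp"

git init -q --bare remote.git              # stand-in for the hosted repo
git clone -q remote.git work 2>/dev/null
cd work
git config user.email you@example.com
git config user.name "You"

echo readme > README.md
git add README.md && git commit -qm "initial"
head -c 1000000 /dev/urandom > big.bin     # ~1 MB of incompressible data
git add big.bin && git commit -qm "add big file"
git push -q origin HEAD

# Drop the big file from history (same rewrite as in the thread), then force-push.
FILTER_BRANCH_SQUELCH_WARNING=1 git filter-branch -f --index-filter \
  'git rm -r --cached --ignore-unmatch big.bin' --prune-empty HEAD
git push -qf origin HEAD

du -sk ../remote.git                       # still large: old objects linger
git -C ../remote.git gc --prune=now        # the server-side step only the host can run
du -sk ../remote.git                       # shrinks once unreachable objects are pruned
```

Newer Git versions recommend git filter-repo over filter-branch for this kind of rewrite, but the remote-size behaviour is the same: after the force-push, the reduction only shows once GC runs on the hosting side.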

Thank you very much,


Atlassian Team
Jun 15, 2022

Hey Michele,

Apologies for the delay in response - I was OOO for a few days. 

I have run a git gc and the repository size is still 3.9 GB. Are there any other files that can be removed, given that you do not wish to use Git LFS?

If you need further assistance, please let me know and I will raise a ticket formally so I may look into the backend of your repository and identify any problematic large files in your commit history.


- Ben (Bitbucket Cloud Support)


Could you please run git gc on our repo again, since I removed another big file?

If it still doesn't solve our size problem, then yes, we could really use further assistance.

Thank you in advance,


Atlassian Team
Jun 16, 2022

Hi @m_liscio @gtugnolo 

I have performed a subsequent gc, and the size of the repo has only been reduced by 200 MB (it is now 3.7 GB).

I have opened a support ticket on your behalf; you should receive an email with a link to the request, and we will communicate further there.


- Ben (Bitbucket Cloud Support)

