New Repository Exceeding Limit - Repository is ONLY 107MB

A newly created Bitbucket repository 'thepls-dev' shows a 2.6 GB repo size even though we brought the size down to just over 100 MB. You can see the output of the initial push here:

Writing objects: 100% (23260/23260), 106.95 MiB | 713.00 KiB/s, done.

We specifically removed large files from the tree before even adding anything to our commit.  I cloned the master branch and checked the size on disk: it's 107 MB.  Why is the repo size set to standards of two decades ago?  These are private repositories, and many if not all of our client sites exceed 2 GB.  We shouldn't include resources?  That's why we excluded them through .gitignore.  I would gladly pay $20-100 USD/month to boost the maximum repository size.
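For context, the exclusions were ordinary .gitignore patterns along these lines (the patterns below are illustrative, not our exact file), plus a quick check that nothing matching them was still tracked:

# illustrative .gitignore entries for bulky site resources
uploads/
node_modules/
*.zip
*.psd

# verify none of those are still tracked before committing
git ls-files | grep -E '(^uploads/|^node_modules/|\.zip$|\.psd$)'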

2 answers

Are you guys going to fix this?  I even cloned master locally, just to be absolutely certain.  Once the clone completed, the full size of the folder was ~500 MB.  Come on guys, run your little automated tool and fix this for us.  We have been waiting on it all day.  Absolutely pathetic.  Why don't you offer some premium services so that we can get some assistance?


Hello @Nitro_Interactive,

Repository size includes the entire history of your repository, not just its latest state (the working directory). So if you have a large file anywhere in the history, it will still contribute to the overall repository size, even if a later commit deletes it.
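You can confirm this locally with stock Git; a rough sketch (nothing Bitbucket-specific here, and the exact output will differ):

# size of the packed object database, i.e. the full history, as opposed to the checkout
git count-objects -vH

# list the largest blobs anywhere in history; files deleted in later commits still appear here
git rev-list --objects --all \
  | git cat-file --batch-check='%(objecttype) %(objectname) %(objectsize) %(rest)' \
  | awk '$1 == "blob" {print $3, $4}' | sort -rn | head -20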

Why is the repo size set to standards of two decades ago?

The main reason here is the performance of Git on large repositories. In fact, it is generally recommended to keep your repository under 1 GB. If you have to check in large files, you might want to make use of Git LFS (Large File Storage) – it was designed for exactly this purpose.
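If it helps, getting started with LFS takes only a few commands; a minimal sketch (the *.psd pattern and file path are placeholders for whatever large assets you track):

# one-time setup of the LFS hooks, then track large file types via .gitattributes
git lfs install
git lfs track "*.psd"
git add .gitattributes

# large assets added afterwards are stored in LFS, so only small pointer files enter the Git history
git add design/mockup.psd
git commit -m "Move design assets to Git LFS"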

I would gladly pay $20-100 USD/month to boost the maximum repository size.

Again, the reason for the limit is the performance of Git on large repositories, which is a known by-design issue, and there's no easy way to improve it. We can't make that trade-off here.

Are you guys going to fix this? We have been waiting on it all day. Absolutely pathetic. Why don't you offer some premium services so that we can get some assistance?

Community is not the best place to seek help like this, as posts here aren't actioned immediately. As mentioned in the Reduce repository size guide, a request should be filed with our Support team. For now, I found the repository in question and triggered garbage collection – the repository size is now down to 110 MB.
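For anyone who needs to shrink the history itself rather than just re-run GC, the guide describes roughly the following workflow; this sketch uses the separate git-filter-repo tool, and the file path and remote URL are placeholders:

# rewrite history so the oversized file is removed from every commit (requires git-filter-repo)
git filter-repo --invert-paths --path assets/huge-export.zip

# drop old references and repack so the local size reflects the rewrite
git reflog expire --expire=now --all
git gc --prune=now --aggressive

# filter-repo removes the origin remote as a safety measure, so re-add it and force-push;
# the size reported in Bitbucket only drops after server-side GC runs
git remote add origin git@bitbucket.org:your-workspace/your-repo.git
git push --force --all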

Cheers,
Daniil

Hi Daniil,

Despite my previous tone, I appreciate it.  Attempting to file a support request led me to the support forum.  Unfortunately, I do not know your system as you do, so I apologize for filing my request in the wrong place.  The history contains all of one push, but perhaps the initial commit was the cause.  It seems bizarre to me that garbage collection has to be triggered manually; it seems like it should be automated, or triggered whenever the 2 GB limit is reached to make sure it is not a false positive (on top of the normally scheduled garbage collection – assuming that already exists).  Thanks again, frustration can be all-consuming.

Kind regards,

Steve Giorgi

It seems bizarre to me that garbage collection has to be triggered manually; it seems like it should be automated, or triggered whenever the 2 GB limit is reached to make sure it is not a false positive (on top of the normally scheduled garbage collection – assuming that already exists).

It is automated; however, it doesn't trigger on every single push, because it is quite an expensive operation. Moreover, when GC kicks in it may run in different modes, more or less aggressive. Especially when the repository becomes huge, GC takes significant time, and given that we don't know how much garbage is in a repository before we collect it, we have to be careful not to run GC again and again when it doesn't add any value. I believe this was the case with your repository.
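Plain Git behaves much the same way on the client side, which may make the trade-off easier to picture; the settings below are Git's own client-side knobs, not Bitbucket's internals:

# automatic GC only runs once roughly this many loose objects have accumulated (Git's default)
git config gc.auto 6700

# a normal gc repacks cheaply; --aggressive recomputes deltas and can take far longer on big repos
git gc
git gc --aggressive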

Cheers,
Daniil
