Repository is not over the size limit

Dana Adriana Zainescu January 9, 2020

Git reports the push as 1.13 GiB, but Bitbucket then says the repository is over the size limit (2 GB) and will not accept further additions:

Enumerating objects: 25046, done.
Counting objects: 100% (25046/25046), done.
Delta compression using up to 4 threads
Compressing objects: 100% (17794/17794), done.
Writing objects: 100% (25046/25046), 1.13 GiB | 8.51 MiB/s, done.
Total 25046 (delta 7237), reused 24962 (delta 7170)
remote: Resolving deltas: 100% (7237/7237), done.
remote: Checking connectivity: 43, done.
remote: Repository is over the size limit (2 GB) and will not accept further additions.
remote:
remote: Learn how to reduce your repository size: https://confluence.atlassian.com/x/xgMvEw.
To bitbucket.org:geopost/<repo>.git
! [remote rejected] master-live -> master-live (pre-receive hook declined)
error: failed to push some refs to 'git@bitbucket.org:geopost/<repo>.git'

I've also tried to find the size of the repo with:

git gc
git count-objects -vH

And it also says:

size-pack: 1.13 GiB

On the other hand, as an example, we have another repo that is 2.21 GB and commits still work fine there.

EDIT: I tried creating and deleting a branch, as was suggested online, and commits work now. Bitbucket now shows Repository details: Size 1.9 GB.
Any idea how I can get it down to 1.13 GB, since git tells me that's the actual size of the repo?
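A generic way to pinpoint what is eating the space (not specific to this repo) is to list every blob in history by size, so the biggest cleanup candidates appear last. The throwaway demo repo below only exists to make the snippet self-contained; in a real repo you would run just the pipeline at the end:

```shell
# Throwaway demo repo with one large and one small file.
repo=$(mktemp -d) && cd "$repo" && git init -q
git config user.email demo@example.com && git config user.name demo
head -c 1048576 /dev/zero > big.bin
echo small > small.txt
git add . && git commit -qm 'initial commit'

# List all blobs in history as "<size> <path>", smallest first,
# so the largest files (here big.bin) end up at the bottom.
git rev-list --objects --all |
  git cat-file --batch-check='%(objecttype) %(objectname) %(objectsize) %(rest)' |
  awk '$1 == "blob" {print $3, $4}' |
  sort -n
```

Anything large that is no longer needed can then be purged from history (see the guide linked in the error message) before retrying the push.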

2 answers

1 accepted

1 vote
Answer accepted
Daniil Penkin
Atlassian Team
Atlassian Team members are employees working across the company in a wide variety of roles.
January 12, 2020

Hello @Dana Adriana Zainescu,

This is very likely because Git garbage collection is not triggered on every repository update, as it is an expensive operation. This is why one of the steps in the guide Mike linked in his answer is to request our Support team to run GC.

Eventually it would have been triggered automatically, but for now I forced GC on the four repositories under that account which matched the size and last-access date you described (since you didn't specify the exact repo in question). All of them were reduced to 1.1-1.3 GB. Please let me know if I guessed the right repo.

Hope this helps. Let me know if you have any questions.

Cheers,
Daniil

Dana Adriana Zainescu January 13, 2020

Thanks, Daniil.
Have you also done this for dpd.co.uk? Git tells me it has 2.21 GiB, instead of the 4.6 GB shown on Bitbucket. Can you have a look, please? I'll also try to reduce the size a bit more after that.
Also, I suppose there's no way of actually increasing the size limit for the repo? It's a bit hard to restrict commits, since they're sent from the Alfresco WCM host serving the live website.

Daniil Penkin
Atlassian Team
January 13, 2020

No worries.

Have you also done this for dpd.co.uk?

Done just now, Bitbucket is showing 2.3 GB now.

Also, I suppose there's no way of actually increasing the space limit for the repo?

No, unfortunately this one is carved in stone, mainly because of the performance issues Git has with larger repositories. There are a couple of workarounds, though: you can use Git LFS for large files, or split the repo history as described in this chapter of the Git guide. Neither is trivial, and they might not be easily applicable in your case.
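For reference on the LFS route: running `git lfs track "*.psd"` (the patterns here are just examples) records lines like the following in `.gitattributes`, after which matching files are stored in the repo as lightweight pointers, with the real content kept in LFS storage:

```
*.psd filter=lfs diff=lfs merge=lfs -text
*.mp4 filter=lfs diff=lfs merge=lfs -text
```

Note that LFS only affects new commits; files already committed still count toward the limit until the history is rewritten.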

Cheers,
Daniil

Dana Adriana Zainescu January 14, 2020

I see, thanks for the help.

letiagoalves March 3, 2020

Hi @Daniil Penkin, one repo of mine exceeded the limit and it blocked me from making more additions.

I rewrote the repo history and now my repository is under 500 MB. However, when I push to the remote it still fails with the following error:

Counting objects: 5314, done.
Delta compression using up to 8 threads.
Compressing objects: 100% (1953/1953), done.
Writing objects: 100% (5314/5314), 256.71 MiB | 2.29 MiB/s, done.
Total 5314 (delta 3501), reused 4815 (delta 3055)
remote: Resolving deltas: 100% (3501/3501), done.
remote: Checking connectivity: 5314, done.
remote: Repository is over the size limit (2 GB) and will not accept further additions.
remote:
remote: Learn how to reduce your repository size: https://confluence.atlassian.com/x/xgMvEw.
! [remote rejected] master -> master (pre-receive hook declined)
! [remote rejected] feature/car-mode -> feature/car-mode (pre-receive hook declined)

Can you please help me with this?

Daniil Penkin
Atlassian Team
March 3, 2020

Hello @letiagoalves,

If I identified your repository correctly, it should reflect the smaller size now. Not under 500 MB as you mentioned, but still well under the limit.

Cheers,
Daniil

letiagoalves March 3, 2020

Thank you for your help. It is reflecting the new size.

The 500 MB size I mentioned is the local repo, which I was not able to push --force because of the pre-receive hook.

But now I was able to push it.


Thanks again.

Deleted user December 17, 2020

Hello @Daniil Penkin ,

I'm in an urgent situation and having an issue with exceeding the limited size of the repository.

Can you also help me with that?

Daniil Penkin
Atlassian Team
December 17, 2020

Hello @[deleted],

I think I figured out which repository you meant; GC reduced it to around 450 MB.

Cheers,
Daniil

Deleted user December 17, 2020

@Daniil Penkin  BIG BIG thank you for the help, it's been so frustrating since yesterday.

What can I do to avoid this in the future?

I already enabled the "Delete dangling commits when over size limit" option.

Daniil Penkin
Atlassian Team
December 17, 2020

No worries, happy to help :)

Aside from enabling the Bitbucket feature you mentioned, I can only advise setting up a client-side Git hook that raises a flag when you're trying to commit a big file (I'm assuming you accidentally committed some large files to your repository). With such a configuration you'll need to explicitly skip that hook in order to commit a large file, should you need to (which I guess would be a rare operation).
It can be a simple script or a more sophisticated tool. I personally use the check-added-large-files plugin for pre-commit, but there are other tools as well.
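As a sketch of the pre-commit route: a minimal `.pre-commit-config.yaml` using the check-added-large-files hook might look like this (the 500 KB threshold and the pinned `rev` are example values to adjust):

```yaml
repos:
  - repo: https://github.com/pre-commit/pre-commit-hooks
    rev: v4.5.0        # pin to whatever release is current
    hooks:
      - id: check-added-large-files
        args: ['--maxkb=500']   # reject staged files larger than 500 KB
```

After `pip install pre-commit` and `pre-commit install`, commits that add a file over the threshold are rejected; `git commit --no-verify` skips the hook when committing a large file is intentional.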

Let me know if you have any questions.

Cheers,
Daniil

1 vote
Mike Howells
Atlassian Team
January 10, 2020

Please refer to the guide to reducing repo size mentioned in the error message for information on the size limits and instructions on getting down below the 2GB limit.

You may have to request technical support to get your repo below the 2 GB limit. You can also ask the support engineer why your 2.21 GB repo is not affected by the limit.

Once below the limit, the guide to maintaining git repos explains how to remove other large files from your repo.
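As a sketch of what removing a large file from history amounts to, here is one approach using the built-in `git filter-branch` (the maintenance guide may recommend other tools, such as the BFG; the file names below are invented for the demo):

```shell
# Throwaway demo repo whose history contains an unwanted big file.
repo=$(mktemp -d) && cd "$repo" && git init -q
git config user.email demo@example.com && git config user.name demo
head -c 1048576 /dev/zero > huge.bin
echo code > app.txt
git add . && git commit -qm 'first'
echo more >> app.txt && git add . && git commit -qm 'second'

# Rewrite every commit on every branch, dropping huge.bin from the
# index; --prune-empty discards commits that become empty.
FILTER_BRANCH_SQUELCH_WARNING=1 git filter-branch -f \
  --index-filter 'git rm --cached --ignore-unmatch huge.bin' \
  --prune-empty -- --all

# The rewritten history no longer references the file.
git rev-list --objects HEAD | grep huge.bin || echo 'huge.bin purged'
```

After a rewrite like this, the old objects linger in refs/original and the reflogs until garbage collection runs, which is one reason the server-side size often drops only after Bitbucket runs GC; the rewritten branches also have to be force-pushed.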

Apologies for the confusing experience. We are currently investigating some issues with the way repo sizes are calculated. This is a tricky area with a big performance impact, so we're being very cautious, rolling out changes progressively and monitoring their effects carefully.

Dana Adriana Zainescu January 13, 2020

Thanks for the information. Will have a look over those links!

