Repo size is more than local size

The size of the Bitbucket repository differs between my local clone and the server. Locally it shows 800MB, but in the cloud it's 2.7GB. Can you run git gc on the repository?

1 answer

Answer accepted

Hello @Ramu Kamath,

I ran GC on the repository you mentioned, and its size went down to roughly 800MB.

Please note that the Community is a public space (i.e. everyone on the Internet can see this post), so it's better to raise a support ticket with us for requests like this.

Cheers,
Daniil

Hi @Daniil Penkin,

Can you help us understand how you reduced the size, and why it was showing 2.7GB earlier?

Thanks in advance

Regards,

Manjunath

Hi @Manjunath Rajappa,

I kicked off Git garbage collection on our side.

When you push changes to your repository, any objects that become orphaned are not immediately deleted. Say you amended a commit by changing all of its files and pushed the new version: the previous version of that commit (the commit object, tree object, and blob objects for every file) becomes loose and can be deleted. However, for the sake of performance Git just inserts the new objects and doesn't always delete the old ones straight away. Instead, it runs so-called garbage collection from time to time, a mechanism that identifies loose objects and deletes them.
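To see this locally, you can rewrite a commit and inspect the objects left behind. One caveat: in your own clone the old commit usually stays reachable through the reflog for a while, which is why --no-reflogs is needed below; server-side bare repositories typically keep no reflogs, so the objects become orphaned right away. A minimal sketch using standard Git commands:

    $ git commit --amend --no-edit         # rewrite the tip commit in place
    $ git count-objects -v                 # "count" = loose objects, "size" = their disk usage
    $ git fsck --unreachable --no-reflogs  # list the now-orphaned commit/tree/blob objects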

Normally garbage collection is triggered automatically when you push commits to your repository. However, it doesn't happen on every push, because GC is a potentially expensive operation, especially when the repository is large. Our Git configuration has some heuristics around when, and what kind of, GC to trigger (it has several levels of "aggressiveness").
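In stock Git those heuristics live in the gc.auto and gc.autoPackLimit settings, and git gc --auto only does real work once a threshold is crossed. Bitbucket's server-side tuning is its own and not shown here, so treat this as an illustration of the mechanism rather than their exact configuration:

    $ git config gc.auto            # threshold of loose objects (Git's default is 6700)
    $ git config gc.autoPackLimit   # threshold of pack files (Git's default is 50)
    $ git gc --auto                 # a no-op unless one of the thresholds is exceeded
    $ git gc --aggressive           # the most expensive level: repacks much more thoroughly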

So under normal circumstances any loose objects will eventually be deleted, as long as you keep working on the repository and pushing. But for some time Bitbucket may show a larger repository size than the repository actually is. The problem arises if you hit the size limit: at that point another mechanism kicks in and may put your repository into a read-only state. Even if you managed to remove, say, a big file, GC might not have been triggered at that point, so Bitbucket would still report the larger size. This is when a manual GC may be required.
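If you've already removed a big file and are waiting for the reported size to drop, you can at least confirm the cleanup on your side. Note these commands only affect your local clone; the server-side GC has to be run by Bitbucket (or get triggered by later pushes). A sketch:

    $ git count-objects -v -H              # compare "size" (loose) with "size-pack" (packed)
    $ git reflog expire --expire=now --all # drop reflog entries that keep old objects reachable
    $ git gc --prune=now                   # repack and delete unreachable objects immediately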

Does this make sense?

Cheers,
Daniil
