Bitbucket Data Center disk sizing

Jozef Vandenmooter April 5, 2019

I have been looking for sizing recommendations for the Bitbucket Data Center home directory but can't find any. Can someone point me to a webpage?

Our current Bitbucket home partition is 17GB in size and runs out of space when cloning a large repository (30GB). That appears to be caused by the caches directory filling up.

Note that the shared directory (which includes the data directory) is mounted on a share that's 200GB in size, so the problem is not related to repository sizing (at least not directly).

 

1 answer

1 vote
Ana Retamal
Atlassian Team
April 8, 2019

Hi Jozef,

As a rule of thumb, allow 1.5x (or 2x) the combined size of all repositories on disk (the contents of the .git/objects directories). The same rule applies to memory.

For further reference, here's a great article with some insights: Scaling Bitbucket Server.
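A rough way to apply that rule of thumb is to measure the repositories on disk; this is just a sketch, and `REPO_BASE` is an assumption you'd point at your instance's shared repository directory:

```shell
# Rough sizing sketch for the 1.5x rule of thumb above. REPO_BASE is an
# assumption: point it at your instance's shared repository directory.
REPO_BASE="${REPO_BASE:-/var/atlassian/application-data/bitbucket/shared/data/repositories}"

# Total on-disk size of all repositories, in kilobytes (falls back to 0
# if the path does not exist on this machine).
total_kb=$(du -sk "$REPO_BASE" 2>/dev/null | awk '{print $1}')
total_kb=${total_kb:-0}

# Apply the 1.5x rule of thumb for the disk allowance.
echo "repositories: ${total_kb} KB; suggested allowance: $((total_kb * 3 / 2)) KB"
```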

Hope that helps!

Ana

 

EDIT:

I've been discussing this with some colleagues, and they suggested disabling ref advertisement caching, since they suspect that's what is filling the caches directory. It's also safe to say that a 17GB drive can't hold a cache for a 30GB repository; a safe bet would be to match the size of your network drive (assuming the network drive is there only for Bitbucket's purposes). Also, caching should stop if the available disk space drops below 5GB; that threshold can be configured in bitbucket.properties:

plugin.bitbucket-scm-cache.minimum.free.space - Controls how much space needs to be available on disk (specifically under <Bitbucket home directory>/caches) for caching to be enabled. This setting ensures that the cache plugin does not fill up the disk.

This value is in bytes (default: 5368709120, i.e. 5GB).

For more info, see Bitbucket Server config properties.
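A minimal bitbucket.properties fragment, assuming you want to raise the threshold; the 10 GiB value shown is just an example:

```properties
# In <Bitbucket home directory>/shared/bitbucket.properties
# Minimum free space (in bytes) under <home>/caches before SCM caching
# is disabled. Default is 5368709120 (5 GiB); 10737418240 below is 10 GiB.
plugin.bitbucket-scm-cache.minimum.free.space=10737418240
```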


Lastly, a 30GB repository sounds massive. If it contains large files (such as images, videos, or other binaries), you can also use git-lfs.
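A quick sketch of enabling LFS tracking, assuming git and git-lfs are installed; the file patterns are just examples, and `git lfs migrate import` can move already-committed files into LFS as well:

```shell
# Sketch, assuming git and git-lfs are installed; patterns are examples.
if command -v git-lfs >/dev/null 2>&1; then
  repo=$(mktemp -d)
  git init -q "$repo"
  cd "$repo"
  git lfs install --local >/dev/null   # enable LFS hooks in this repo
  git lfs track "*.mp4" "*.iso"        # track these patterns going forward
  cat .gitattributes
  # Existing commits can be rewritten into LFS as well (rewrites history):
  #   git lfs migrate import --include="*.mp4,*.iso" --everything
fi
```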

Hope you find this useful, Jozef.

Kind regards,

Ana

Jozef Vandenmooter April 9, 2019

Ana,

Thanks for the reply and the link to that article.

Yes, this one repo is indeed massive. It's being migrated from SVN to Git (Bitbucket), and we actually only included about half of its history. LFS is enabled, but my understanding is that it only affects new files.

This is for Data Center, so we thought only the shared directory had to be sized based on the size of the repositories. The client is going to be disappointed if we tell them the home directories on all the nodes have to be sized the same as the shared data directory (200GB), so I will have to look into those caching settings instead.

We provisioned 8GB of RAM for the BB nodes. That is based on the following article:

https://confluence.atlassian.com/bitbucketserverkb/bitbucket-server-is-reaching-resource-limits-779171381.html

Two more questions:

1. There is no entry for plugin.bitbucket-scm-cache.minimum.free.space in the bitbucket.properties file. Does that mean it is using the default setting (5GB)? That does not seem to be the case, as the amount of disk space on the Bitbucket home directory's mount point goes to zero during a clone operation. I'm actually surprised Bitbucket does not go down because of it.

2. You suggested turning off ref advertisement caching. However, according to the article referenced in your reply, ref advertisement caching is disabled by default.
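On question 1, one way to confirm the property is absent (in which case the 5368709120-byte default should apply) might be a quick grep; the path below is an assumption:

```shell
# Check whether the cache threshold is set explicitly; if grep finds
# nothing, Bitbucket falls back to the default (5368709120 bytes).
PROPS="${PROPS:-/var/atlassian/application-data/bitbucket/shared/bitbucket.properties}"
result=$(grep -s "plugin.bitbucket-scm-cache.minimum.free.space" "$PROPS" \
  || echo "property not set; default of 5368709120 bytes applies")
echo "$result"
```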

Thanks,

Jozef  

Jozef Vandenmooter May 22, 2019

Hi Ana,

Have you been able to look into this?

Thanks,

Jozef 
