The /data usage on our Bitbucket server keeps growing. There are some gc cleanup scripts running, but I suspect there are other things that could be cleaned up (for example, old files found with find /data/docker-registry/ -mtime +900). Otherwise we need to understand and justify the expanding cost of adding 1 TB every 6-8 months.
Any good advice on how to handle this and create good cleanup routines for Bitbucket?
Hi Ulrika,
First, it would be advisable to analyze what is consuming the space. Sometimes a lot of new data in the repositories is simply normal growth. However, if you're adding binary files, it would be better to use LFS; otherwise every new version of a large binary file is stored in the repository history again and again.
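As a quick sketch of that analysis (the /data/bitbucket path is an assumption based on your setup; shared/data/repositories is Bitbucket's default layout under the home directory):

    # Largest directories under /data, two levels deep
    du -xh --max-depth=2 /data | sort -rh | head -20

    # Largest individual repositories in the Bitbucket home
    du -sh /data/bitbucket/shared/data/repositories/* | sort -rh | head -10

The first command tells you whether it's the repositories, the docker registry, backups, or logs that are growing; the second points you at the specific repositories worth investigating.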
To learn more about this, you can read the Git Large File Storage documentation.
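If you do find repositories bloated by binaries, a rough outline of moving them to LFS could look like this (the *.zip pattern is only an example; the migrate step rewrites history, so coordinate with everyone who has clones):

    # One-time setup on the machine running the migration
    git lfs install

    # Track the binary file type going forward
    git lfs track "*.zip"
    git add .gitattributes

    # Rewrite existing commits so old binary versions move into LFS storage
    git lfs migrate import --include="*.zip" --everything

    # History has changed, so a force push is required
    git push --force --all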
You can also set up monitoring of the disk usage and check every month or so whether the data should be there, contacting the users who put it there if it shouldn't.
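A minimal sketch of such a check, assuming GNU df and a working mail command (the 80% threshold and the address are placeholders):

    #!/bin/sh
    # Warn when /data usage crosses the threshold
    THRESHOLD=80
    USAGE=$(df --output=pcent /data | tail -1 | tr -dc '0-9')
    if [ "$USAGE" -ge "$THRESHOLD" ]; then
        echo "/data is at ${USAGE}% on $(hostname)" | mail -s "Bitbucket disk usage warning" admin@example.com
    fi

Run it from cron and you'll hear about growth before it becomes an emergency, which also turns the "contact the users" step into a routine rather than a fire drill.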
Hope that helps!
Ana