Hunting down excessive storage usage in bitbucket server

Bradley Moravek April 29, 2019

<Updated to make this question more understandable, and to reflect some of the cleanup actions I have taken on this project>

Problem:  Cleaning up a bad branch and a monster file.  The ~4 GB monster file was merged into master, while the 100k files sat on a defunct branch.

 

Clean up actions to date:

1 - Used the BFG Repo-Cleaner to remove the monster 4 GB file.  The cleanup appears to have removed all traces of the file, and storage use on the Bitbucket server has improved.

2 - Used the Bitbucket web UI to delete a bad branch that contained over 100k files, taking up about 4 GB of data.
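Rewriting history (as BFG does) does not by itself free disk space: the old objects linger until the reflogs expire and git gc prunes them, which is why BFG's own instructions finish with those two commands. A minimal local sketch of that mechanism, using plain git on a throwaway repo with a ~1 MB stand-in for the monster file (all names and sizes here are placeholders):

```shell
set -e
# Simulate "removing a monster file": commit ~1 MB of random data, orphan
# that commit, then expire reflogs and prune so the blob is truly gone.
tmp=$(mktemp -d)
git init -q "$tmp/demo" && cd "$tmp/demo"
git config user.email demo@example.com
git config user.name demo
echo base > base.txt && git add base.txt && git commit -qm "base"
head -c 1048576 /dev/urandom > monster.bin
git add monster.bin && git commit -qm "add monster"
git reset -q --hard HEAD~1            # drop the commit; the blob stays on disk
before=$(git count-objects -v | awk '/^size:/ {print $2}')
git reflog expire --expire=now --all  # forget the reflog entry pointing at it
git gc --prune=now --quiet            # now the unreachable blob is deleted
after=$(git count-objects -v | awk '/^size:/ {print $2}')
echo "loose object KB: before=$before after=$after"
```

Until that expire/prune pair runs on the server-side copy, the deleted data still counts against disk usage.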

 

Results:

    The "du -sh 32" on the bitbucket server still shows 4+ gig of storage. 

    Running the git count-object on the clone shows the right number of files

    After a clone action local storage shows under 100 meg of data. 

   bfg does not show the monster file
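For reference, the verification steps above can be reproduced on a throwaway repository (the path here is a placeholder for the server-side repository directory, the "32" in the question):

```shell
set -e
# Compare what the filesystem and Git each report for a tiny test repo.
tmp=$(mktemp -d)
git init -q "$tmp/demo" && cd "$tmp/demo"
git config user.email demo@example.com
git config user.name demo
echo hello > file.txt
git add file.txt && git commit -qm "initial"
du -sh .                 # on-disk size as the filesystem sees it (includes .git)
git count-objects -vH    # count = loose objects, size-pack = packed data
```

If du -sh is far larger than what count-objects reports, the extra space is usually not Git objects at all (logs, hooks, stale packs, or other files in the directory).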

Working with:

  • Atlassian Bitbucket v6.1.2
  • git version 2.20.1
  • CentOS 7.x (patched)

 

   The major impact on build server management has been resolved: the 4+ GB monster file has been removed.

 

However, backups of the Bitbucket server work area are still consuming 4 GB of storage that they should not.  This would not be a big issue, but the mount point under /var has only 25 GB of storage.  Also, the pipe for off-site backup is fairly small, and that additional data slows the transfer down a bit.
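One way to chase the remaining 4 GB is to ask the server-side bare repository itself what its pack files contain (on Bitbucket Server the "32" directory should be the repository's home under the Bitbucket data directory). A sketch of the inspection on a throwaway repo standing in for it:

```shell
set -e
# List the biggest objects still stored in a repository's pack files.
tmp=$(mktemp -d)
git init -q "$tmp/demo" && cd "$tmp/demo"
git config user.email demo@example.com
git config user.name demo
head -c 200000 /dev/urandom > big.bin   # stand-in for a large tracked file
echo small > small.txt
git add . && git commit -qm "sample data"
git gc --quiet                          # repack so verify-pack has a pack to read
# The third column of verify-pack -v is the object size; largest objects last.
git verify-pack -v .git/objects/pack/pack-*.idx | sort -k3 -n | tail -5
git count-objects -vH                   # size-pack is the total packed size
```

If size-pack on the server is still ~4 GB, the rewritten objects have not been pruned there yet; if size-pack is small but du still shows 4 GB, the space is being held by something outside the object store (old packs, backup snapshots, or logs).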

 

 
