Sergey is correct. In FishEye/Crucible, the SQL database is primarily used for Crucible data. The FishEye data is stored locally in files using an embedded database and also in Lucene indexes.
Given that, the sizing of the SQL database will be driven mostly by the number of reviews, the number of file revisions in each review and the number of comments made on each review.
As a ballpark figure: given a review averaging 30 file revisions and 20 comments, you'd arrive at roughly 42 KB per review. So for 20,000 reviews, you'd be looking at around 840 MB.
Alternatively, if you were to create 10 reviews per week, your Crucible data would grow by around 22 MB per year.
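The arithmetic above can be sketched as a small helper. This is only a back-of-the-envelope model: the ~42 KB-per-review figure comes from the averages quoted above (30 file revisions, 20 comments), and the function name is my own, not anything from FishEye/Crucible itself.

```python
# Ballpark Crucible SQL sizing, using the ~42 KB/review figure from above
# (a review averaging 30 file revisions and 20 comments).
KB_PER_REVIEW = 42  # assumed average, per the estimate above


def crucible_db_estimate_mb(num_reviews: int, kb_per_review: float = KB_PER_REVIEW) -> float:
    """Estimated Crucible SQL data size in MB (1 MB = 1000 KB here)."""
    return num_reviews * kb_per_review / 1000


print(crucible_db_estimate_mb(20_000))   # 20k reviews -> 840.0 MB
print(crucible_db_estimate_mb(10 * 52))  # 10 reviews/week for a year -> 21.84 MB
```

As noted below, treat this as a Fermi estimate; actual usage depends heavily on review size and discussion volume.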
This is, of course, a Fermi calculation of sorts and your actual usage is going to depend a lot on the number of reviews you conduct, the complexity of the reviews and how much discussion there is within the reviews.
Here's some data from a live instance (numbers are average row size in bytes). The largest table by row count, by the way, is cru_fr_detail, roughly 10x larger than the next largest, cru_revision.
| Table | Avg row size (bytes) |
| --- | --- |
| cru_base_star_model | 309 |
| cru_changeset_comment | 16384 |
| cru_comment | 251 |
| cru_comment_field | 116 |
| cru_comment_read_status | 50 |
| cru_committer_user_mapping | 122 |
| cru_content_root | 204 |
| cru_field | 78 |
| cru_file_read_status | 47 |
| cru_fr_detail | 71 |
| cru_frx | 62 |
| cru_frx_comment | 42 |
| cru_frx_revision | 49 |
| cru_inline_comment | 31 |
| cru_inline_comment_to_frx_rev | 39 |
| cru_logitem | 164 |
| cru_metric_definition | 16384 |
| cru_notification | 56 |
| cru_patch | 348 |
| cru_patch_revision | 50 |
| cru_perm_scheme | 218 |
| cru_proj_allowed_grp | 212 |
| cru_proj_default_grp | 655 |
| cru_proj_default_reviewer | 16384 |
| cru_project | 218 |
| cru_ps_all_user | 72 |
| cru_ps_group | 75 |
| cru_ps_review_role | 76 |
| cru_recently_visited | 148 |
| cru_recipient | 64 |
| cru_review | 663 |
| cru_review_comment | 37 |
| cru_review_participant | 51 |
| cru_revision | 120 |
| cru_revpermaid | 60 |
| cru_state_change | 52 |
| cru_stored_path | 151 |
| cru_upload_item | 106 |
| cru_user | 137 |
| cru_user_profile | 142 |
| cru_version | 8192 |
I found this question about database size for Stash; is this the one you are referring to? Michael's calculation in that question is very smart, but keep in mind that it should be considered a guideline, not a rule.
I believe we could use the same logic for Crucible. However, since Crucible and FishEye share the same database, the FishEye data (commits) should be included in the formula.

So, I'd say: 100 MB + (number of repositories × (average commits per repo + average reviews per repo) / 2500) MB
Using the same calculation basis, if you have 20 repositories in FishEye with an average of 25,000 commits and 20,000 reviews each, you'd need 100 MB + (20 × (25,000 + 20,000) / 2500) MB = 460 MB.
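For convenience, here's the guideline formula as a small sketch. The function name and parameter names are my own; the constants (100 MB base, 2500 divisor) come from the formula above.

```python
def stash_style_estimate_mb(num_repos: int, avg_commits: int, avg_reviews: int) -> float:
    """Guideline DB size estimate: 100 MB base plus a per-activity term.

    Based on the Stash-style formula discussed above, extended to include
    both commits (FishEye) and reviews (Crucible) per repository.
    """
    return 100 + num_repos * (avg_commits + avg_reviews) / 2500


# The worked example above: 20 repos, 25,000 commits and 20,000 reviews each.
print(stash_style_estimate_mb(20, 25_000, 20_000))  # 460.0
```

Again, this is a guideline rather than a rule; real usage varies with how active the repositories and reviews are.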
I hope this helps.