Hello Git experts,
I want to understand: what is the default value of `merge.renameLimit` for a Bitbucket repository?
The algorithm for detecting renamed files is O(N^2): the more files Git has to compare, the longer it takes. We hit a merge slowness issue after some repository refactoring (moving many files to another repository), and it may be due to a very large `merge.renameLimit` setting on Bitbucket. I would therefore like to understand this setting better.
Bitbucket Server uses Git's default. `merge.renameLimit` falls back to `diff.renameLimit`. The documentation doesn't specify a single number because the default differs by command: when running `git diff`, the limit defaults to 400 (Git 2.13.0 source), and when running `git merge` it's 1000 (Git 2.13.0 source).
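One way to confirm this on your own repository: if neither setting appears in `git config`, Git is using its compiled-in defaults. A minimal sketch (the temporary-repository setup is just for illustration; run the two `git config --get` lines in your real repository instead):

```shell
set -e
repo=$(mktemp -d)
git init -q "$repo"
cd "$repo"
# A fresh repository has nothing configured, so both lookups fail
# and the built-in defaults (400 for diff, 1000 for merge) apply.
git config --get merge.renameLimit || echo "merge.renameLimit: unset (built-in default applies)"
git config --get diff.renameLimit  || echo "diff.renameLimit: unset (built-in default applies)"
```

Empty output from `git config --get` (exit code 1) means the key is unset; any printed number is an explicit override that takes precedence over the defaults above.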
Prior to 4.0, we renamed thousands of files in the Bitbucket Server codebase as part of the rebrand from Stash. Since then, we've also done some fairly major reshuffling of our UI code, which also renamed thousands of files. We've increased our rename limit from the default 400/1000 to 5000 for both, and haven't noticed a significant impact on performance. That's not to say it couldn't be the culprit in your case, though.
You say you "moved many files to another repository". That implies they're deleted from one repository and added to another. The rename detection in Git compares the similarity of removed and added files. That means if you remove a huge number of files, but don't add any, there aren't any rename candidates. Similarly, if you add files but don't remove any, there aren't any rename candidates either. (That's the N^2 portion; Git has to compare every removed file against every added file to see if they meet the similarity threshold, 50% by default, to be considered a rename.)
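To make that concrete, here's a small throwaway-repository demo of how Git pairs a removal with an addition as a rename when the contents are similar enough. `-M50%` spells out the default similarity threshold explicitly; the file and commit names are made up for the example:

```shell
set -e
repo=$(mktemp -d)
git init -q "$repo"
cd "$repo"
git config user.email "dev@example.com"
git config user.name "Dev"
# Commit a file, then rename it (a delete of old.txt plus an add
# of new.txt with identical content, i.e. 100% similarity).
seq 1 100 > old.txt
git add old.txt
git commit -qm "add old.txt"
git mv old.txt new.txt
git commit -qm "rename old.txt to new.txt"
# With rename detection at the default 50% threshold, Git reports
# a single rename instead of a delete plus an add.
git diff --stat -M50% HEAD~1 HEAD
```

The `--stat` output shows `old.txt => new.txt` rather than two separate entries. If you had only deleted files (moved them away without adding anything back), there would be no added files to pair against, so rename detection has no work to do.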
Given you should know the commit hashes involved in the merge, you should be able to re-run the `git merge` command yourself, time it, and apply different rename limits to see whether they affect performance. Depending on the operations, though, if you're simply doing a very large merge, it's entirely possible renames were not the cause; big merges take time.
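The key trick for that experiment is `git -c merge.renameLimit=<N> merge ...`, which overrides the limit for a single command without persisting anything. A runnable sketch against a synthetic repository (on your side you'd run just the `time git -c ...` line in a throwaway clone, merging your real branches):

```shell
set -e
repo=$(mktemp -d)
git init -q "$repo"
cd "$repo"
git config user.email "dev@example.com"
git config user.name "Dev"
# Build two diverging branches so there's something to merge.
echo base > a.txt
git add a.txt && git commit -qm "base"
git checkout -qb feature
echo feature > f.txt
git add f.txt && git commit -qm "feature work"
git checkout -q -
echo main > m.txt
git add m.txt && git commit -qm "mainline work"
# Time the merge under a one-off rename limit; -c applies the
# setting to this invocation only, so no config is changed.
time git -c merge.renameLimit=5000 merge -q --no-edit feature
```

Repeat with different `merge.renameLimit` values (or `0` to let Git pick) and compare the timings; if they barely move, rename detection likely isn't your bottleneck.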
Another culprit might be your Git version. Do you happen to know what version of Git is installed on the server?
There's no way to do it via the UI. Someone with OS-level access to the server has to adjust the repository's configuration directly (e.g. `git config --int diff.renameLimit 5000`). `merge.renameLimit` falls back to `diff.renameLimit` when the former is unset and the latter is set (as shown in the link to the Git 2.13 source in my answer), so adjusting that one setting raises both for us.
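As a sketch of what that looks like on the server: Bitbucket's managed repositories on disk are bare, so the commands below use a throwaway bare repository as a stand-in. On a real instance you would `cd` into the repository's on-disk directory (the exact path depends on your installation) and run only the two `git config` lines:

```shell
set -e
repo=$(mktemp -d)
git init -q --bare "$repo"
cd "$repo"
# Raise the diff rename limit; with merge.renameLimit left unset,
# merges inherit this value too.
git config --int diff.renameLimit 5000
# Read it back to verify the change took effect.
git config --get diff.renameLimit
```

Because this writes to the repository's own config file, it affects only that repository, and only server-side operations that Bitbucket performs against it.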
Note that, in general, we recommend avoiding custom configuration (which is part of why we don't have a "simpler" approach for applying this type of change), because the impacts can be harder to predict. You'll want to monitor performance on your instance if you adjust the limits. We haven't seen any impact from this change, but it's also only a moderately sized repository (~250MB) with 20-30 developers using it, on a system with otherwise very low usage; it's a dogfooding server, and only the Bitbucket Server development team really uses it.