There are several versions of this question in this forum, and they all end up pointing to the same KB page here.
In our case, we have a specific set of files that we want to keep in the repo so that all team members get the same files, but once a new version is available, we never need to go back to the previous one.
Is there a simple way to do this automatically, or with a rule-based system, so we can keep the repo size in check, please?
Thank you.
Hi Anthony,
The purpose of a DVCS like Git, which Bitbucket Cloud hosts, is to preserve file history so that you can track changes and roll files back to previous states via their commit history. Therefore, the feature you have suggested is not present in the Bitbucket Cloud UI.
Unfortunately, converting and deleting files using the BFG tool is a manual process (although you could potentially automate this by writing a bash script, as sketched below), and a gc would still need to be performed on the backend to reduce the reported size (either automatically once the garbage threshold is reached, or manually via a support request).
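As a rough illustration only, here is a minimal sketch of what such a script could look like, following the standard BFG workflow. It assumes BFG is available locally as bfg.jar; the repository URL and the *.zip file pattern are placeholders you would replace with your own:

```bash
#!/usr/bin/env bash
# Hypothetical sketch: strip all historical versions of matching files
# with BFG, then prune the unreferenced objects locally.
# REPO_URL and the '*.zip' pattern are placeholders - adjust for your setup.
set -euo pipefail

REPO_URL="git@bitbucket.org:myteam/myrepo.git"

# BFG operates on a bare mirror clone when rewriting history.
git clone --mirror "$REPO_URL" repo.git

# Delete every historical version of the matching files.
# By default BFG protects whatever is in HEAD, so the current
# version your team uses is left untouched.
java -jar bfg.jar --delete-files '*.zip' repo.git

# Expire the reflog and garbage-collect the now-unreferenced blobs.
cd repo.git
git reflog expire --expire=now --all
git gc --prune=now --aggressive

# Push the rewritten history. Note: every team member must re-clone
# afterwards, and a gc on Bitbucket's backend may still be needed
# before the repository size shown in the UI shrinks.
git push
```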
In general, it is recommended that large binary files are not stored in the commit history at all, and that they are instead tracked in Git LFS storage (which sits outside the main repository, so they do not add to the repository size every time a new commit is created). This is why the documentation linked above is often suggested.
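For example, switching a file type over to LFS for future commits only takes a few commands (again, the *.zip pattern here is just a placeholder; note this does not rewrite files already in your history):

```bash
# One-time setup per machine (requires the git-lfs extension).
git lfs install

# Track the binary file type via .gitattributes; from now on,
# commits of matching files store a small pointer in the repo
# and push the actual content to LFS storage instead.
git lfs track "*.zip"
git add .gitattributes
git commit -m "Track zip files with Git LFS"
```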
Please let me know if you need further information.
Cheers!
- Ben (Bitbucket Cloud Support)
Thank you!