There are several versions of this question in this forum and they all end up pointing to the same page in the KB here.
In our case, we have a specific set of files that we keep in the repo so all team members get the same files. Once a new version is available, though, we never need to go back to the previous one.
Is there a simple way to do this automatically, or with a rule-based system, so we can keep the repo size in check?
The purpose of a DVCS such as Git, which Bitbucket Cloud hosts, is to preserve file history so you can track changes and roll files back to previous states via their commit history. For that reason, the feature you have suggested is not present in the Bitbucket Cloud UI.
Unfortunately, converting and deleting files with the BFG tool is a manual process (although you could potentially automate it by writing a bash script), and a `git gc` would still need to be performed on the backend to reduce the size (either automatically once the garbage threshold is reached, or manually via a support request).
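To illustrate what such a script could look like, here is a minimal sketch that sidesteps BFG entirely: since old versions are never needed, the repository can periodically be collapsed to a single commit holding only the current files. All branch, file, and identity names below are hypothetical examples, and force-pushing the result would rewrite history for every clone, so treat this as a sketch rather than a recommended workflow.

```shell
# Sketch: collapse a repo's history to one commit (assumes old versions
# are never needed). Runs in a throwaway scratch repo for demonstration.
set -e
tmp=$(mktemp -d)
cd "$tmp"
git init -q repo
cd repo
git config user.email "dev@example.com"   # local identity, example only
git config user.name "Example Dev"

# Simulate two "versions" of a tracked file
printf 'v1' > assets.bin && git add -A && git commit -qm "v1"
printf 'v2' > assets.bin && git add -A && git commit -qm "v2"

# Re-create the branch from an orphan so only the current snapshot survives
git checkout -q --orphan squashed
git commit -qm "current snapshot"
git branch -D master 2>/dev/null || git branch -D main 2>/dev/null || true
git branch -m master

# Drop the now-unreachable old commits locally
git gc -q --aggressive --prune=now
git rev-list --count HEAD
```

On a hosted repo you would still need a force-push afterwards, and (as noted above) Bitbucket Cloud's backend gc to actually reclaim the space.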
In general, it is recommended that large binary files are not stored in the commit history at all, and that they are instead tracked in Git LFS storage (which sits outside the main repository and does not add to its size with every new commit). This is why the linked documentation is so often suggested.
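For reference, tracking files with Git LFS comes down to a few `.gitattributes` lines like the following (the file patterns here are examples only; `git lfs track "*.bin"` generates equivalent entries for you):

```
# .gitattributes — route large binaries through Git LFS instead of
# storing every revision in the commit history
*.bin filter=lfs diff=lfs merge=lfs -text
*.psd filter=lfs diff=lfs merge=lfs -text
```

This assumes `git lfs install` has been run once per machine so the LFS filters are configured.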
Please let me know if you need further information.
- Ben (Bitbucket Cloud Support)