
Revert Bitbucket repository state to one day prior

We have a Bitbucket repository that started showing warning messages about its size exceeding 2 GB.

So I followed the procedure described in the following documentation:

I have followed all the steps in that documentation.

But somehow the repository size has now grown to 5.76 GB; it was 2.03 GB yesterday, before I followed the steps in the documentation.

Can someone please help me revert the repository to its state one day prior? We are unable to push any new changes to the repository, which is greatly impacting our project.

Kindly help.

Thank you.

2 answers

0 votes

Hi @Suhas Shinde,

I checked in our system the repositories you have access to, and I found one repo over the 4 GB limit. I ran a git gc on this repo, and its size is 2.0 GB now.

Please feel free to let me know if you are able to push your changes now.

Are you planning to reduce your repo's size further than that?

Please note that if you use either BFG or git filter-branch to remove files from the repo's history and push your changes, its size will most likely go up again. Another git gc will be needed from our side, for the remote repo's size to get reduced. Feel free to let me know if you'd like guidance on how to reduce the repo's size.

Kind regards,

Hello @Theodora Boudale

Thank you so much!

Yes, I'm now able to push new changes to the repository.

And yes, we want to further reduce the repo's size if it's possible.

Could you kindly let me know how exactly we can further reduce the repo's size: what exact steps need to be followed, whether it will impact the repo's performance, and any particular precautions we need to take before proceeding.

Kindly share all associated details so that I can obtain prior permission from my seniors before proceeding.

Thanks again for your help.

Hi @Suhas Shinde,

1. The first step would be to identify if you have any large and/or binary files in the repo.

You can use the following command in a clone of the repo to see its files and their respective size:

git rev-list --objects --all \
| git cat-file --batch-check='%(objecttype) %(objectname) %(objectsize) %(rest)' \
| sed -n 's/^blob //p' \
| sort --numeric-sort --key=2 \
| cut -c 1-12,41- \
| $(command -v gnumfmt || echo numfmt) --field=2 --to=iec-i --suffix=B --padding=7 --round=nearest
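If you only want the handful of largest blobs rather than the full listing, the same pipeline can be truncated with tail (a minimal variation of the command above; the human-readable numfmt step is dropped here, so sizes are shown in bytes):

```shell
# List only the ten largest blobs in the repository's history (sizes in bytes)
git rev-list --objects --all \
  | git cat-file --batch-check='%(objecttype) %(objectname) %(objectsize) %(rest)' \
  | sed -n 's/^blob //p' \
  | sort --numeric-sort --key=2 \
  | tail -n 10
```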

2. The next step is to find out if you need to version any large/binary files in the repo (track any changes with Git) or if these files don't need to be versioned.

  • If you don't need to version these files, you can remove them from the repo's history with BFG.
  • If you need to version them, you may want to consider tracking them with LFS.

I am sharing more details about both options below:

2.a. Removing files from repo's history

You can find step-by-step instructions for BFG on the following page:

The example in the Usage section deletes blobs over 100M. However, if you scroll down to the Examples section, you can see that you can alternatively specify which files to delete. E.g. if you want to remove a file named largefile.war, you would run

java -jar bfg-1.14.0.jar --delete-files largefile.war my-repo.git

where my-repo.git is the mirror clone of your repo (replace with your repo's name)
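Putting the pieces together, a typical end-to-end BFG run might look like the sketch below. This is a hedged outline, not official guidance: the clone URL, workspace/repo names, and jar filename are placeholders, and the final git push overwrites the remote history, so only run it after taking a backup.

```shell
# Sketch of a full BFG cleanup, assuming bfg-1.14.0.jar is already downloaded.
# Replace the clone URL and repo name with your own.
git clone --mirror https://bitbucket.org/myworkspace/my-repo.git

# Rewrite history, dropping largefile.war from every commit
java -jar bfg-1.14.0.jar --delete-files largefile.war my-repo.git

# Expire old references and repack locally so the deleted blobs are gone
cd my-repo.git
git reflog expire --expire=now --all
git gc --prune=now --aggressive

# Push the rewritten history back (this overwrites the remote history)
git push
```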

2.b. Using Git LFS for any large/binary files

If you need to version any large/binary files, you can consider migrating them to Git LFS.
Clone and fetch times will also improve if you use Git LFS for large files.

We have the following guide on how to migrate existing large files in a repo to Git LFS:
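While the guide above covers the full procedure, a minimal hedged illustration of such a migration is below (the *.war pattern is only an example; git-lfs must be installed, and the forced pushes overwrite remote history):

```shell
# Rewrite all history so *.war files are tracked by Git LFS
# (the *.war pattern is an example; adjust for your repo)
git lfs migrate import --include="*.war" --everything

# History has been rewritten, so the push must overwrite the remote branches
git push --force-with-lease --all
git push --force-with-lease --tags
```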

  • Workspaces on the Free billing plan have 1 GB of Git LFS storage in total (for all repos in that workspace)
  • Workspaces on the Standard billing plan have 5 GB of Git LFS storage in total (for all repos in that workspace)
  • Workspaces on the Premium billing plan have 10 GB of Git LFS storage in total (for all repos in that workspace)

If the workspace that owns your repo is on a paid billing plan and you push Git LFS files beyond the limit, then we will automatically add 100 GB of Git LFS to the workspace for $10 per month.

This is something you may need to discuss with your seniors and admin of the workspace, IF you want to use Git LFS and IF you think you'll need more than the LFS storage provided with the workspace.

Other things to take into account:

  • You can check the size of the repo before and after using the following command:
git count-objects -Hv
  • Always take a backup of the repo before you proceed with any of the changes I suggested above, in case you accidentally delete something you didn't want to and want to recover the repo to the prior state. You can take a backup by cloning with the --mirror flag:
git clone --mirror repo_url
  • Please note that both options I mentioned above (2.a and 2.b) involve a history rewrite. This means that the commit hashes of the repository will change. It's good to communicate this to your team so there are no surprises, and also so that they take a fresh clone of the repo after you push your changes (and avoid pushing the old history back).

  • If you have an integration with Jira Cloud, and you reference Jira issue keys in Bitbucket commit messages or PRs:

    When you rewrite history, the commit hashes will change. Once you push your changes, Jira issues will reference the new commits; you don't need to take any action.

    However, the Jira issues will also continue to reference the old commits (from before the history rewrite), because those commits are indexed in the Jira database. If you don't want that, you can raise a support request with the Jira team to ask whether the indexed commits can be deleted.

  • If you proceed with either of the options (2.a or 2.b), a git gc will be needed on the Bitbucket repo after you push, for the old references to get removed and for the repo's size to get reduced. Since the repo is owned by a workspace on a paid billing plan, I would suggest creating a ticket with the Bitbucket Cloud support team to run a git gc after you push your changes, to ensure a faster response.

    When you create the ticket, in "What can we help you with" select "Technical issues and bugs" and then Bitbucket Cloud as the product. In "What is the impact to your business?" please make sure to select Level 1 if the repo is over the limit and you are blocked from working.
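For a quick before/after comparison in your local clone, the size-pack field of the git count-objects -Hv check mentioned above is the number to watch; it reflects the packed size of all objects:

```shell
# Print just the packed size of the current repository's objects
git count-objects -Hv | grep size-pack
```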

We have a feature request to allow users to initiate a git gc for remote repositories:

You can add yourself as a watcher in that feature request by selecting the Start watching this issue link if you'd like to get notified on updates.

If you have any questions, please feel free to let me know.

Kind regards,

0 votes

Adding @Caroline R from Atlassian Support Team
