Without thinking of the consequences, I added a 15.5 GB file to a repo.
git push
said
Enumerating objects: 24, done.
Counting objects: 100% (24/24), done.
Delta compression using up to 8 threads
Compressing objects: 100% (16/16), done.
Writing objects: 100% (17/17), 13.93 MiB | 9.16 MiB/s, done.
Total 17 (delta 7), reused 0 (delta 0)
error: RPC failed; HTTP 413 curl 22 The requested URL returned error: 413
fatal: the remote end hung up unexpectedly
fatal: the remote end hung up unexpectedly
Everything up-to-date
Based on this advice, I tried:
git config --global http.postBuffer 157286400
I still get:
Enumerating objects: 24, done.
Counting objects: 100% (24/24), done.
Delta compression using up to 8 threads
Compressing objects: 100% (16/16), done.
Writing objects: 100% (17/17), 13.93 MiB | 28.14 MiB/s, done.
Total 17 (delta 7), reused 0 (delta 0)
error: RPC failed; HTTP 413 curl 22 The requested URL returned error: 413
fatal: the remote end hung up unexpectedly
fatal: the remote end hung up unexpectedly
Everything up-to-date
I also tried removing the 15.5 GB file.
Running Bitbucket v7.11.2 in our data center.
Appreciate your advice.
Thanks
Arthur
What command did you use when you tried to remove the file? Just a regular git delete? That only removes the file from that point forward; it does not remove it from the history. If you added the file in your last commit, you can remove it from history like this:
$ git rm --cached PATH-TO-YOUR-LARGE-FILE
$ git commit --amend -CHEAD
$ git push
This removes the cached version, amends the commit, and pushes the rewritten history to your repo.
Be aware that Git does have issues when the file and/or repository gets above 3 GB, so it is recommended to use Git LFS if you want to store large files in your repository.
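For a bit of context on how Git LFS takes over a path: running `git lfs track "*.iso"` just writes a line like this into a .gitattributes file at the repo root (the `*.iso` pattern here is only an example; use whatever matches your large files):

```
# .gitattributes (committed to the repo)
*.iso filter=lfs diff=lfs merge=lfs -text
```

Note that this only affects files added in new commits; anything already committed stays in regular Git history until you rewrite it with BFG or filter-branch.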
Thanks for your response, Mikael. I used git rm.
Not surprisingly, I've made multiple commits since then, so the large file is no longer in my latest commit.
I don't mind using Git LFS; I've enabled it in the Bitbucket repo. But "git push" still gets:
Enumerating objects: 44, done.
Counting objects: 100% (44/44), done.
Delta compression using up to 8 threads
Compressing objects: 100% (32/32), done.
Writing objects: 100% (34/34), 14.08 MiB | 27.78 MiB/s, done.
Total 34 (delta 17), reused 0 (delta 0)
error: RPC failed; HTTP 413 curl 22 The requested URL returned error: 413
fatal: the remote end hung up unexpectedly
fatal: the remote end hung up unexpectedly
Everything up-to-date
Yeah, enabling Git LFS after committing large files will not help unless you first remove those files from history. I have used BFG Repo-Cleaner in the past to clear large files out of a repository before moving them to Git LFS. Atlassian has a really good guide about Git LFS that you can find here.
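If you want to see which blobs are actually bloating the history before running BFG, plain git plumbing can list them by size. Here's a self-contained sketch (it builds a throwaway demo repo so the pipeline has something to chew on; in your case you would run just the final pipeline inside your own clone):

```shell
#!/bin/sh
set -e

# Throwaway demo repo so the example is self-contained
tmp=$(mktemp -d)
cd "$tmp"
git init -q demo
cd demo
git config user.email demo@example.com
git config user.name "Demo User"

# One big file, one small file
head -c 50000 /dev/zero > big.bin
echo small > small.txt
git add .
git commit -qm "initial commit"

# List every blob in history, largest first: size in bytes, then path
git rev-list --objects --all |
  git cat-file --batch-check='%(objecttype) %(objectname) %(objectsize) %(rest)' |
  awk '$1 == "blob" {print $3, $4}' |
  sort -rn
```

In a real repo, the top entries tell you exactly which paths to feed to BFG (or to an --index-filter).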
You can also use Git filter-branch to remove the file. These are the commands for that:
$ git filter-branch --force --index-filter \
    "git rm --cached --ignore-unmatch PATH-TO-YOUR-FILE-WITH-SENSITIVE-DATA" \
    --prune-empty --tag-name-filter cat -- --all
$ git push origin --force --all
This might be the fastest option if it is only one file you want to delete.
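One caveat worth adding: after filter-branch, the large blob still sits in your local .git directory (backup refs and the reflog keep it alive), so the clone stays big until you expire those and repack. A self-contained sketch of the whole flow, run in a throwaway repo so nothing real is rewritten (file names are illustrative):

```shell
#!/bin/sh
set -e

# Throwaway repo so nothing real is touched
tmp=$(mktemp -d)
cd "$tmp"
git init -q demo
cd demo
git config user.email demo@example.com
git config user.name "Demo User"

# Simulate accidentally committing a large file alongside real work
head -c 100000 /dev/zero > big.bin
echo hello > README.md
git add .
git commit -qm "add README and (oops) big.bin"

# Rewrite all history to drop the file (same approach as above)
export FILTER_BRANCH_SQUELCH_WARNING=1   # skip filter-branch's warning pause
git filter-branch --force --index-filter \
    "git rm --cached --ignore-unmatch big.bin" \
    --prune-empty --tag-name-filter cat -- --all

# The blob survives in refs/original and the reflog until purged
rm -rf .git/refs/original
git reflog expire --expire=now --all
git gc --prune=now --aggressive

# Confirm no object in history still references the file
if git rev-list --objects --all | grep -q big.bin; then
  echo "big.bin is still in history"
else
  echo "big.bin is gone from history"
fi
```

Only after this local cleanup does `git push --force --all` actually shrink what gets sent to Bitbucket. And since history was rewritten, everyone else on the repo needs a fresh clone (or a hard reset) afterwards.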
Thanks Mikael! As far as I can tell, that worked perfectly. And this GitHub page reassured me that your git filter-branch approach is well-regarded.
Regards
Arthur