We are starting a game development studio and have about 1 TB of data backups that we would like to add to a repository.
We currently have a Linux (Ubuntu) Bitbucket Server running on a virtual machine, and this seems to be working fine.
When we try to push large commits from Sourcetree, we run into memory errors. Any advice on how to fix this?
For example, we will be working with high-end engines like CryEngine, and Crytek recommends checking the entire engine into source control. That alone is 3–6 GB (thousands of files), not including the large source files we need to handle for artists and other creatives.
The memory errors you are seeing sound like the large files have not been added to Git LFS and still exist in the plain Git repository. Also, if the files were in the repository before you set up Git LFS, did you clean up the repository's history to move those files out of it?
All new to this, sorry. This is a brand-new setup; we are just trying to create the initial repositories from a clean install/directory. I did tick the LFS checkbox in Bitbucket and am using Sourcetree as the interface (where something might not be set properly). All fresh installs of everything! :)
Basically, we have very large backup directories containing engines, source art, video, etc. We want to make all of this accessible to the team.
Okay, so you probably need to add those directories to Git LFS. This command will do that:
git lfs track "<path to backup directory>/**"
This will add the directory and every file under it to Git LFS (the `/**` glob is needed; a bare trailing slash will not match the files inside). In directories with LFS-tracked files you will see a .gitattributes file that records which patterns are tracked by Git LFS; commit it along with your files. The tutorial I linked to above does a good job of describing how to set everything up and how to use it.
Thank you for your help. With it, we have finally understood how to do this correctly. The Sourcetree interface tripped us up for a bit, but by jumping between command-line commands and the Sourcetree visuals we found a process to assign the right files (by extension) to LFS and leave the regular text files in plain Git. We have now successfully pushed 5–6 GB worth of files and are testing a 70 GB directory against Bitbucket, and it seems to be working. Next up: a 1 TB directory :p.