We have a project with about 1,300 files, totalling about 1.61GB, with the largest file at 11MB. About 99% of these files are binary. We want to import this project into Stash/Git. Will a project like this need something special like Git Annex, Git BigFiles, or Git Fat?
Not necessarily. Although your repository is on the larger side, you don't have to start looking into specialised tools just yet if the repository works well for you.
The size of the repository will affect a number of things: how long clones and fetches take for every developer and build agent, how much memory and CPU the server spends repacking objects, and the size of your backups.

Other things to think about: whether the repository will keep growing at the same rate, and whether every user really needs the full history (shallow clones can help on the client side).
In general I wouldn't consider files up to 11MB too large to manage with Git. It gets difficult with multiples of that, though (e.g. over 50MB it becomes worth looking into alternatives, IMHO).
Have you tried importing it to see how it impacts your usage and the instance?
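If you do run a trial import, Git itself can tell you how much disk the repository takes once packed. A minimal sketch (it builds a throwaway repo purely for illustration; in practice you'd run just the last two commands inside your imported clone):

```shell
#!/bin/sh
# Sketch: measure the on-disk size of a repository after a trial import.
# The temp repo and file below are stand-ins; use your real clone instead.
set -e
repo=$(mktemp -d); cd "$repo"
git init -q .
git config user.email "you@example.com"
git config user.name "You"
head -c 102400 /dev/urandom > blob.bin   # fake 100 KB binary asset
git add . && git commit -qm "trial import"
git gc -q                # repack first so the numbers below are accurate
git count-objects -vH    # size-pack: total size of the packfiles
```

The `size-pack` line is the closest thing to "how big is this repo, really" after Git has finished compressing it.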
One important thing to consider is how often these binary files change. Every time a binary changes, the repo size goes up by almost the full file size (since the diff is almost the entire file).
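You can watch this growth happen. The sketch below (all filenames are examples; random data stands in for a compressed binary, which likewise does not delta-compress well) commits two versions of a 1 MB file and prints the repository size after each:

```shell
#!/bin/sh
# Sketch: show how each new version of a binary adds roughly its full
# size to the repository, because the old version stays in history.
set -e
repo=$(mktemp -d); cd "$repo"
git init -q .
git config user.email "you@example.com"
git config user.name "You"
# Simulate a 1 MB binary asset (random bytes do not delta-compress)
head -c 1048576 /dev/urandom > asset.bin
git add asset.bin && git commit -qm "v1"
git gc -q
size1=$(du -sk .git | cut -f1)
# "Change" the binary: compressed/encrypted formats look like new data
head -c 1048576 /dev/urandom > asset.bin
git commit -qam "v2"
git gc -q
size2=$(du -sk .git | cut -f1)
echo "after v1: ${size1} KB, after v2: ${size2} KB"
```

After the second commit the repository is roughly twice as big, because v1's blob is still reachable from history.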
=> If they change often, you might want to consider putting the binaries into a separate repo where you can (more easily) wipe (some of) the history to reduce the repo size. (This can be mapped as a submodule into the main repo, if necessary.)
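A rough sketch of that submodule setup, using local throwaway repos in place of real Stash URLs (the `binaries`/`assets` names are made up for the example):

```shell
#!/bin/sh
# Sketch: keep large binaries in their own repo and map it into the
# main repo as a submodule. All paths/names here are illustrative.
set -e
work=$(mktemp -d); cd "$work"
git init -q binaries && cd binaries
git config user.email "you@example.com"; git config user.name "You"
echo placeholder > asset.bin            # stands in for the real binaries
git add . && git commit -qm "binary assets"
cd "$work"
git init -q main && cd main
git config user.email "you@example.com"; git config user.name "You"
echo readme > README.md
git add . && git commit -qm "initial"
# protocol.file.allow is only needed because this demo uses file paths;
# with a real hosted URL you would just run `git submodule add <url> assets`
git -c protocol.file.allow=always submodule add "$work/binaries" assets
git commit -qm "Track binary assets as a submodule"
git submodule status
```

Collaborators would then pick the submodule up with `git clone --recurse-submodules`. Wiping history in the binaries repo later only forces a submodule-pointer update in the main repo, not a rewrite of it.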