I have enabled LFS with Git and Bitbucket. I am now able to push a ZIP file larger than 100 MB, which is fantastic, but I am a little worried I will bloat Bitbucket over time. The ZIP file is an export of an application component that is deployed to a workflow platform similar to Flowable.
The question I have is the following:
If the process flow above is problematic for Bitbucket, I am considering writing a script that uses the REST API to export the application component model from the workflow system, unzips the file (it is all text, in JSON and XML), and pushes the unzipped model to Bitbucket. Each ZIP file may contain hundreds of JSON and XML text files that describe the application parts. The drawbacks of this approach are that it would take a long time, and that many of the changes between exports are unrelated to the programmatic features of the model: they merely update the metadata of each part within the component, which is irrelevant to the actual application change (implementation details). I once ran an experiment where I made the application change directly in the unzipped files, zipped them back up, and deployed the model to the workflow platform, and everything worked fine.
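To make the idea concrete, here is a rough sketch of the script I have in mind, in Python. The export endpoint URL, the component ID, and the NOISE_KEYS list of metadata fields are all placeholders I made up; the real values depend entirely on the workflow platform's actual REST API, so treat this as an outline of the approach rather than a working integration.

```python
import io
import json
import subprocess
import zipfile
from pathlib import Path

import requests

# Placeholders -- the real export endpoint, auth, and metadata field
# names depend on the workflow platform's REST API.
EXPORT_URL = "https://workflow.example.com/api/components/{id}/export"
COMPONENT_ID = "my-component"
REPO_DIR = Path("component-repo")

# Hypothetical per-part metadata keys that change on every export but do
# not affect application behavior; stripping them keeps diffs meaningful.
NOISE_KEYS = {"lastExported", "exportTimestamp", "revisionId"}


def export_component(component_id: str) -> bytes:
    """Download the component export ZIP via the (assumed) REST API."""
    resp = requests.get(EXPORT_URL.format(id=component_id), timeout=60)
    resp.raise_for_status()
    return resp.content


def strip_noise(path: Path) -> None:
    """Drop top-level noise keys from a JSON part file, if present."""
    data = json.loads(path.read_text(encoding="utf-8"))
    if isinstance(data, dict):
        for key in NOISE_KEYS:
            data.pop(key, None)
    # Stable formatting so unchanged parts produce no diff at all.
    path.write_text(json.dumps(data, indent=2, sort_keys=True) + "\n",
                    encoding="utf-8")


def main() -> None:
    zip_bytes = export_component(COMPONENT_ID)

    # Unzip into the working tree; the archive is all JSON/XML text.
    # (A real version would clear old files first so deletions of parts
    # are captured as well.)
    with zipfile.ZipFile(io.BytesIO(zip_bytes)) as zf:
        zf.extractall(REPO_DIR)

    # Normalize the JSON parts so metadata-only churn does not show up.
    for json_file in REPO_DIR.rglob("*.json"):
        strip_noise(json_file)

    # Stage everything; git itself detects which files actually changed.
    subprocess.run(["git", "-C", str(REPO_DIR), "add", "-A"], check=True)

    # Commit only if something real is staged.
    staged = subprocess.run(
        ["git", "-C", str(REPO_DIR), "diff", "--cached", "--quiet"])
    if staged.returncode != 0:
        subprocess.run(["git", "-C", str(REPO_DIR), "commit",
                        "-m", f"Export {COMPONENT_ID}"], check=True)


if __name__ == "__main__":
    main()
```

The point of the strip_noise step plus the stable JSON formatting is that Git would then only record genuine programmatic changes, which addresses my concern about the flood of irrelevant metadata-only diffs.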
I appreciate your feedback.
Beginning on April 4th, we will be implementing push limits: any push over 3.5 GB cannot be completed and will fail...