New to the community.
I have a folder containing a number of files that take up a lot of space.
These files are changed and pushed every day as part of the development work.
Is it a good idea to zip these files before I push them to the external repo?
And is zipping safe for the files' data?
Also, when I pull the external repo to my machine, will the zipped files be in the same state as when they were pushed?
Hey @Ashalina Z
I recommend you have a look at the Git LFS feature: https://support.atlassian.com/bitbucket-cloud/docs/use-git-lfs-with-bitbucket/
Whether zipping helps depends on the file types and the algorithm used: text files can often be reduced significantly in size, while binaries usually can't.
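You can see this difference for yourself with a quick experiment. A minimal sketch, assuming gzip is available (zip behaves similarly for these cases); the file names are placeholders:

```shell
# Generate 1 MB of highly repetitive text and 1 MB of random binary data
yes "some repeated log line" | head -c 1000000 > text.dat
head -c 1000000 /dev/urandom > binary.dat

# Compress both, keeping the originals for comparison
gzip -k text.dat binary.dat

# Compare the compressed sizes: the text file shrinks dramatically,
# while the random binary data barely shrinks at all
ls -l text.dat.gz binary.dat.gz
```

Already-compressed formats (images, video, archives) behave like the random data here, which is why zipping them before a push rarely saves space.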
Zipping is usually safe, but you should still check all your environments: dev, CI, CD, etc. Some environments might have different zip clients/versions, which can affect how the files are zipped and unzipped.
My personal recommendation: don't push large files to your repo without using Git LFS. Git keeps these files in the history even after you delete them, which will quickly inflate your repo size and cause issues when cloning the repo. If you don't want to work with Git LFS, consider using a DB or uploading the files to a binary repository.