Would anyone know how I can upload only the changed files to AWS S3 using:
At the moment it seems to upload every file, so the pipeline takes much longer to run than it needs to.
Hi @davidstanton ,
Thank you for the question!
The aws-s3-deploy pipe uses the `aws s3 sync` command.
It syncs directories and S3 prefixes. Recursively copies new and updated files from the source local directory to the destination. Only creates folders in the destination if they contain one or more files.
If files are already stored in your S3 bucket, the pipe uploads only the changed files.
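Roughly speaking, the pipe wraps a sync call like the one below; the source directory and bucket name here are placeholders, not values taken from this thread:

```shell
# Sketch of what the aws-s3-deploy pipe runs under the hood:
# recursively copy new and updated files from the local directory
# to the S3 destination, skipping files that already match.
aws s3 sync ./build s3://my-bucket
```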
Nope, this still doesn't work as expected: it still uploads all the files rather than syncing only the changed ones, so in effect it's worse than using EXTRA_ARGS: '--size-only', which sometimes skips a file whose content changed but whose size didn't (adding a few blank lines at the bottom of a file sort of gets around this, but it isn't really ideal).
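One way to see why everything gets re-uploaded: `aws s3 sync` compares file size and last-modified time, and a fresh CI clone resets every file's mtime to checkout time, so every file can look newer than its S3 copy. The `--dryrun` flag (a real AWS CLI option; the paths below are placeholders) shows what sync considers "changed" without uploading anything:

```shell
# List what sync would upload, without actually transferring anything.
aws s3 sync ./build s3://my-bucket --dryrun

# Compare against size-only matching, which ignores timestamps.
aws s3 sync ./build s3://my-bucket --size-only --dryrun
```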
There is another issue here as well. If the pipe claims to sync only changed files and also suggests using cache invalidation alongside it, then it should invalidate the cache only for the changed files that were uploaded; new files simply get cached in time anyway. Invalidating "All", which it currently does, isn't a problem in itself, but the subsequent pulls CloudFront then makes are: it pulls all files back into the cache, which costs more in CloudFront charges. There's no point invalidating files that don't need to be, when this latest 0.4.3 pipe claims to sync only updated files.
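For anyone wanting the targeted behaviour described above today, the AWS CLI supports invalidating specific paths instead of everything; the distribution ID and paths below are placeholders:

```shell
# Invalidate only the objects that actually changed, rather than "/*".
aws cloudfront create-invalidation \
  --distribution-id E1234EXAMPLE \
  --paths "/index.html" "/css/app.css"
```

Note that CloudFront bills per invalidation path once you exceed the free tier, so for very large changesets a single "/*" can still be the cheaper invalidation call, even though it triggers more origin pulls afterwards.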