only upload changed files to AWS S3?



Would anyone know how I can upload only the changed files to AWS S3 using the aws-s3-deploy pipe?


At the moment it seems to upload every file, so the pipeline takes much longer to run than it needs to.
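For reference, a minimal bitbucket-pipelines.yml using the pipe might look like this (the pipe version, region, bucket name, and local path are illustrative; the AWS credentials are assumed to be set as repository variables):

```yaml
pipelines:
  default:
    - step:
        name: Deploy to S3
        script:
          - pipe: atlassian/aws-s3-deploy:0.4.3
            variables:
              AWS_ACCESS_KEY_ID: $AWS_ACCESS_KEY_ID
              AWS_SECRET_ACCESS_KEY: $AWS_SECRET_ACCESS_KEY
              AWS_DEFAULT_REGION: 'us-east-1'
              S3_BUCKET: 'my-bucket-name'
              LOCAL_PATH: 'build'
```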


1 answer


Hi @davidstanton,

Thank you for the question!

The aws-s3-deploy pipe uses the aws s3 sync command under the hood.

It syncs directories and S3 prefixes, recursively copying new and updated files from the local source directory to the destination. Folders are only created in the destination if they contain one or more files.

If files are already stored in your S3 bucket, the pipe uploads only the changed files.


Is this an update then? Version 0.4.1 didn't upload only the changed files. I was using EXTRA_ARGS: '--size-only' to try to get around it, but that only worked half the time, since some files don't change size even after I've made changes in them.
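For context, the workaround described above corresponds to a pipe step like the following (the values other than EXTRA_ARGS are illustrative):

```yaml
- pipe: atlassian/aws-s3-deploy:0.4.1
  variables:
    AWS_ACCESS_KEY_ID: $AWS_ACCESS_KEY_ID
    AWS_SECRET_ACCESS_KEY: $AWS_SECRET_ACCESS_KEY
    AWS_DEFAULT_REGION: 'us-east-1'
    S3_BUCKET: 'my-bucket-name'
    LOCAL_PATH: 'build'
    EXTRA_ARGS: '--size-only'
```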

Nope, this still doesn't work as expected. It still uploads all the files rather than syncing only the changed ones, so in effect it's worse than using EXTRA_ARGS: '--size-only', which sometimes fails to upload a changed file if its size didn't change. (Adding a few blank lines at the bottom of a file sort of gets around that, but it ain't really ideal.)

There is another issue on top of this. If the pipe claims to sync only changed files, and also suggests using cache invalidation alongside it, then it should only invalidate the cache for the changed files that were uploaded; new files just get cached in time anyway. Invalidating "All", which it currently does, isn't a problem in itself, but the subsequent pulls CloudFront then makes are: it will pull all the files back into the cache, which costs more money in CloudFront charges. There's no point invalidating files that don't need to be, when this latest 0.4.3 pipe "claims" to sync only updated files.
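If the set of changed paths were known, the invalidation could be narrowed via the aws-cloudfront-invalidate pipe's PATHS variable instead of its default "/*". A sketch, assuming that pipe is used; the pipe version, distribution ID, and paths are illustrative:

```yaml
- pipe: atlassian/aws-cloudfront-invalidate:0.1.1
  variables:
    AWS_ACCESS_KEY_ID: $AWS_ACCESS_KEY_ID
    AWS_SECRET_ACCESS_KEY: $AWS_SECRET_ACCESS_KEY
    AWS_DEFAULT_REGION: 'us-east-1'
    DISTRIBUTION_ID: 'E1234EXAMPLE'
    PATHS: '/index.html /assets/app.js'
```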
