How do I push only the changed files to S3?

When files are edited or changed in a Bitbucket repository, how can I push only those changes to S3? Is there a way or a pipeline to do so?

1 answer

Mike Green, hi. This is a good case for the aws-s3-deploy pipe, which under the hood runs the aws s3 sync command.

 

Try this in your pipeline configuration:

script:
  - pipe: atlassian/aws-s3-deploy:1.1.0
    variables:
      # Set these as secured repository variables in your workspace/repo settings.
      AWS_ACCESS_KEY_ID: $AWS_ACCESS_KEY_ID
      AWS_SECRET_ACCESS_KEY: $AWS_SECRET_ACCESS_KEY
      AWS_DEFAULT_REGION: 'us-east-1'
      S3_BUCKET: 'my-bucket-name'   # replace with your bucket
      LOCAL_PATH: 'build'           # directory to upload

Cheers.

This pipe pushes all the files to the S3 bucket, but we want to push only the changed files. Is there a way to do that?

According to the docs I provided previously, aws s3 sync recursively copies new and updated files from the source directory to the destination.

The pipe's logic:

run aws s3 sync ${LOCAL_PATH} s3://${S3_BUCKET}/ ${ARGS_STRING[@]} ${AWS_DEBUG_ARGS}
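
If you need full control over the sync flags, you can also run the equivalent command yourself in a script step instead of using the pipe. A minimal sketch, assuming an image with the AWS CLI available (for example atlassian/pipelines-awscli) and the same placeholder bucket and path as above:

- step:
    name: Sync to S3 with the raw CLI
    image: atlassian/pipelines-awscli
    script:
      # AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY and AWS_DEFAULT_REGION
      # are picked up from repository variables by the CLI itself.
      - aws s3 sync build s3://my-bucket-name/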

That may be true, but when using it in Bitbucket Pipelines, the files don't retain their date/time from the code repo. Each file always gets the current date/time (git checks files out fresh on clone, so their modification time is the clone time), and so aws s3 sync will upload *all* the files, because every file has a newer timestamp than what's on S3. This is easily tested by running "ls -laR" in the pipeline and running the pipeline more than once, a few minutes apart.
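
One workaround, if timestamps can't be trusted, is to make sync ignore them: aws s3 sync accepts a --size-only flag that compares only file sizes. Assuming your version of the pipe supports the EXTRA_ARGS variable (check the pipe's README), a sketch:

script:
  - pipe: atlassian/aws-s3-deploy:1.1.0
    variables:
      AWS_ACCESS_KEY_ID: $AWS_ACCESS_KEY_ID
      AWS_SECRET_ACCESS_KEY: $AWS_SECRET_ACCESS_KEY
      AWS_DEFAULT_REGION: 'us-east-1'
      S3_BUCKET: 'my-bucket-name'
      LOCAL_PATH: 'build'
      # Trade-off: an edit that leaves a file's size unchanged
      # will NOT be uploaded when comparing by size only.
      EXTRA_ARGS: '--size-only'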

 

So what we need is a way for Bitbucket Pipelines to preserve the files' date/time, instead of always reflecting the current date/time.
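
One way to approximate that is to reset each tracked file's modification time to its last commit time before running the sync. A sketch, assuming the build image has bash and GNU touch (which accepts an epoch timestamp via the @ prefix), and that the synced directory contains git-tracked files rather than generated build artifacts. With Pipelines' default shallow clone you may need clone: depth: full, and the per-file git log makes this slow on large repositories:

clone:
  depth: full    # per-file commit times need history

pipelines:
  default:
    - step:
        name: Restore commit times, then sync
        script:
          # Set each tracked file's mtime to its last commit time.
          - |
            git ls-files -z | while IFS= read -r -d '' f; do
              touch -d "@$(git log -1 --format=%ct -- "$f")" "$f"
            done
          - pipe: atlassian/aws-s3-deploy:1.1.0
            variables:
              AWS_ACCESS_KEY_ID: $AWS_ACCESS_KEY_ID
              AWS_SECRET_ACCESS_KEY: $AWS_SECRET_ACCESS_KEY
              AWS_DEFAULT_REGION: 'us-east-1'
              S3_BUCKET: 'my-bucket-name'
              LOCAL_PATH: 'build'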
