Bitbucket Pipelines how to copy full repo to S3

I'm using aws-s3-deploy to set up a pipeline for my Bitbucket repo. I want to configure the pipe's LOCAL_PATH variable to copy everything in my repo to S3. I have several folders at the top level of my repo, and I want to copy that full top level, preserving the top-level folders and their hierarchy. My bitbucket-pipelines.yml file is set up as follows:

image: python:3.5.1
pipelines:
  branches:
    master:
      - step:
          script:
            - pipe: atlassian/aws-s3-deploy:0.2.1
              variables:
                AWS_ACCESS_KEY_ID: $AWS_ACCESS_KEY_ID
                AWS_SECRET_ACCESS_KEY: $AWS_SECRET_ACCESS_KEY
                AWS_DEFAULT_REGION: "us-east-1"
                S3_BUCKET: $s3_bucket
                LOCAL_PATH: <I want this to be everything in repo>
                #DELETE_FLAG: "true"

1 answer

1 vote
Philip Hodder
Atlassian Team
Mar 03, 2019

Hi Annelise,

Pipelines clones the entire contents of your repository for each build, so you can use the BITBUCKET_CLONE_DIR variable to point the pipe at the files to deploy.

image: python:3.5.1
pipelines:
  branches:
    master:
      - step:
          script:
            - pipe: atlassian/aws-s3-deploy:0.2.1
              variables:
                AWS_ACCESS_KEY_ID: $AWS_ACCESS_KEY_ID
                AWS_SECRET_ACCESS_KEY: $AWS_SECRET_ACCESS_KEY
                AWS_DEFAULT_REGION: "us-east-1"
                S3_BUCKET: $s3_bucket
                LOCAL_PATH: $BITBUCKET_CLONE_DIR # This is the base directory of your repository.
                #DELETE_FLAG: "true"

Thanks,

Phil
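
As a side note, the same variable also accepts a subdirectory if you only want part of the repository deployed. A minimal sketch (the build folder name here is hypothetical, not something from the question):

                LOCAL_PATH: "$BITBUCKET_CLONE_DIR/build" # Deploy only this subfolder.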

Hi guys, I'm trying to send my Bitbucket repository to an S3 bucket. Do you have any new information on this?

@Philip Hodder Can I use an OIDC role instead of AWS keys, like the following? I see that I can use OIDC for a lot of different deployment types, but for S3 it gives me an error: "AWS_ACCESS_KEY_ID: AWS_ACCESS_KEY_ID variable missing."

Do you not support it for S3 deployments?

- step:
    name: "Deploy to S3"
    oidc: true
    script:
      - pipe: atlassian/aws-s3-deploy:0.3.8
        variables:
          AWS_DEFAULT_REGION: 'us-east-1'
          AWS_OIDC_ROLE_ARN: $BITBUCKET_ACCESS_ROLE
          S3_BUCKET: 's3://bitbucket-s3-assets/'
          LOCAL_PATH: 'artifact'
          ACL: 'public-read'
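
Newer releases of the atlassian/aws-s3-deploy pipe do document an AWS_OIDC_ROLE_ARN variable, so bumping the pipe version is the first thing to try. If that isn't an option, one workaround is to skip the pipe and call the AWS CLI directly, letting it assume the role through the token Pipelines injects. A minimal sketch, assuming the role in $BITBUCKET_ACCESS_ROLE already trusts Bitbucket's OIDC provider, and reusing the bucket and path from the snippet above:

- step:
    name: "Deploy to S3 via OIDC (workaround sketch)"
    oidc: true
    image: amazon/aws-cli
    script:
      # With oidc: true, Pipelines exposes the identity token as BITBUCKET_STEP_OIDC_TOKEN.
      # Write it to a file so the AWS CLI can pick it up.
      - export AWS_WEB_IDENTITY_TOKEN_FILE=$(pwd)/web-identity-token
      - echo $BITBUCKET_STEP_OIDC_TOKEN > $AWS_WEB_IDENTITY_TOKEN_FILE
      # With AWS_ROLE_ARN and AWS_WEB_IDENTITY_TOKEN_FILE set, the AWS CLI
      # assumes the role via STS AssumeRoleWithWebIdentity automatically.
      - export AWS_ROLE_ARN=$BITBUCKET_ACCESS_ROLE
      - aws s3 sync artifact s3://bitbucket-s3-assets/ --acl public-read --region us-east-1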
