I'm using aws-s3-deploy to set up a pipeline for my Bitbucket repo. I want to configure the pipe's LOCAL_PATH variable to copy everything in my repo to S3. I have several folders at the top level of my repo, and I want to copy that full top level, preserving the top-level folders and their hierarchy. My bitbucket-pipelines.yml file is set up as follows:
image: python:3.5.1

pipelines:
  branches:
    master:
      - step:
          script:
            - pipe: atlassian/aws-s3-deploy:0.2.1
              variables:
                AWS_ACCESS_KEY_ID: $AWS_ACCESS_KEY_ID
                AWS_SECRET_ACCESS_KEY: $AWS_SECRET_ACCESS_KEY
                AWS_DEFAULT_REGION: "us-east-1"
                S3_BUCKET: $s3_bucket
                LOCAL_PATH: <I want this to be everything in repo>
                #DELETE_FLAG: "true"
Hi Annelise,
Pipelines clones the entire contents of your repository for each build, so you can use the BITBUCKET_CLONE_DIR variable to point LOCAL_PATH at the root of the clone:
image: python:3.5.1

pipelines:
  branches:
    master:
      - step:
          script:
            - pipe: atlassian/aws-s3-deploy:0.2.1
              variables:
                AWS_ACCESS_KEY_ID: $AWS_ACCESS_KEY_ID
                AWS_SECRET_ACCESS_KEY: $AWS_SECRET_ACCESS_KEY
                AWS_DEFAULT_REGION: "us-east-1"
                S3_BUCKET: $s3_bucket
                LOCAL_PATH: $BITBUCKET_CLONE_DIR # This is the base directory of your repository.
                #DELETE_FLAG: "true"
Thanks,
Phil
Hi guys, I'm trying to send my Bitbucket repository to an S3 bucket. Do you have any new information on this?
@Philip Hodder Can I use an OIDC role instead of AWS keys, like the following? I can use OIDC for a lot of other deployment types, but for S3 it gives me an error: "AWS_ACCESS_KEY_ID: AWS_ACCESS_KEY_ID variable missing."
Do you not support OIDC for S3 deployment?
- step:
    name: "Deploy to S3"
    oidc: true
    script:
      - pipe: atlassian/aws-s3-deploy:0.3.8
        variables:
          AWS_DEFAULT_REGION: 'us-east-1'
          AWS_OIDC_ROLE_ARN: $BITBUCKET_ACCESS_ROLE
          S3_BUCKET: 's3://bitbucket-s3-assets/'
          LOCAL_PATH: 'artifact'
          ACL: 'public-read'
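For what it's worth, one possible explanation for that error: pipe version 0.3.8 may simply predate OIDC support. Below is a minimal sketch of the OIDC variant, assuming a newer pipe release that accepts AWS_OIDC_ROLE_ARN (the 1.1.0 version tag here is an assumption; check the pipe's changelog for the actual minimum):

- step:
    name: "Deploy to S3"
    oidc: true # required so Pipelines injects the identity token for the pipe
    script:
      - pipe: atlassian/aws-s3-deploy:1.1.0 # assumed minimum version with OIDC support
        variables:
          AWS_DEFAULT_REGION: 'us-east-1'
          AWS_OIDC_ROLE_ARN: $BITBUCKET_ACCESS_ROLE
          S3_BUCKET: 'bitbucket-s3-assets' # plain bucket name, as in the pipe's README examples
          LOCAL_PATH: 'artifact'
          ACL: 'public-read'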