How can I deploy to a specific directory in my S3 bucket?

I'm using the aws-s3-deploy pipe to deploy artifacts from my pipeline build to my S3 bucket, and it works great.

(https://bitbucket.org/atlassian/aws-s3-deploy/src/master/)

I would like to be able to deploy to a sub-folder in the S3 bucket rather than the root directory. I tried adding a subdirectory to the S3_BUCKET variable definition, but that did not work.

 

1 answer


Hi @3baileys,

Thank you for your feedback!
Can you share your pipelines configuration so we can help?


Cheers,
Alex

Here's a minimal configuration which puts a time-stamped example file in the root directory of my S3 bucket (formatting may be messed up):

image:
    name: bibrin/fruitloops

clone:
    depth: full

pipelines:
    branches:
        test:
            - step:
                  script:
                      - echo "Pipeline test" > "$(date +"%Y_%m_%d_%I_%M_%p").txt"
                      - pipe: atlassian/aws-s3-deploy:0.2.1
                        variables:
                            AWS_ACCESS_KEY_ID: $AWS_ACCESS_KEY_ID
                            AWS_SECRET_ACCESS_KEY: $AWS_SECRET_ACCESS_KEY
                            AWS_DEFAULT_REGION: "us-east-1"
                            S3_BUCKET: "s3-bucket-name"
                            LOCAL_PATH: "$(pwd)"
                  artifacts:
                      - 2020*.txt
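As a side note on the script above: the `date` format string produces names like `2020_07_15_03_42_PM.txt` (an illustrative example, not an actual file), which is why the artifacts glob is `2020*.txt`. A quick way to check locally:

```shell
# Same format string as in the pipeline script above:
# %Y year, %m month, %d day, %I 12-hour, %M minute, %p AM/PM
FILENAME="$(date +"%Y_%m_%d_%I_%M_%p").txt"
echo "$FILENAME"
```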

 

What I'd like to do is put the file into a subdirectory.  Perhaps by setting a REMOTE_PATH: "subdir" variable?

@3baileys,

The S3_BUCKET parameter can be 's3-bucket-name/logs', and files will be stored in the "logs" folder of s3-bucket-name:

script:
    - echo "Pipeline test" > "$(date +"%Y_%m_%d_%I_%M_%p").txt"
    - pipe: atlassian/aws-s3-deploy:0.3.5
      variables:
          AWS_ACCESS_KEY_ID: $AWS_ACCESS_KEY_ID
          AWS_SECRET_ACCESS_KEY: $AWS_SECRET_ACCESS_KEY
          AWS_DEFAULT_REGION: 'us-east-1'
          S3_BUCKET: 's3-bucket-name/logs'
          LOCAL_PATH: $(pwd)
          EXTRA_ARGS: "--exclude=* --include=2020*.txt"

Also, it's better to use the newest version of the pipe.
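For background, this works because the aws-s3-deploy pipe wraps the AWS CLI's `aws s3 sync` command: the `S3_BUCKET` value simply becomes the key prefix of the `s3://` destination URL, and `EXTRA_ARGS` options like `--exclude`/`--include` are plain `aws s3 sync` flags passed through. A rough sketch of how the destination is formed (the exact command line the pipe runs is an assumption, not taken from its source):

```shell
# Values as they would be set in bitbucket-pipelines.yml
S3_BUCKET="s3-bucket-name/logs"   # bucket name plus subdirectory prefix

# The pipe builds an s3:// destination from S3_BUCKET, so files land under logs/
DESTINATION="s3://${S3_BUCKET}"
echo "$DESTINATION"

# Roughly equivalent manual command (needs AWS credentials configured):
# aws s3 sync "$(pwd)" "$DESTINATION" --exclude='*' --include='2020*.txt'
```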


@Oleksandr Kyrdan is it possible to update the README.md in the pipes repository?


@Oleksandr Kyrdan I've got a branch ready for a PR, but unfortunately I don't have rights to push anything.

Hi @rickvschalkwijk,

Good suggestion, thank you! We'll update the README for the aws-s3-deploy pipe.
