How can I deploy to a specific directory in my S3 bucket?

I'm using the aws-s3-deploy pipe to deploy artifacts from my pipeline build to my S3 bucket and it works great.

(https://bitbucket.org/atlassian/aws-s3-deploy/src/master/)

I would like to be able to deploy to a sub-folder in the S3 bucket rather than the root directory. I tried adding a subdirectory specification to the S3_BUCKET variable definition, but that did not work.

 

1 answer

Hi @3baileys ,

Thank you for your feedback!
Could you provide your pipelines configuration so we can help you?


Cheers,
Alex

Here's a minimal configuration which puts a time-stamped example file in the root directory of my S3 bucket (formatting may be messed up):

image:
  name: bibrin/fruitloops

clone:
  depth: full

pipelines:
  branches:
    test:
      - step:
          script:
            - echo "Pipeline test" > "$(date +"%Y_%m_%d_%I_%M_%p").txt"
            - pipe: atlassian/aws-s3-deploy:0.2.1
              variables:
                AWS_ACCESS_KEY_ID: $AWS_ACCESS_KEY_ID
                AWS_SECRET_ACCESS_KEY: $AWS_SECRET_ACCESS_KEY
                AWS_DEFAULT_REGION: "us-east-1"
                S3_BUCKET: "s3-bucket-name"
                LOCAL_PATH: "$(pwd)"
          artifacts:
            - 2020*.txt

 

What I'd like to do is put the file into a subdirectory. Perhaps by setting a REMOTE_PATH: "subdir" variable?

@3baileys ,

The S3_BUCKET parameter can be set to 's3-bucket-name/logs', and the files will then be stored under the "logs" folder of s3-bucket-name:

script:
  - echo "Pipeline test" > "$(date +"%Y_%m_%d_%I_%M_%p").txt"
  - pipe: atlassian/aws-s3-deploy:0.3.5
    variables:
      AWS_ACCESS_KEY_ID: $AWS_ACCESS_KEY_ID
      AWS_SECRET_ACCESS_KEY: $AWS_SECRET_ACCESS_KEY
      AWS_DEFAULT_REGION: 'us-east-1'
      S3_BUCKET: 's3-bucket-name/logs'
      LOCAL_PATH: $(pwd)
      EXTRA_ARGS: "--exclude=* --include=2020*.txt"

Also, it's better to use the newest version of the pipe.
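For context, here's a rough sketch of what the step above ends up doing, assuming the pipe wraps the aws s3 sync command (that's what its repository suggests; the bucket name and filters are just the example values from above):

    # hypothetical equivalent of the pipe step above, not the pipe's exact internals
    aws s3 sync "$(pwd)" s3://s3-bucket-name/logs --exclude "*" --include "2020*.txt"

In other words, anything after the bucket name in S3_BUCKET becomes a key prefix in the bucket, and the exclude/include filters limit the sync to the generated text files.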


@Oleksandr Kyrdan is it possible to update the README.md in the pipes repository?


@Oleksandr Kyrdan I've got a branch ready for a PR, but unfortunately I have no rights to push anything.

Hi @rickvschalkwijk ,

Good suggestion, thank you! We'll update the README for the aws-s3-deploy pipe.
