I have followed the instructions in:
to have my build artifacts link to the build results stored in S3 buckets, and it works...
But my problem is that I don't want to set those S3 objects to public access, since we want to keep the artifacts private. So if I click on the link, an "Access Denied" message appears, and if I change the permissions to public, then it works.
Could I somehow make the request URL use a key pair created in AWS IAM, so that these links only work when opened through Bitbucket as a logged-in user? Could it perhaps use the environment variables already defined to do the authentication?
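One common way to keep the objects private while still handing out clickable links is a presigned URL: the URL itself carries a time-limited Signature V4 signature made with an IAM key pair, so no public-read ACL is needed. In practice you would let the AWS CLI (`aws s3 presign ...`) or an SDK generate these, with the key pair supplied via Bitbucket's secured repository variables (`AWS_ACCESS_KEY_ID` / `AWS_SECRET_ACCESS_KEY`). As a stdlib-only sketch of what such a presigned GET URL looks like (bucket, key, and credential values below are placeholders, not from the original post):

```python
import datetime
import hashlib
import hmac
from urllib.parse import quote


def presign_s3_get(bucket, key, access_key, secret_key, region, expires=3600):
    """Build a SigV4 query-string-presigned GET URL for an S3 object.

    A minimal sketch; real projects should prefer `aws s3 presign`
    or an SDK's presigning helper rather than hand-rolling this.
    """
    now = datetime.datetime.now(datetime.timezone.utc)
    amz_date = now.strftime("%Y%m%dT%H%M%SZ")
    datestamp = now.strftime("%Y%m%d")
    host = f"{bucket}.s3.{region}.amazonaws.com"
    scope = f"{datestamp}/{region}/s3/aws4_request"

    # Query parameters that carry the signing metadata.
    params = {
        "X-Amz-Algorithm": "AWS4-HMAC-SHA256",
        "X-Amz-Credential": f"{access_key}/{scope}",
        "X-Amz-Date": amz_date,
        "X-Amz-Expires": str(expires),
        "X-Amz-SignedHeaders": "host",
    }
    canonical_query = "&".join(
        f"{quote(k, safe='')}={quote(v, safe='')}" for k, v in sorted(params.items())
    )

    # Canonical request: method, path, query, headers, signed headers, payload hash.
    canonical_request = "\n".join([
        "GET",
        "/" + quote(key),
        canonical_query,
        f"host:{host}\n",
        "host",
        "UNSIGNED-PAYLOAD",
    ])
    string_to_sign = "\n".join([
        "AWS4-HMAC-SHA256",
        amz_date,
        scope,
        hashlib.sha256(canonical_request.encode()).hexdigest(),
    ])

    # Derive the signing key by chaining HMACs, then sign.
    def hmac_sha256(k, msg):
        return hmac.new(k, msg.encode(), hashlib.sha256).digest()

    signing_key = hmac_sha256(("AWS4" + secret_key).encode(), datestamp)
    for part in (region, "s3", "aws4_request"):
        signing_key = hmac_sha256(signing_key, part)
    signature = hmac.new(
        signing_key, string_to_sign.encode(), hashlib.sha256
    ).hexdigest()

    return f"https://{host}/{quote(key)}?{canonical_query}&X-Amz-Signature={signature}"


# Placeholder values for illustration only.
url = presign_s3_get(
    "my-artifact-bucket", "artifacts/build.zip",
    "AKIAEXAMPLE", "example-secret-key", "us-east-1",
)
print(url)
```

Anyone holding the URL can download the object until it expires, so this does not restrict access to logged-in Bitbucket users specifically; it only avoids making the objects permanently public.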
As a side note: I have seen that the AWS CLI (`aws s3`) is used directly in some cases (for web deployment to S3, for example), while in others the helper s3_upload.py script is used. Which method should be preferred in general? Shouldn't the documentation be unified? (I find using the AWS CLI easier than adding a script to my repo.)
Bitbucket Pipelines helps me manage and automate a number of serverless deployments to AWS Lambda, and this is how I do it. I'm building Node.js Lambda functions using node-lambda ...
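A setup along those lines might look roughly like the following `bitbucket-pipelines.yml` fragment; the step name, image tag, and region are assumptions for illustration, and the AWS credentials are expected to come from secured repository variables rather than being committed:

```yaml
# Hypothetical sketch of a node-lambda deployment step.
image: node:18

pipelines:
  branches:
    master:
      - step:
          name: Deploy Lambda functions
          script:
            - npm ci
            - npm install -g node-lambda
            # AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY are read from
            # secured environment variables defined in Bitbucket settings.
            - node-lambda deploy --region us-east-1
```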