Pipe failing to pull from S3 despite provided credentials having full admin permissions


When attempting to use https://bitbucket.org/atlassian/aws-cloudformation-deploy/src/0.7.4/ to deploy a CloudFormation stack, it consistently fails with an S3 authentication error, even though the AWS credentials provided via the repository variables have full admin rights.

I have attempted this with:

  • credentials being passively present in the environment 
  • credentials explicitly provided to the pipe variables
  • https://s3.amazonaws... path to the template
  • https://<BUCKET_NAME>.s3-<REGION>.amazonaws.com/... path to the template
  • s3://... path to the template

Any file path other than the first example fails with a different message, citing errors in the path format provided.
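For reference, the two HTTPS URL styles I have tried can be sketched with a small, purely illustrative helper - the bucket, key and region values below are placeholders, and as I understand it an s3:// URI is not an accepted TemplateURL format, which would explain the path-format errors for that form:

```python
# Hypothetical helper, not part of the pipe: builds the two HTTPS URL styles
# that CloudFormation's TemplateURL parameter accepts.
def path_style_url(bucket: str, key: str) -> str:
    # Legacy global-endpoint (path-style) form.
    return f"https://s3.amazonaws.com/{bucket}/{key}"

def virtual_hosted_url(bucket: str, key: str, region: str) -> str:
    # Regional virtual-hosted-style form.
    return f"https://{bucket}.s3.{region}.amazonaws.com/{key}"

# Placeholder values for illustration only.
print(path_style_url("my-bucket", "templates/stack.json"))
print(virtual_hosted_url("my-bucket", "templates/stack.json", "eu-west-1"))
```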

I have also checked the S3 bucket's ACL to confirm that the AWS IAM user relevant to this task is authorised to perform actions on the bucket (it is)

I have also confirmed that the URL in use within the pipeline is the same as the URL supplied within the AWS Console for that file (it is)

I am also uploading files to the same bucket (in fact in this particular case the same files I'm then attempting to read) in a previous step in the pipeline - same bucket, same IAM user, same files.
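For context, the failing step looks roughly like this (a sketch with placeholder values; the STACK_NAME and TEMPLATE variable names are my reading of the pipe's README, so check the pipe documentation for the exact set):

```yaml
# Sketch of the failing step; all values are placeholders.
pipelines:
  default:
    - step:
        name: Deploy CloudFormation stack
        script:
          - pipe: atlassian/aws-cloudformation-deploy:0.7.4
            variables:
              AWS_ACCESS_KEY_ID: $AWS_ACCESS_KEY_ID          # repository variables
              AWS_SECRET_ACCESS_KEY: $AWS_SECRET_ACCESS_KEY
              AWS_DEFAULT_REGION: $AWS_DEFAULT_REGION
              STACK_NAME: 'my-stack'                         # placeholder
              TEMPLATE: 'https://s3.amazonaws.com/my-bucket/templates/stack.json'  # placeholder
```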

Error message (edited to remove precise bucket information):

Status: Downloaded newer image for bitbucketpipelines/aws-cloudformation-deploy:0.7.4
INFO: Found credentials in environment variables.
INFO: Using stack template from https://<BUCKET_NAME>.s3-<REGION>.amazonaws.com/... for deploy.
INFO: Validating the template.
✖ Template validation failed.
An error occurred (ValidationError) when calling the ValidateTemplate operation: S3 error: Access Denied
For more information check http://docs.aws.amazon.com/AmazonS3/latest/API/ErrorResponses.html

Is there any guidance anyone would be able to provide for this issue?

1 answer

Answer accepted
Hello @Robert Williams !

I see a few possible root causes here and can propose some troubleshooting steps to find the issue:

  • Sometimes AWS deactivates a user (perhaps when it has actually been inactive for some time); check that this S3 user is active (we have seen similar errors due to this).
  • If the template is a URL to S3, it should have the appropriate format; check that you have the right path to your S3 bucket.
  • S3 buckets can each have a complex access policy; check the bucket's policy.

Regards, Galyna

I am also uploading files to the same bucket (in fact in this particular case the same files I'm then attempting to read) in a previous step in the pipeline - same bucket, same IAM user, same files.

About the fact that you are uploading the same files beforehand: check specifically what permissions the file has after uploading, and whether or not you are able to read it.

Hi @Galyna Zholtkevych, thanks for getting back to me

Unfortunately these initial issues are things that I have already ruled out (responses to your points from my original post):

  • Sometimes AWS deactivates a user (perhaps when it has actually been inactive for some time); check that this S3 user is active (we have seen similar errors due to this).
    • I am also uploading files to the same bucket (in fact in this particular case the same files I'm then attempting to read) in a previous step in the pipeline - same bucket, same IAM user, same files.
    • If the template is a URL to S3, it should have the appropriate format; check that you have the right path to your S3 bucket.
    https://s3.amazonaws.com/{S3_BUCKET}/<path_to_template>/<template>.json
    • I have also confirmed that the URL in use within the pipeline is the same as the URL supplied within the AWS Console for that file (it is)
  • s3 specifically may have complex policy for each bucket. Check bucket's policy
    • I have also checked the S3 bucket's ACL to confirm that the AWS IAM user relevant to this task is authorised to perform actions on the bucket (it is)

However, I think you may well have solved it with your additional message - the uploaded files (bizarrely) don't seem to have been granted access permissions for the host account.

I will attempt to upload these files using the relevant S3 Deploy pipe to see if that resolves the issue, but I can't help feeling there's a bug at play somewhere, as uploading files using the AWS CLI from my local machine does not result in this side effect.

@Robert Williams yes, perhaps experimenting with the permissions will turn something up.

The goal of the pipe is just to use the template provided.

So if this is a bug, I think it is related either to the pipeline (not the pipe) or even to the AWS CLI (perhaps they have some edge cases that we need to discover).

Looking forward to updates from you, so we can narrow down the root cause further and see which side it is on.

@Galyna Zholtkevych Yes, I'm not sure exactly where the fault lies, but something about the combination of the AWS CLI and the pipeline environment is breaking down given the lack of access permissions granted to the file once it hits S3.

Regardless, thank you for helping me solve the issue! I would never have thought to check the individual file permissions as, again, this is not something I have ever seen happen, so you certainly saved me a whole lot of debugging time!

 

Best, 

 

Rob

@Robert Williams sure, I am glad to help.

Also, you can try to keep an interval between these actions and upload the file in a separate step. You can also check out how we upload a template just before running the CloudFormation pipe: https://bitbucket.org/atlassian/aws-cloudformation-deploy/src/cf0095cbe8c547e233f52c223142d057fee3f3a3/bitbucket-pipelines.yml#lines-19

Here, separating these steps may also help.
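A sketch of what I mean by separate steps - uploading with the S3 Deploy pipe in its own step before the CloudFormation pipe runs. The variable names and the ACL value are my assumptions from the pipes' READMEs, the version tags are placeholders, and all values should be adjusted to your setup:

```yaml
# Sketch: upload the template in its own step, then deploy from S3.
pipelines:
  default:
    - step:
        name: Upload template to S3
        script:
          - pipe: atlassian/aws-s3-deploy:0.4.4   # version tag is a placeholder
            variables:
              AWS_ACCESS_KEY_ID: $AWS_ACCESS_KEY_ID
              AWS_SECRET_ACCESS_KEY: $AWS_SECRET_ACCESS_KEY
              AWS_DEFAULT_REGION: $AWS_DEFAULT_REGION
              S3_BUCKET: 'my-bucket'              # placeholder
              LOCAL_PATH: 'templates'             # placeholder
              ACL: 'bucket-owner-full-control'    # assumption: lets the bucket owner read the uploaded objects
    - step:
        name: Deploy CloudFormation stack
        script:
          - pipe: atlassian/aws-cloudformation-deploy:0.7.4
            variables:
              AWS_ACCESS_KEY_ID: $AWS_ACCESS_KEY_ID
              AWS_SECRET_ACCESS_KEY: $AWS_SECRET_ACCESS_KEY
              AWS_DEFAULT_REGION: $AWS_DEFAULT_REGION
              STACK_NAME: 'my-stack'              # placeholder
              TEMPLATE: 'https://s3.amazonaws.com/my-bucket/templates/stack.json'  # placeholder
```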

@Galyna Zholtkevych Yes, I have already done so - they were separate steps while I was still using the AWS CLI as well. That approach just gave me the option to specify particular files and their destination filepaths with more control, hence attempting the non-pipe method initially, but the pipe serves the relevant purpose now that I have revised the directory structure both within the repository and within the destination S3 bucket.
