Ability to set max_concurrent_requests for the s3 pipe

Andrew McClenaghan June 8, 2021

When using the S3 pipe (https://bitbucket.org/atlassian/aws-s3-deploy/src) we need to be able to set the `default.s3.max_concurrent_requests` value, because we upload lots of small files and the transfer is slow.

The command to set it is `aws configure set default.s3.max_concurrent_requests 20` (the default is 10). It would be great if this could be a configuration value.
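For context, `aws configure set default.s3.max_concurrent_requests 20` just writes a nested value into the AWS CLI config file. A minimal sketch of the resulting file, written to a scratch directory here (instead of the real `~/.aws/config`) purely for illustration:

```shell
# What `aws configure set default.s3.max_concurrent_requests 20` produces:
# an "s3" subsection under the [default] profile in the AWS config file.
# A scratch directory is used so the real ~/.aws/config is untouched.
demo_dir="./aws-config-demo"
mkdir -p "$demo_dir"
cat > "$demo_dir/config" <<'EOF'
[default]
s3 =
    max_concurrent_requests = 20
    max_queue_size = 1000
EOF
grep -q 'max_concurrent_requests = 20' "$demo_dir/config" && echo "value set"
```

With higher `max_concurrent_requests`, `aws s3 sync`/`cp` runs more parallel transfers, which is what helps with many small files.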

1 answer

Andrew McClenaghan June 9, 2021

@Halyna Berezovska Not sure if you are the right person to talk to about this. I would love to help and provide a PR if that is possible. 

Halyna Berezovska
Atlassian Team
Atlassian Team members are employees working across the company in a wide variety of roles.
June 10, 2021

@Andrew McClenaghan I am the right person :)

Thanks for raising this. We are going to work on the S3 pipe soon to improve it, and we can consider additional parameters then. I will include your question in that discussion and notify you once we release an update.

Regards, Galyna

Halyna Berezovska
Atlassian Team
June 11, 2021

@Andrew McClenaghan also, if you want to make a contribution, feel free to do so. We are happy to see new contributors. Just follow the contributing guide in the pipe, and don't forget to add a changeset in the PR; that triggers the pipe's version bump on merge.

Also, take into account that we are going to refactor this pipe to Python in the next update, so if you implement it now, it will have to be rewritten in Python anyway.

Generally, I like the idea of passing additional AWS config parameters and applying them in the pipe.

Note that this would be different from the other kinds of extra parameters, because it supplies values for the AWS config file rather than arguments passed to the pipe's functions.

But I would discuss this within the team.
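To make the idea concrete, here is one hedged sketch of what such a parameter could look like. The `EXTRA_AWS_CONFIG` variable name and its space-separated `key=value` format are purely hypothetical, not part of the pipe; the `echo` stands in for the real `aws configure set` call:

```shell
# Hypothetical EXTRA_AWS_CONFIG variable: space-separated key=value pairs
# that a pipe could translate into `aws configure set` calls.
# We echo the commands instead of invoking the real AWS CLI.
EXTRA_AWS_CONFIG="default.s3.max_concurrent_requests=20 default.s3.max_queue_size=500"
cmds=""
for pair in $EXTRA_AWS_CONFIG; do
  key="${pair%%=*}"    # text before the first '='
  value="${pair#*=}"   # text after the first '='
  cmds="$cmds; aws configure set $key $value"
  echo "aws configure set $key $value"
done
```

This keeps config-file settings cleanly separated from the variables the pipe already passes to its functions, which is the distinction raised above.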

If you want to contribute, let us know, so we can either wait for your contribution or implement the feature ourselves.

Regards, Galyna

Andrew McClenaghan June 13, 2021

How far off do you think the change to Python will be? Our deploys are very slow at the moment, so I am weighing the time to fix this in the CLI version against waiting for the Python version.

Halyna Berezovska
Atlassian Team
June 14, 2021

@Andrew McClenaghan the refactoring to Python is driven by other infrastructure reasons, but Python is much more convenient for parameters and configuration, since it lets us accept more flexible data structures for the variables passed into the pipe. That would give you a natural way to build a proper config, and it should also work faster because you would not need workarounds like this one.

Regards, Galyna

Andrew McClenaghan June 14, 2021

Hi @Halyna Berezovska

Thanks for that.

Are you able to provide a timeline for this change? I am trying to work out whether I need to do the PR on the current version or whether we can wait for the move to Python.

Cheers,
Andrew

Halyna Berezovska
Atlassian Team
July 1, 2021

@Andrew McClenaghan yes, we are working on this pipe right now.

We will update you on the status in the coming weeks.

Cheers, Galyna

Halyna Berezovska
Atlassian Team
July 20, 2021

@Andrew McClenaghan you are welcome to try the new aws-s3-deploy pipe, version 1.1.0.

We introduced pre-execution scripts in the pipe.

The idea is the following: you pass the file path, as in the example in the README (https://bitbucket.org/atlassian/aws-s3-deploy/src/master/):

```yaml
script:
  - echo 'aws configure set default.s3.max_queue_size 500' > .my-script.sh
  - chmod +x .my-script.sh
  - pipe: atlassian/aws-s3-deploy:1.1.0
    variables:
      AWS_ACCESS_KEY_ID: $AWS_ACCESS_KEY_ID
      AWS_SECRET_ACCESS_KEY: $AWS_SECRET_ACCESS_KEY
      AWS_DEFAULT_REGION: 'us-east-1'
      S3_BUCKET: 'my-bucket-name/logs'
      LOCAL_PATH: '$(pwd)'
      PRE_EXECUTION_SCRIPT: '.my-script.sh'
```

We execute this script inside the pipe, so you can do anything you need: run `aws configure` commands, install additional packages, and so on.
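For the original question in this thread, the same mechanism should cover `max_concurrent_requests` as well. A sketch, assuming pipe version 1.1.0 as above; the script name, bucket, and path are placeholders:

```yaml
script:
  - echo 'aws configure set default.s3.max_concurrent_requests 20' > .tune-s3.sh
  - chmod +x .tune-s3.sh
  - pipe: atlassian/aws-s3-deploy:1.1.0
    variables:
      AWS_ACCESS_KEY_ID: $AWS_ACCESS_KEY_ID
      AWS_SECRET_ACCESS_KEY: $AWS_SECRET_ACCESS_KEY
      AWS_DEFAULT_REGION: 'us-east-1'
      S3_BUCKET: 'my-bucket-name'
      LOCAL_PATH: 'build'
      PRE_EXECUTION_SCRIPT: '.tune-s3.sh'
```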

 

We are looking forward to your feedback!

Have a nice day!

Cheers, Galyna
