Currently, the pipe requires us to write the full task definition in a JSON file, which also forces us to specify the environment variables there. It would help if the pipe supported simply passing, as a parameter, the name of a Task Definition that has already been created in AWS ECS.
Which description field are you referring to? Could you provide a reference to the Atlassian Bitbucket Pipelines YAML documentation, and/or elaborate a bit more on which Amazon Web Service in particular provides pipe parameters to Bitbucket Cloud Pipelines pipes?
I am talking about AWS Elastic Container Service (ECS).
Please refer to the pipe: https://bitbucket.org/atlassian/aws-ecs-deploy/src/1.1.4/
It requires us to define the pipeline with a parameter named TASK_DEFINITION, which takes a string value: the path to a JSON file containing the full task definition. That task definition must also include the environment variables. But I have already configured a task in AWS, so there should be an option to deploy that same task again.
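For reference, a minimal task-definition JSON of the kind the pipe expects might look like the sketch below; the family, container, image, and environment-variable names are purely illustrative, but it shows why the environment variables end up duplicated in the file:

```json
{
  "family": "my-service",
  "containerDefinitions": [
    {
      "name": "app",
      "image": "my-registry/app:latest",
      "memory": 256,
      "environment": [
        { "name": "APP_ENV", "value": "production" }
      ]
    }
  ]
}
```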
Fair point: re-using a task definition that already exists in AWS ECS would be useful.
However, how would the pipe know about it? Don't get me wrong, I'm a big fan of keeping these things straightforward, but judging by your feedback the pipe does not read that task definition from AWS ECS when it runs.
So this would be a feature request for the pipe. I must admit I don't have any experience with how (well) that process works; I'd assume it will take some time.
Since you can combine a pipe with additional script commands in the same step (commands before the pipe seem useful here; commands after are possible in general but not applicable in this case), there may be an option to use the AWS CLI (or another AWS API) in the pipeline step to fetch the existing task definition, store it as a file, and then point the pipe to that file.
Or is that not such a good idea for your use-case? It would be a work-around, but also a good example for a feature request, which should then be easier to get done.
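A rough sketch of what that work-around could look like in bitbucket-pipelines.yml. This is untested; it assumes the build image has the AWS CLI and jq available, the task family, cluster, service, and region names are placeholders, and the `describe-task-definition` output includes read-only fields that likely need stripping before the pipe re-registers the definition:

```yaml
pipelines:
  default:
    - step:
        name: Redeploy existing ECS task definition
        script:
          # Fetch the task definition already registered in ECS
          # ("my-service" is a placeholder family name).
          - aws ecs describe-task-definition --task-definition my-service --query 'taskDefinition' --output json > task-definition.json
          # Strip read-only fields (ARN, revision, status, ...) that a
          # re-registration would likely reject.
          - jq 'del(.taskDefinitionArn, .revision, .status, .requiresAttributes, .compatibilities)' task-definition.json > task-definition-clean.json
          - pipe: atlassian/aws-ecs-deploy:1.1.4
            variables:
              AWS_ACCESS_KEY_ID: $AWS_ACCESS_KEY_ID
              AWS_SECRET_ACCESS_KEY: $AWS_SECRET_ACCESS_KEY
              AWS_DEFAULT_REGION: 'us-east-1'
              CLUSTER_NAME: 'my-cluster'
              SERVICE_NAME: 'my-service'
              TASK_DEFINITION: 'task-definition-clean.json'
```

That way the pipe still receives the file path it expects, while the content comes from the task definition already in ECS rather than from one committed to the repository.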