Is it possible to store the output of a pipe into a variable?

Rob Jahn
Marketplace Partner
March 9, 2020

This is a general question, but specifically I want to use this pipe atlassian/aws-eks-kubectl-run to run this command.....

kubectl get cm -n keptn keptn-domain -ojsonpath={.data.app_domain}

....and store that output into a variable for use in a another pipeline step.

Is this possible?  Can you show an example?

3 answers

1 accepted

0 votes
Answer accepted
Alexander Zhukov
Atlassian Team
March 10, 2020

Hi @Rob Jahn , I don't think this is possible with the current version of the pipe. We'll try to add such a feature in the near future and let you know.

Rob Jahn
Marketplace Partner
March 10, 2020

I am also developing custom pipes.  Is it possible in general and if yes, is there an example you can point me to?

Alexander Zhukov
Atlassian Team
March 11, 2020

Yes, something similar is done in the aws-lambda-deploy pipe. It uses the artifacts feature to store the intermediate data between steps. You can see this guide as an example: https://confluence.atlassian.com/bitbucket/deploying-a-lambda-function-update-to-aws-967319469.html
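A minimal sketch of the artifacts approach in bitbucket-pipelines.yml terms (step names and the file name are placeholders, not the exact config from the linked guide):

```yaml
pipelines:
  default:
    - step:
        name: Produce a value
        script:
          # Any command whose output you want to reuse later
          - echo "some-value" > output.txt
        artifacts:
          # Files listed here are carried forward to later steps
          - output.txt
    - step:
        name: Consume the value
        script:
          # The artifact is available in this step's clone directory
          - export MY_VALUE=$(cat output.txt)
          - echo "$MY_VALUE"
```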

Rob Jahn
Marketplace Partner
March 11, 2020

Thanks Alex, but I am looking to see whether the pipe itself can provide outputs. Using that Lambda example: say the Lambda returned a JSON string. I would want to put that into a variable to then access and manipulate in a secondary pipeline step, like a Unix command script step.

I'm not sure if this is supported in general with pipes. For the kubectl pipe I would like to get the output into a string, for example from a kubectl get. And for my custom pipes, I want to expose outputs for consumption.

Rob Jahn
Marketplace Partner
March 16, 2020

@Alexander Zhukov - Hi Alex, just following up on my last post.

Alexander Zhukov
Atlassian Team
March 16, 2020

@Rob Jahn I don't think it's currently possible to store the output into a variable. The only way is to store the output of the pipe as an artifact and parse that artifact in the subsequent steps.

Rob Jahn
Marketplace Partner
March 17, 2020

Is there an example of this, or even just a code snippet showing what is needed in the pipe code, that you can share or point me to?

Rob Jahn
Marketplace Partner
March 17, 2020

I think I know what to do, but is there anything specific about the path for the file to write out? My custom pipe is a Unix shell script.

Rob Jahn
Marketplace Partner
March 20, 2020

What I did was to create a file in the pipe and use that file name as the artifact. Then, in a subsequent step, I read in that artifact file. I don't add a folder path; the file just lives in the root folder with the checked-out code.

See the discussion in this thread, which covered a similar topic: https://community.atlassian.com/t5/Bitbucket-Pipelines-questions/How-do-I-share-files-between-bitbucket-pipes/qaq-p/1321475
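A minimal sketch of that approach, assuming the custom pipe's working directory is the mounted build directory (the file name and the kubectl lookup are illustrative):

```bash
#!/bin/sh
# pipe.sh — entry point of the custom pipe (sketch)
# The pipe container's working directory is the build directory,
# so a relative path puts the file next to the checked-out code.
RESULT=$(kubectl get cm -n keptn keptn-domain -o jsonpath='{.data.app_domain}')
echo "$RESULT" > pipe_output.txt
```

The step that runs the pipe then lists pipe_output.txt under artifacts, and a later step can read it back with something like `export APP_DOMAIN=$(cat pipe_output.txt)`.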

0 votes
tomasz_krzyzanowski September 8, 2023

@Alexander Zhukov It would be nice to have, in general, a mechanism like the one GitHub Actions provides:
https://docs.github.com/en/actions/learn-github-actions/variables#passing-values-between-steps-and-jobs-in-a-workflow

 

In short: there is a file on the filesystem where the environment variables are stored. The path to that file is held in the GITHUB_ENV environment variable, and to define a shared variable you just append it to the file like this:
```bash
echo "NEW_ENV=value" >> $GITHUB_ENV
```

https://docs.github.com/en/actions/using-workflows/workflow-commands-for-github-actions#setting-an-environment-variable


The file could be sourced automatically before the next script runs, and it could be mounted into the pipe in some way to share the values as env variables.
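There is no built-in equivalent in Bitbucket Pipelines, but the pattern can be approximated with an artifact that is sourced in the next step (a sketch; env.sh and the variable names are placeholders, not a built-in mechanism):

```yaml
pipelines:
  default:
    - step:
        name: Export variables
        script:
          # Append export statements to a shared file
          - echo "export APP_DOMAIN=example.com" >> env.sh
        artifacts:
          - env.sh
    - step:
        name: Import variables
        script:
          # Source the file to load the variables into this step's shell
          - source env.sh
          - echo "$APP_DOMAIN"
```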

0 votes
yuri.ritvin February 3, 2021

We had the same issue.

The general problem we had: when deploying to production, we need to fetch the version from the staging Kubernetes cluster and deploy that same version to production.

Sounds easy, right?

The ideal solution would be:

- pipe: atlassian/aws-eks-kubectl-run:1.2.3
  variables:
    CLUSTER_NAME: "dev"
    KUBECTL_COMMAND: -n amplio-staging get pods --selector=app=amplio -o jsonpath='{.items[0].spec.containers[*].image}' > ./version.txt

However, it's not possible in bitbucket pipelines without dedicated support in the pipe.

 

We ended with

image: bitbucketpipelines/aws-eks-kubectl-run:1.2.3
script:
  - export CLUSTER_NAME="dev"
  - export KUBECTL_COMMAND="get pods --selector=app=amplio -n amplio-staging -o jsonpath='{.items[0].spec.containers[*].image}'"
  - python /pipe.py > version.txt
  - cat version.txt
artifacts:
  - version.txt

Instead of using the pipe, we decided to use its Docker image directly and run the pipe script by hand.
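A later step can then read the captured version back into a variable from the artifact, roughly like this (the step name and variable are illustrative):

```yaml
- step:
    name: Deploy the same version to production
    script:
      # version.txt is carried over from the previous step as an artifact
      - export IMAGE_VERSION=$(cat version.txt)
      - echo "Deploying $IMAGE_VERSION"
```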
