How do I share files between Bitbucket pipes?

So I am writing my first pipe. It will generate a single file as its output. Given that the pipe runs in a container isolated from any other pipes/steps, how would I have another pipe take the file that was generated?


1. create-file-pipe = produces a file

2. push-file-pipe = takes that file and does something with it

In addition, I'll know the filename (as it's made up of some parameters passed in), but if I could set something like an output variable, that would be helpful.

EDIT: Is it as simple as using a variable like OUTPUT_PATH, which is then available between steps/pipes, assuming the consumer passes it using an artifacts directive?
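As a sketch of the idea in the EDIT (the pipe names and account are hypothetical, invented for illustration), a two-step pipeline could pass the generated file forward via the artifacts directive, with the producer told where to write via OUTPUT_PATH:

```yaml
pipelines:
  default:
    - step:
        name: Create file
        script:
          # Hypothetical pipe that writes its output under ./out
          - pipe: myaccount/create-file-pipe:1.0.0
            variables:
              OUTPUT_PATH: './out'
        artifacts:
          # Files matching this glob are carried into later steps
          - out/**
    - step:
        name: Push file
        script:
          # The artifact is restored to the same relative path here
          - pipe: myaccount/push-file-pipe:1.0.0
            variables:
              SOURCE_PATH: './out'
```

Note that the artifacts glob itself must be a static path; only the pipe variables can be parameterized.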



2 answers

1 accepted

0 votes
Answer accepted

Just want to share the way I resolved this in the end. 

User passes a variable called output_folder.

I create a file in that folder in my pipe, and you can then use artifacts to share it between steps using a globbing pattern.


Hi @Mark Harrison, will you share a code snippet from the pipe and the pipeline showing how you did this?

I have been trying to do the same thing and have had no luck getting it to work.

Hi @Rob Jahn 

Yes of course!

The pipe I created makes use of one of our existing containers which outputs a file to the container (you can specify an optional folder)

The line in the pipe which does this looks like this:

run octo pack --id "$ID" --version "$VERSION" --format "$FORMAT" --basePath "$SOURCE_PATH" --outFolder "$OUTPUT_PATH" "${EXTRA_ARGS[@]}"

You can see the entire pipe here:

The resultant pipeline file snippet with the pipe looks like this: 

- step:
    name: Pack for Octopus
    script:
      - pipe: octopusdeploy/pack:0.4.0
        variables:
          OUTPUT_PATH: './out'
          DEBUG: 'false'
    artifacts:
      - out/*.zip

You can see the whole file in our OctopusSamples repo:

So, in the end, I didn't make use of the Shared pipe directory. 

The main reason for this is that the resultant file would be either a .zip or a .nupkg file, and I didn't want the consumer to need to know the name of the file, as was the case for the Deploy an AWS Lambda function example referenced here:

That being said, there are some drawbacks to my approach:

1. If the output folder is created in the initial step and shared as an artifact, it can't be written to again: it appears read-only in any subsequent steps. So creating an output folder is not ideal where you want to share multiple files in a folder across several steps. I did a bit of research and originally tried to modify the folder permissions in subsequent steps, but changing the owner/permissions appears to be restricted - something to do with Docker user namespaces, from what I could tell.

2. As a result of 1), the use case of my pipe is limited to sharing files between steps via unique folders (although you might be able to get this to work using the root directory).

Hope that helps, but I'd be happy to answer any other questions you may have :)

Thanks, Mark

Yes, this helps. There are some other community threads showing the use of BITBUCKET_PIPE_SHARED_STORAGE_DIR, which I found did not work outside the pipe. Like you, I was able to just create a file in the pipe and use that file name as the artifact, then read in that artifact file in a subsequent step.
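A minimal sketch of that workaround (the pipe name and the file name report.json are hypothetical): the pipe writes a file with a known name into the clone directory, the step declares exactly that name as an artifact, and the next step reads it back:

```yaml
- step:
    name: Produce
    script:
      # Hypothetical pipe that writes report.json into the clone root
      - pipe: myaccount/create-file-pipe:1.0.0
    artifacts:
      - report.json
- step:
    name: Consume
    script:
      # The artifact is restored into this step's clone directory
      - cat report.json
```

This avoids the read-only-folder drawback above, at the cost of the consumer having to know the file name.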

Hi @Rob Jahn 

I'm glad it helped. If I can help further feel free to drop me a message here :)


0 votes

Hi @Mark Harrison, we do something similar in the example for deploying AWS Lambda functions. Remember that artifact paths can't be dynamic, so you won't be able to use environment variables when defining an artifact.

So if I understand correctly, I would need to specify a filename, something like this:

# The pipe exports the newly published
# Lambda version to a file.
- pipe.meta.env

where pipe.meta.env could be any filename I choose as the pipe creator. Could this be used to store a dynamic filename within its contents?

So if I had a pipe that took multiple variables which construct the filename, could I then use source to read in the filename I want to use in a subsequent step/pipe?
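As a sketch of that idea (the file name pack.meta.env and the variable names are hypothetical, not from the actual pipe): the pipe writes the dynamically constructed filename into a file with a fixed, well-known name, the fixed name is what the artifacts directive references, and a later step sources it to recover the dynamic name:

```shell
# Producer (inside the pipe): build the dynamic filename from the
# pipe's input variables, then record it under a FIXED name so the
# artifacts directive can reference it statically.
ID="myapp"
VERSION="1.2.3"
PACKAGE_FILE="${ID}.${VERSION}.zip"
echo "PACKAGE_FILE=${PACKAGE_FILE}" > pack.meta.env

# Consumer (a later step): pack.meta.env arrives as an artifact;
# source it to recover the dynamic filename.
source pack.meta.env
echo "Package to push: ${PACKAGE_FILE}"
# prints: Package to push: myapp.1.2.3.zip
```

The pipeline would then declare `pack.meta.env` (and the package glob) under artifacts, since those paths are static.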

Is the name pipe.meta.env a special name? I can't find that filename specified anywhere in the deploy-aws-lambda code.

I also found it unclear where this file is being created.


In that lambda repo you reference, the readme also shows this example of using the built-in variable $BITBUCKET_PIPE_SHARED_STORAGE_DIR in the pipe:

then using it in a script step in the pipe:

- VERSION=$(jq --raw-output '.Version' $BITBUCKET_PIPE_SHARED_STORAGE_DIR/aws-lambda-deploy-env)

which, according to this, means BITBUCKET_PIPE_SHARED_STORAGE_DIR is not usable outside the pipe.

In my testing, I found that BITBUCKET_PIPE_SHARED_STORAGE_DIR did not have a value outside the pipe. Tested using: echo "BITBUCKET_PIPE_SHARED_STORAGE_DIR = $BITBUCKET_PIPE_SHARED_STORAGE_DIR"


At the moment, I am all set. I am using the solution I mention above: create a file in the pipe and use that file name as the artifact, then read in that artifact file in a subsequent step. I don't add a folder path; the file just lives in the root folder with the checked-out code.

