
Bitbucket Pipes recipes: December 2020



Question: How can I use the datadog-send-event pipe to send an event when the build fails?
Recipe: Use the $BITBUCKET_EXIT_CODE variable in the after-script section.
$BITBUCKET_EXIT_CODE holds the exit code of the step: it is 0 if the step succeeded and non-zero if it failed.

script:
  - <some build logic>
after-script:
  - ALERT_TYPE="success"
  - if [[ $BITBUCKET_EXIT_CODE -ne 0 ]]; then ALERT_TYPE="error"; fi
  - pipe: atlassian/datadog-send-event:1.1.2
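For context, a complete step might look like the following sketch. The DD_API_KEY secured variable and the TITLE/TEXT values are illustrative assumptions, not part of the original recipe:

```yaml
pipelines:
  default:
    - step:
        name: Build and notify Datadog
        script:
          - <some build logic>
        after-script:
          # $BITBUCKET_EXIT_CODE is 0 on success, non-zero on failure
          - ALERT_TYPE="success"
          - if [[ $BITBUCKET_EXIT_CODE -ne 0 ]]; then ALERT_TYPE="error"; fi
          - pipe: atlassian/datadog-send-event:1.1.2
            variables:
              API_KEY: $DD_API_KEY        # assumed secured repository variable
              TITLE: "Pipeline finished"
              TEXT: "Step exited with code $BITBUCKET_EXIT_CODE"
              ALERT_TYPE: $ALERT_TYPE
```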


Question: How can I deploy to a sub-folder of an AWS S3 bucket rather than the root directory?
Recipe: Set the aws-s3-deploy pipe's S3_BUCKET parameter to 's3-bucket-name/logs'; the files will then be stored under the "logs" prefix of the s3-bucket-name bucket:

- echo "Pipeline test" > "$(date +"%Y_%m_%d_%I_%M_%p").txt"
- pipe: atlassian/aws-s3-deploy:0.3.5
  variables:
    S3_BUCKET: 's3-bucket-name/logs'
    LOCAL_PATH: $(pwd)
    EXTRA_ARGS: "--exclude=* --include=2020*.txt"


Question: Do we have a pipeline example for an Elastic Beanstalk multi-container Docker deployment?
Recipe: A working example of deploying a multicontainer Docker environment as an AWS Elastic Beanstalk application with the aws-elasticbeanstalk-deploy pipe can be found in the example repository bitbucketpipelines/example-aws-elasticbeanstalk-deploy-docker-multicontainer.
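As a rough sketch of how that pipe is invoked (the region, application and environment names here are hypothetical, and the pipe version may differ; the full multicontainer setup lives in the example repository above):

```yaml
- pipe: atlassian/aws-elasticbeanstalk-deploy:0.5.0
  variables:
    AWS_ACCESS_KEY_ID: $AWS_ACCESS_KEY_ID          # secured repository variables
    AWS_SECRET_ACCESS_KEY: $AWS_SECRET_ACCESS_KEY
    AWS_DEFAULT_REGION: 'us-east-1'                # hypothetical region
    APPLICATION_NAME: 'my-multicontainer-app'      # hypothetical name
    ENVIRONMENT_NAME: 'my-multicontainer-env'      # hypothetical name
    ZIP_FILE: 'application.zip'                    # archive with Dockerrun.aws.json
```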


Question: When I deploy my files with ftp-deploy, it uses a lot of build minutes. How can I optimize it?
Recipe: Use the rsync-deploy pipe instead.

According to the rsync man page:

Rsync is famous for its delta-transfer algorithm, which reduces the amount of data sent over the network by sending only the differences between the source files and the existing files in the destination.

Rsync finds files that need to be transferred using a "quick check" algorithm (by default) that looks for files that have changed in size or in last-modified time. Any changes in the other preserved attributes (as requested by options) are made on the destination file directly when the quick check indicates that the file’s data does not need to be updated.

- pipe: atlassian/rsync-deploy:0.4.2
  variables:
    USER: 'ec2-user'
    REMOTE_PATH: '/var/www/build/'
    LOCAL_PATH: 'build'
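Note that the snippet above does not show the target host. A fuller sketch might look like the following, where $PRODUCTION_HOST is a hypothetical secured repository variable holding the server address:

```yaml
- pipe: atlassian/rsync-deploy:0.4.2
  variables:
    USER: 'ec2-user'
    SERVER: $PRODUCTION_HOST        # hypothetical secured variable (host or IP)
    REMOTE_PATH: '/var/www/build/'
    LOCAL_PATH: 'build'
```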


Question: I'm trying to follow the guide to manage multiple repository SSH keys. I've added my SSH key as a secured variable, encoded in base64. How do I decode it to pass it into the pipe's variables (parameters)?
Recipe: Provide only the base64-encoded SSH_KEY as a variable; the pipe decodes it to ~/.ssh/pipelines_id automatically.
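For example, with the ssh-run pipe (the MY_SSH_KEY and PRODUCTION_HOST secured variables and the user are illustrative assumptions; other Atlassian pipes accept SSH_KEY the same way):

```yaml
- pipe: atlassian/ssh-run:0.2.2
  variables:
    SSH_USER: 'ec2-user'          # hypothetical user
    SERVER: $PRODUCTION_HOST      # hypothetical secured variable
    SSH_KEY: $MY_SSH_KEY          # base64-encoded; the pipe decodes it itself
    COMMAND: 'echo "connected"'
```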


You can find more recipes by following the recipe tag in the Atlassian Community.



