Bitbucket Pipes recipes: December 2020
Question: How do I send a notification with the datadog-send-event pipe if the build fails?
Recipe: Use the $BITBUCKET_EXIT_CODE variable in the after-script section.
$BITBUCKET_EXIT_CODE is the exit code of the step: it is set to 0 if the step succeeded and to a non-zero value otherwise.
script:
  - <some build logic>
after-script:
  - if [[ $BITBUCKET_EXIT_CODE -ne 0 ]]; then ALERT_TYPE="error" ; fi
  - pipe: atlassian/datadog-send-event:1.1.2
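A fuller step might look like the sketch below. The step name, the DATADOG_API_KEY repository variable, and the TITLE text are illustrative assumptions, so check the pipe's README for the authoritative variable list:

```yaml
- step:
    name: Build and notify
    script:
      - <some build logic>
    after-script:
      # Default to a success event; downgrade to error if the step failed
      - ALERT_TYPE="success"
      - if [[ $BITBUCKET_EXIT_CODE -ne 0 ]]; then ALERT_TYPE="error" ; fi
      - pipe: atlassian/datadog-send-event:1.1.2
        variables:
          API_KEY: $DATADOG_API_KEY  # secured repository variable (illustrative name)
          TITLE: "Pipelines build finished: $ALERT_TYPE"
          ALERT_TYPE: $ALERT_TYPE
```

Because all commands in a step run in the same shell session, the ALERT_TYPE value set earlier in after-script is available when the pipe's variables are expanded.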
Question: How do I deploy to a sub-folder of an AWS S3 bucket rather than to the root directory?
Recipe: The aws-s3-deploy pipe's S3_BUCKET parameter can be set to 's3-bucket-name/logs', and the files will be stored in the "logs" folder of s3-bucket-name:
script:
  - echo "Pipeline test" > "$(date +"%Y_%m_%d_%I_%M_%p").txt"
  - pipe: atlassian/aws-s3-deploy:0.3.5
    variables:
      S3_BUCKET: 's3-bucket-name/logs'
      EXTRA_ARGS: "--exclude=* --include=2020*.txt"
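A complete step also needs the pipe's required AWS credential variables. In the sketch below the credentials are assumed to be configured as repository variables, and the region, bucket, and paths are illustrative:

```yaml
- step:
    name: Deploy logs to an S3 sub-folder
    script:
      - echo "Pipeline test" > "$(date +"%Y_%m_%d_%I_%M_%p").txt"
      - pipe: atlassian/aws-s3-deploy:0.3.5
        variables:
          AWS_ACCESS_KEY_ID: $AWS_ACCESS_KEY_ID
          AWS_SECRET_ACCESS_KEY: $AWS_SECRET_ACCESS_KEY
          AWS_DEFAULT_REGION: 'us-east-1'
          S3_BUCKET: 's3-bucket-name/logs'   # bucket name plus sub-folder
          LOCAL_PATH: '.'
          EXTRA_ARGS: "--exclude=* --include=2020*.txt"
```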
Question: Do we have a pipeline example for an Elastic Beanstalk multi-container Docker deployment?
Recipe: A working example of deploying a multicontainer Docker environment as an AWS Elastic Beanstalk application with the aws-elasticbeanstalk-deploy pipe can be found in the example repository bitbucketpipelines/example-aws-elasticbeanstalk-deploy-docker-multicontainer
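A minimal deployment step with that pipe might look like the following sketch. The pipe version, application and environment names, and bucket are illustrative assumptions; see the example repository and the pipe's README for the exact setup:

```yaml
- step:
    name: Deploy to Elastic Beanstalk
    script:
      # Package the Dockerrun.aws.json that defines the multicontainer environment
      - zip application.zip Dockerrun.aws.json
      - pipe: atlassian/aws-elasticbeanstalk-deploy:0.5.0
        variables:
          AWS_ACCESS_KEY_ID: $AWS_ACCESS_KEY_ID
          AWS_SECRET_ACCESS_KEY: $AWS_SECRET_ACCESS_KEY
          AWS_DEFAULT_REGION: 'us-east-1'
          APPLICATION_NAME: 'my-multicontainer-app'
          ENVIRONMENT_NAME: 'my-multicontainer-app-env'
          ZIP_FILE: 'application.zip'
```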
Question: When I deploy my files with the ftp-deploy pipe, it uses a lot of build minutes. How can I optimize it?
Recipe: Use the rsync-deploy pipe for this case.
According to the rsync man page:
Rsync is famous for its delta-transfer algorithm, which reduces the amount of data sent over the network by sending only the differences between the source files and the existing files in the destination.
Rsync finds files that need to be transferred using a "quick check" algorithm (by default) that looks for files that have changed in size or in last-modified time. Any changes in the other preserved attributes (as requested by options) are made on the destination file directly when the quick check indicates that the file’s data does not need to be updated.
- pipe: atlassian/rsync-deploy:0.4.2
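A sketch of a deployment step, assuming SSH access to the server is already configured for Pipelines; the user, server, and path values are illustrative:

```yaml
- step:
    name: Deploy over rsync
    script:
      - pipe: atlassian/rsync-deploy:0.4.2
        variables:
          USER: 'deploy-user'
          SERVER: 'example.com'
          REMOTE_PATH: '/var/www/app'
          LOCAL_PATH: 'build/*'
          DEBUG: 'true'   # optional: prints the rsync command being run
```

On repeated deployments, rsync's quick-check algorithm skips files whose size and modification time are unchanged, which is what saves the build minutes compared to re-uploading everything over FTP.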
Question: I'm trying to follow the guide to manage multiple repository SSH keys. I've added my SSH key as a secured variable, encoded in base64. But how do I decode it to pass it into the pipe's variables (parameters)?
Recipe: You only need to provide the base64-encoded SSH_KEY as a variable; the pipe will decode it to ~/.ssh/pipelines_id automatically.
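For example, with the ssh-run pipe: encode the private key locally, store the output as a secured repository variable (here called MY_SSH_KEY, an illustrative name), and pass it to the pipe still encoded. The pipe version, user, server, and command are also illustrative:

```yaml
# Locally, encode the key and paste the output into a secured repository variable:
#   base64 -w 0 < ~/.ssh/id_rsa
- step:
    name: Run a remote command
    script:
      - pipe: atlassian/ssh-run:0.2.8
        variables:
          SSH_USER: 'deploy-user'
          SERVER: 'example.com'
          SSH_KEY: $MY_SSH_KEY   # still base64-encoded; the pipe decodes it
          COMMAND: 'echo "connected"'
```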
You can find more recipes by following the recipe tag in the Atlassian Community.
Oleksandr Kyrdan, Atlassian Team