
Bitbucket Pipes recipes: December 2020


 

Question: How can I send a notification with the datadog-send-event pipe if the build fails?
Recipe: You can use the $BITBUCKET_EXIT_CODE variable in the after-script section.
$BITBUCKET_EXIT_CODE is the exit code of the step; it is set to 0 if the step succeeds.

script:
  - <some build logic>
after-script:
  - ALERT_TYPE="success"
  - if [[ $BITBUCKET_EXIT_CODE -ne 0 ]]; then ALERT_TYPE="error"; fi
  - pipe: atlassian/datadog-send-event:1.1.2
    variables:
      API_KEY: $API_KEY
      ALERT_TYPE: $ALERT_TYPE

 

Question: How do I deploy to a sub-folder in an AWS S3 bucket rather than the root directory?
Recipe: The aws-s3-deploy pipe's S3_BUCKET parameter accepts a sub-folder, e.g. 's3-bucket-name/logs', and the files will be stored in the "logs" folder of s3-bucket-name:

script:
  - echo "Pipeline test" > "$(date +"%Y_%m_%d_%I_%M_%p").txt"
  - pipe: atlassian/aws-s3-deploy:0.3.5
    variables:
      AWS_ACCESS_KEY_ID: $AWS_ACCESS_KEY_ID
      AWS_SECRET_ACCESS_KEY: $AWS_SECRET_ACCESS_KEY
      AWS_DEFAULT_REGION: 'us-east-1'
      S3_BUCKET: 's3-bucket-name/logs'
      LOCAL_PATH: $(pwd)
      EXTRA_ARGS: "--exclude=* --include=2020*.txt"

 

Question: Do we have a pipeline example for an Elastic Beanstalk multi-container Docker deployment?
Recipe: A working example of deploying a multicontainer Docker environment as an AWS Elastic Beanstalk application to the AWS Cloud with the aws-elasticbeanstalk-deploy pipe can be found in the example repository bitbucketpipelines/example-aws-elasticbeanstalk-deploy-docker-multicontainer.
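For reference, a minimal sketch of such a step might look like the following. The application name, environment name, zip file name, and pipe version are placeholders, and the zip is assumed to contain the Dockerrun.aws.json that describes the multicontainer environment; check the example repository and the pipe's README for the exact variables and current version.

script:
  # Package the multicontainer definition (placeholder file name)
  - zip application.zip Dockerrun.aws.json
  - pipe: atlassian/aws-elasticbeanstalk-deploy:0.6.7  # version is illustrative
    variables:
      AWS_ACCESS_KEY_ID: $AWS_ACCESS_KEY_ID
      AWS_SECRET_ACCESS_KEY: $AWS_SECRET_ACCESS_KEY
      AWS_DEFAULT_REGION: 'us-east-1'
      APPLICATION_NAME: 'my-multicontainer-app'  # placeholder
      ENVIRONMENT_NAME: 'my-multicontainer-env'  # placeholder
      ZIP_FILE: 'application.zip'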

 

Question: When I deploy my files with ftp-deploy, it uses a lot of build minutes. How can I optimize it?
Recipe: Use the rsync-deploy pipe for this case.

According to the rsync man page:

Rsync is famous for its delta-transfer algorithm, which reduces the amount of data sent over the network by sending only the differences between the source files and the existing files in the destination.

Rsync finds files that need to be transferred using a "quick check" algorithm (by default) that looks for files that have changed in size or in last-modified time. Any changes in the other preserved attributes (as requested by options) are made on the destination file directly when the quick check indicates that the file’s data does not need to be updated.

script:
  - pipe: atlassian/rsync-deploy:0.4.2
    variables:
      USER: 'ec2-user'
      SERVER: '127.0.0.1'
      REMOTE_PATH: '/var/www/build/'
      LOCAL_PATH: 'build'

 

Question: I'm trying to follow the guide to manage multiple repository SSH keys. I've added my SSH key as a secure variable, encoded in base64. But how do I decode it to pass it into the pipe's variables (parameters)?
Recipe: You only need to provide the base64-encoded SSH_KEY as a variable; the pipe will decode it to ~/.ssh/pipelines_id automatically.
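For example, if the base64-encoded key is stored in a secure repository variable (here called MY_SSH_KEY, a placeholder name), it can be passed straight through to a pipe that accepts an SSH_KEY variable, such as the rsync-deploy pipe shown above; the pipe takes care of decoding it:

script:
  - pipe: atlassian/rsync-deploy:0.4.2
    variables:
      USER: 'ec2-user'
      SERVER: '127.0.0.1'
      REMOTE_PATH: '/var/www/build/'
      LOCAL_PATH: 'build'
      # SSH_KEY expects the base64-encoded private key; the pipe decodes it
      SSH_KEY: $MY_SSH_KEY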

 

You can find more recipes by following the recipe tag in the Atlassian Community.

 
