How to write .yml to roll back to the previous build & deploy using Artifact in Bitbucket Pipelines

Currently, build and deploy are done in one step, and I'm trying to modify the .yml to separate the build and deploy steps.
My understanding is that only the deploy steps shown in the pipeline history can be "redeployed".
However, the build results written to "/opt/atlassian/pipelines/agent/build/**" seem to be deleted when the run completes.
So to make "Redeploy" work, I think I need to use "artifacts" in the .yml to upload the build result to storage such as Amazon S3.
Could you tell me how to write the .yml for the following?
- How to upload build results to Amazon S3 using "artifacts".
- How to deploy from the build result uploaded to Amazon S3.

1 answer

Answer accepted

We use a similar step in our bitbucket-pipelines.yml to create a "download" from the artifact (build result files). It can be implemented in one line with curl.

If you want to use S3 instead, use the AWS command-line client; I'm quite sure it will work just as well.
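For reference, a minimal sketch of that curl one-liner, publishing the packaged build output to the repository's Downloads section via the Bitbucket REST API. BB_AUTH_STRING is an assumed repository variable you would define yourself (username:app-password), and out/ is an assumed build output directory:

script:
  # zip must be available in the build image; package the build output first
  - zip -r build-${BITBUCKET_BUILD_NUMBER}.zip out
  # BB_AUTH_STRING is a repository variable you define (username:app-password)
  - curl -X POST --user "${BB_AUTH_STRING}" --form files=@"build-${BITBUCKET_BUILD_NUMBER}.zip" "https://api.bitbucket.org/2.0/repositories/${BITBUCKET_REPO_OWNER}/${BITBUCKET_REPO_SLUG}/downloads"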

Thank you for your reply; good to know the curl approach can be done in one line.

I have set up the following yml and confirmed that "Redeploy" works with it.

Since artifacts are deleted after 14 days, I would like to create a directory named after BITBUCKET_BUILD_NUMBER on the S3 side and save the build result there. When deploying (aws s3 sync), can I deploy from that specific directory?

===

pipelines:
  branches:
    develop:
      - step:
          name: Build(Dev)
          script:
            - npm install
            - chmod +x ./deploy.sh
            - STAGE=dev ./deploy.sh
            - npm run build:dev
          artifacts:
            - out/**
      - step:
          name: Deploy(Dev)
          deployment: Staging
          script:
            - aws configure set default.aws_secret_access_key ${AWS_SECRET_ACCESS_KEY}
            - aws configure set default.aws_access_key_id ${AWS_ACCESS_KEY_ID}
            - aws configure set default.region ${AWS_DEFAULT_REGION}
            - export AWS_SDK_LOAD_CONFIG=1
            - aws s3 sync ./out/ s3://hogehoge

@Aron Gombas _Midori_ 

Your answer was very helpful. Thank you very much.

In the end, I was able to solve it with the following .yml, so I'm sharing it here.
===

pipelines:
  branches:
    develop:
      - step:
          name: Build (Dev)
          script:
            - aws configure set default.aws_secret_access_key "****"
            - aws configure set default.aws_access_key_id "****"
            - aws configure set default.region "****"
            - export AWS_SDK_LOAD_CONFIG=1
            - npm install
            - chmod +x ./deploy.sh
            - STAGE=dev ./deploy.sh
            - npm run build:dev
            - aws s3 sync ./out/ s3://hogehoge-backup/$BITBUCKET_BUILD_NUMBER
      - step:
          name: Deploy (Dev)
          deployment: Staging
          script:
            - aws configure set default.aws_secret_access_key ${AWS_SECRET_ACCESS_KEY}
            - aws configure set default.aws_access_key_id ${AWS_ACCESS_KEY_ID}
            - aws configure set default.region ${AWS_DEFAULT_REGION}
            - export AWS_SDK_LOAD_CONFIG=1
            - aws s3 sync s3://hogehoge-backup/$BITBUCKET_BUILD_NUMBER s3://hogehoge-stg
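If you ever need to go back further than the most recent build, one option is a manually triggered custom pipeline. This is only a sketch under a few assumptions: the custom: section sits alongside branches: under pipelines:, and BUILD_TO_DEPLOY is a hypothetical variable you enter when running the pipeline, set to one of the build numbers saved in hogehoge-backup:

  custom:
    rollback-dev:
      - variables:
          # BUILD_TO_DEPLOY: an earlier BITBUCKET_BUILD_NUMBER kept in hogehoge-backup (entered at run time)
          - name: BUILD_TO_DEPLOY
      - step:
          name: Rollback (Dev)
          deployment: Staging
          script:
            - aws configure set default.aws_secret_access_key ${AWS_SECRET_ACCESS_KEY}
            - aws configure set default.aws_access_key_id ${AWS_ACCESS_KEY_ID}
            - aws configure set default.region ${AWS_DEFAULT_REGION}
            - aws s3 sync s3://hogehoge-backup/${BUILD_TO_DEPLOY} s3://hogehoge-stg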

Hi @SatoruNakada @Aron Gombas _Midori_ ,

Also, you could simplify your pipeline with pipes. The aws-s3-deploy pipe syncs a local directory with an S3 prefix, recursively copying new and updated files from the source directory to the destination:

script:
  - pipe: atlassian/aws-s3-deploy:1.1.0
    variables:
      S3_BUCKET: 'my-bucket-name'
      LOCAL_PATH: 'build'
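If the bucket isn't public, the pipe also needs AWS credentials. A sketch with the commonly required variables, reusing the repository variables and bucket/path names from the yml shared above (all names are just placeholders for your own values):

script:
  - pipe: atlassian/aws-s3-deploy:1.1.0
    variables:
      # repository variables defined in the Bitbucket settings
      AWS_ACCESS_KEY_ID: $AWS_ACCESS_KEY_ID
      AWS_SECRET_ACCESS_KEY: $AWS_SECRET_ACCESS_KEY
      AWS_DEFAULT_REGION: $AWS_DEFAULT_REGION
      S3_BUCKET: 'hogehoge-stg'
      LOCAL_PATH: 'out'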

Thanks for contributing to the Atlassian community!

 

Cheers,
Oleksandr Kyrdan

