Currently, build and deploy are done in a single step, and I'm trying to modify the .yml to separate the build and deploy steps.
My understanding is that only deploy steps can be "redeployed" from the pipeline history.
However, the build results written to /opt/atlassian/pipelines/agent/build/** seem to be deleted when the run completes.
To make "redeploy" work, I think the build result needs to be uploaded to storage such as Amazon S3, using "artifacts" in the .yml.
Could you tell me how to write the .yml for the following?
- How to upload build results to Amazon S3 using "artifacts".
- How to deploy from the build result uploaded to Amazon S3.
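For reference, a rough sketch of the separation I am aiming for (the step names, the out/ output directory, and the bucket name are placeholders for my project):
===
pipelines:
  branches:
    develop:
      - step:
          name: Build
          script:
            - npm install
            - npm run build
          artifacts:
            # files matched here are passed on to later steps
            - out/**
      - step:
          name: Deploy
          deployment: Staging
          script:
            # this is the part I am unsure about: getting the build
            # result into S3 so it survives for later redeploys
            - aws s3 sync ./out/ s3://my-bucket
===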
We use a similar step in our bitbucket-pipelines.yml to create a "download" from the artifact (build result files). It can be implemented in one line with curl.
If you want to use S3, you should use the AWS command line client and I'm quite sure it will work.
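For example, a minimal sketch (assuming you zip the build output first; $BB_AUTH_STRING is a placeholder for a user:app-password credential stored as a secured repository variable):
===
script:
  - zip -r out.zip out/
  # one-line upload to the repository's Downloads section via the Bitbucket REST API
  - curl -X POST --user "$BB_AUTH_STRING" "https://api.bitbucket.org/2.0/repositories/$BITBUCKET_WORKSPACE/$BITBUCKET_REPO_SLUG/downloads" --form files=@out.zip
===
For the S3 variant, aws s3 cp or aws s3 sync from the AWS CLI is the equivalent one-liner.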
Thank you for your reply; good to know the curl approach can be done in one line.
I have set up the following yml and confirmed that I can now "Redeploy" from the pipeline history.
However, artifacts are deleted after 14 days. So instead I'd like to create a directory on the S3 side named after BITBUCKET_BUILD_NUMBER and save the build result there. When deploying (aws s3 sync), can I deploy by specifying that directory?
===
pipelines:
  branches:
    develop:
      - step:
          name: Build(Dev)
          script:
            - npm install
            - chmod +x ./deploy.sh
            - STAGE=dev ./deploy.sh
            - npm run build:dev
          artifacts:
            - out/**
      - step:
          name: Deploy(Dev)
          deployment: Staging
          script:
            - aws configure set default.aws_secret_access_key ${AWS_SECRET_ACCESS_KEY}
            - aws configure set default.aws_access_key_id ${AWS_ACCESS_KEY_ID}
            - aws configure set default.region ${AWS_DEFAULT_REGION}
            - export AWS_SDK_LOAD_CONFIG=1
            - aws s3 sync ./out/ s3://hogehoge
===
Your answer was very helpful. Thank you very much.
In the end, I was able to solve it with the following .yml, so I will share it.
===
pipelines:
  branches:
    develop:
      - step:
          name: Build (Dev)
          script:
            - aws configure set default.aws_secret_access_key "****"
            - aws configure set default.aws_access_key_id "****"
            - aws configure set default.region "****"
            - export AWS_SDK_LOAD_CONFIG=1
            - npm install
            - chmod +x ./deploy.sh
            - STAGE=dev ./deploy.sh
            - npm run build:dev
            - aws s3 sync ./out/ s3://hogehoge-backup/$BITBUCKET_BUILD_NUMBER
      - step:
          name: Deploy (Dev)
          deployment: Staging
          script:
            - aws configure set default.aws_secret_access_key ${AWS_SECRET_ACCESS_KEY}
            - aws configure set default.aws_access_key_id ${AWS_ACCESS_KEY_ID}
            - aws configure set default.region ${AWS_DEFAULT_REGION}
            - export AWS_SDK_LOAD_CONFIG=1
            - aws s3 sync s3://hogehoge-backup/$BITBUCKET_BUILD_NUMBER s3://hogehoge-stg
===
Hi @SatoruNakada @Aron Gombas _Midori_ ,
Also, you could simplify your pipeline with pipes. The aws-s3-deploy pipe syncs directories and S3 prefixes, recursively copying new and updated files from the local source directory to the destination:
script:
  - pipe: atlassian/aws-s3-deploy:1.1.0
    variables:
      S3_BUCKET: 'my-bucket-name'
      LOCAL_PATH: 'build'
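For example (a sketch: the bucket name mirrors the hogehoge-stg bucket used above, and the AWS_* values are the same repository variables as in the earlier steps), the first configuration's Deploy(Dev) step that syncs the local out/ artifact could be rewritten as:

- step:
    name: Deploy (Dev)
    deployment: Staging
    script:
      - pipe: atlassian/aws-s3-deploy:1.1.0
        variables:
          AWS_ACCESS_KEY_ID: $AWS_ACCESS_KEY_ID
          AWS_SECRET_ACCESS_KEY: $AWS_SECRET_ACCESS_KEY
          AWS_DEFAULT_REGION: $AWS_DEFAULT_REGION
          S3_BUCKET: 'hogehoge-stg'
          LOCAL_PATH: 'out'

The pipe authenticates with the variables you pass it, so the aws configure setup lines are no longer needed.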
Thanks for contributing to the Atlassian community!
Cheers,
Oleksandr Kyrdan