How to deploy code to EC2 Ubuntu using Bitbucket Pipelines?
Below is the configuration I currently use to deploy code to an EC2 Ubuntu instance with GitLab CI/CD:
stages:
  - backupvm1   # back up files on web VM1

.before_script_stg_template:
  before_script:
    ## Run ssh-agent (inside the build environment)
    - eval $(ssh-agent -s)
    - echo "$EC2_KEY" | tr -d '\r' | ssh-add -
    - mkdir -p ~/.ssh
    - chmod 700 ~/.ssh
    - echo "$EC2_SSH_KNOWN_HOSTS" >> ~/.ssh/known_hosts
  rules:
    - if: $CI_COMMIT_BRANCH == "master"
      when: always
    - when: never

continuous_backup_to_vm1:
  stage: backupvm1
  extends:
    - .before_script_stg_template
  script:
    - ssh -t ubuntu@$EC2_IP -o StrictHostKeyChecking=no "rm -rf folder_bkp && cp -r folder folder_bkp && exit"
  rules:
    - if: $CI_PIPELINE_STATUS != "failed"
Hi @Raghupathy M and welcome to the community!
The script that you shared here seems to use SSH to connect to your EC2 server and then execute some commands.
If you want to connect to your server via SSH from Pipelines, you can add an SSH key pair under Repository settings > Pipelines > SSH keys, add the public key to the authorized_keys file of the user on your EC2 server, and add the server to the known hosts on the same settings page. The Bitbucket Pipelines documentation on using SSH keys has more detailed steps for this setup.
If you follow these steps, then you won't need to do the setup that your before_script does.
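As an illustration only (the EC2_IP repository variable and the ubuntu user are assumptions carried over from your GitLab config), a step like this can verify that SSH access works once the key and known hosts are configured in the repository settings:

pipelines:
  branches:
    master:
      - step:
          name: Test SSH access to EC2
          script:
            # The key from Repository settings > Pipelines > SSH keys is made
            # available to the build container automatically, so plain ssh works
            # without any before_script setup.
            - ssh ubuntu@$EC2_IP "echo connected"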
Next, the rules in your GitLab config mean that the job runs only on the master branch.
In Bitbucket Pipelines, you can configure your pipeline to run on master only with a bitbucket-pipelines.yml file as follows:
pipelines:
  branches:
    master:
      - step:
          name: Pipeline for master branch
          script:
            - <your command here>
You can find more details about start conditions in the Bitbucket Pipelines documentation.
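For reference, a small sketch of the most common start conditions (the step names and commands here are only placeholders):

pipelines:
  default:              # runs for commits on any branch without a specific mapping
    - step:
        name: Default build
        script:
          - echo "build"
  branches:
    master:             # runs only for commits pushed to master
      - step:
          name: Deploy from master
          script:
            - echo "deploy"
  custom:
    manual-backup:      # runs only when triggered manually from the Bitbucket UI
      - step:
          name: Manual backup
          script:
            - echo "backup"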
Regarding the script, we first need to understand what it is doing in order to better help you.
I see that the command executed on your server deletes a directory and then copies another one. How is the deployment done in this case? I can see that the directory named folder gets copied, but does it already exist on your server? Does it get copied there automatically when you run the GitLab pipeline?
In Bitbucket Pipelines builds that run on our own infrastructure, a Docker container starts for each step. Your repo is cloned into that container (unless you disable cloning) and then the commands from your step's script are executed. When the step finishes, the container is destroyed.
If you want to deploy a certain folder of your repo to your server via SSH from Bitbucket Pipelines, you can use either the rsync pipe (atlassian/rsync-deploy) or the scp pipe (atlassian/scp-deploy) in the script of the bitbucket-pipelines.yml file of your repo.
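As a rough sketch with the scp pipe (the EC2_IP variable, the ubuntu user, the target path and the pipe version are assumptions; check the pipe's page for the current version and full list of variables):

pipelines:
  branches:
    master:
      - step:
          name: Copy folder to EC2
          script:
            # Copies the repo directory "folder" to the home directory of the
            # ubuntu user on the server, using the SSH key from the repo settings.
            - pipe: atlassian/scp-deploy:0.3.3
              variables:
                USER: 'ubuntu'
                SERVER: $EC2_IP
                REMOTE_PATH: '/home/ubuntu/'
                LOCAL_PATH: 'folder'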
If you want to run a command on your server, you can use the ssh-run pipe (atlassian/ssh-run).
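For example, a sketch that reproduces your backup command with that pipe (again, the EC2_IP variable is an assumption and the version tag may not be the latest):

pipelines:
  branches:
    master:
      - step:
          name: Back up folder on EC2
          script:
            # Runs the backup command on the server over SSH.
            - pipe: atlassian/ssh-run:0.4.1
              variables:
                SSH_USER: 'ubuntu'
                SERVER: $EC2_IP
                COMMAND: 'rm -rf folder_bkp && cp -r folder folder_bkp'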
All three pipes require you to set up SSH access first, as I explained earlier.
Please feel free to reach out if you have any questions.
Kind regards,
Theodora
How to ensure my SSH keys are safe in Bitbucket
Is there any strategy or service for ensuring the safety of my AWS EC2 SSH keys? My organisation needs a secure practice for sharing AWS EC2 SSH keys.