CI / CD Automation with Pipelines

I am looking for advice on best practices for automating the movement of code from the repository, to storage, and on to the different environments.

I currently have a pipeline set up to sync the dev, stage, and prod branches to AWS S3, each into its respective bucket, but I have run into several kinks along the way.

My goal is to go from Bitbucket -> S3 -> versioned artifact -> EC2 server instance.

Problems I am encountering:

zip fails in the different build images, even with apt-get update and install in the script, so for now I am just sending all of the files across with sync (see the sketch after the snippet below). I am 99% sure this is not best practice.

========

pipelines:
  branches:
    development:
      - step:
          script:
            # without --recursive, "aws s3 rm" on a prefix removes nothing
            - aws s3 rm s3://artefacts/development/ --recursive --region "us-west-2"
            # sync --delete already removes remote files that are gone locally,
            # so the rm above may be redundant
            - aws s3 sync --delete . s3://artefacts/development/ --region "us-west-2"

========
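
Is packaging a single archive, rather than syncing the whole tree, the right direction here? This is the kind of step I was attempting (sketch only: the atlassian/default-image:4 image and the build.zip name are just my guesses at a Debian-based image where apt-get works):

========

    development:
      - step:
          # assumption: a Debian-based image, so apt-get is available
          image: atlassian/default-image:4
          script:
            - apt-get update && apt-get install -y zip
            # archive everything except the git metadata
            - zip -r build.zip . -x '.git/*'
            - aws s3 cp build.zip s3://artefacts/development/build.zip --region "us-west-2"

========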

While the sync approach above gets everything to the S3 bucket, in my own janky way, I am struggling with getting the files onto the server.

Do I create a step in the pipeline for this? If so, how does it securely talk to the server and kick off the copy? This interaction seems logical to me, but I am not sure how to go about it. The AWS CLI is installed on the servers (Windows).
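
Is AWS Systems Manager Run Command a sensible bridge for that last hop? Something like the following is what I am imagining (sketch only: it assumes the SSM agent is running on the Windows instances, that they have an instance profile allowing reads from the bucket, and the Environment tag and C:\app path are made up by me):

========

      - step:
          script:
            # hypothetical instance tag and target path -- adjust to reality;
            # SSM runs the inner "aws s3 sync" on the instance itself
            - >
              aws ssm send-command
              --region "us-west-2"
              --document-name "AWS-RunPowerShellScript"
              --targets "Key=tag:Environment,Values=development"
              --parameters commands="aws s3 sync s3://artefacts/development/ C:\app"

========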

As far as the pipeline branches are concerned, I am using one IAM role and user for all three branches/buckets. I feel I should be using multiple IAM roles, each specific to a branch, plus additional roles specific to each bucket, so that if someone with elevated privileges changes the YAML file, it will break rather than silently deploying, even if the branches are modified. (This one troubles me, since I do not see branch-specific credentials under Environment Variables. I just added the single set of variables and it works on all three, so even if I do create multiple roles, how does one distinguish them per step, given that I am not referencing the credentials in the script at all right now?)
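
If I understand correctly, the AWS CLI picks up AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, and AWS_DEFAULT_REGION straight from the environment, which would explain why the single set of repository variables works in every step without appearing in the script. It looks like Bitbucket's deployment environments may be the per-step distinguishing mechanism, since variables defined on an environment are only exposed to steps tagged with that deployment. A sketch of what I mean (branch and environment names are illustrative; Test and Production are Bitbucket's defaults):

========

pipelines:
  branches:
    development:
      - step:
          deployment: Test         # this step sees only the Test environment's AWS_* variables
          script:
            - aws s3 sync --delete . s3://artefacts/development/ --region "us-west-2"
    production:
      - step:
          deployment: Production   # a separate, Production-scoped set of credentials
          script:
            - aws s3 sync --delete . s3://artefacts/production/ --region "us-west-2"

========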

I am brand new to this, so I am sure I am overlooking the basics, but I would love help in finding the best practices for this particular scenario.

Thanks in advance for any help!
