Hi all,
we are running Magento 2 on AWS infrastructure (a single EC2 instance, an Elasticsearch service, and a database service).
We have a Prod and a separate Stage environment up and running.
Right now our deployment is quite a manual process:
- Push code to the repo and create a PR to the stage branch
- Connect to the Stage instance via SSH, git pull the stage branch, and execute Magento commands like reindex, restart, etc.
- Test
- Create a PR to the master branch and repeat the same steps on Prod (connect to the instance, pull the master branch, execute the commands)
I would like to improve this with a more automated process, for example using Bitbucket Pipelines. So far I understand that I could use SCP to transfer my code to my environments, but how do I execute the update commands (or update the database, for example) on each server?
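From what I have read so far, I imagine the pipeline could simply SSH into each instance and run the same commands I currently type in by hand, instead of copying files around. Here is a rough sketch of what I have in mind (not tested; the host names, SSH user, path, and the pipe version are placeholders I made up, and I assume the SSH keys are already configured in Bitbucket):

```yaml
# bitbucket-pipelines.yml (rough sketch, placeholder values)
pipelines:
  branches:
    stage:
      - step:
          name: Deploy to Stage
          deployment: staging
          script:
            # run the same commands I currently run manually over SSH
            - pipe: atlassian/ssh-run:0.4.1
              variables:
                SSH_USER: 'ec2-user'
                SERVER: 'stage.example.com'
                COMMAND: >
                  cd /var/www/magento &&
                  git pull origin stage &&
                  bin/magento setup:upgrade &&
                  bin/magento indexer:reindex &&
                  bin/magento cache:flush
    master:
      - step:
          name: Deploy to Prod
          deployment: production
          script:
            - pipe: atlassian/ssh-run:0.4.1
              variables:
                SSH_USER: 'ec2-user'
                SERVER: 'prod.example.com'
                COMMAND: >
                  cd /var/www/magento &&
                  git pull origin master &&
                  bin/magento setup:upgrade &&
                  bin/magento indexer:reindex &&
                  bin/magento cache:flush
```

Is something along these lines a sensible direction, or is there a better pattern for running the commands on the servers?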
Also, how can I have different config files per environment? Right now the config files are in .gitignore, so they are not under version control, and we manage them manually on each server.
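One idea I had (I am not sure if this is the right approach) is to keep the Magento config files out of the repo as we do now, let each server keep its own copy, and use Bitbucket's per-environment deployment variables for anything the pipeline itself needs. A sketch, where DEPLOY_USER and DEPLOY_HOST are variable names I made up and would define per environment under the repository's Deployments settings:

```yaml
# sketch: one step per environment, differing only in the deployment
# name and the deployment variables defined for it in Bitbucket;
# the environment-specific config files stay on the server and are
# never touched by the deploy
- step:
    name: Deploy to Stage
    deployment: staging
    script:
      - pipe: atlassian/ssh-run:0.4.1
        variables:
          SSH_USER: $DEPLOY_USER
          SERVER: $DEPLOY_HOST
          COMMAND: 'cd /var/www/magento && git pull && bin/magento setup:upgrade'
```

Would that be a reasonable way to handle it, or should the config files be generated/templated by the pipeline instead?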
I hope you can give me some insight into where to start and what the final setup could look like. Right now I am a bit lost about how to improve this.
Thanks in advance!