Hello, I am taking my first steps into CD, and Bitbucket Pipelines looks very attractive.
Up to this point, I have SCP'd/FTP'd files one by one onto my VPS, which is starting to become a bit of a problem, and I need a smoother process in order to scale the solution. As I also want to start using Docker, I have an initial idea of what my new workflow should look like; however, I find it hard to implement. So:
I'm trying to set up a workflow for a website that looks as follows.
1. Git push code to Bitbucket
2. Code is tested in Pipelines
3. Code is deployed to VPS
4. VPS reloads with the new Docker image (refreshes)
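Concretely, I imagine the Pipelines side of steps 2–4 looking roughly like the sketch below. I have not gotten this to work yet, so the image name, SSH user/host, and paths are all placeholders, and I am assuming SSH keys would be configured in the repository settings:

```yaml
# bitbucket-pipelines.yml -- rough sketch, all names are placeholders
image: python:3.10

pipelines:
  branches:
    master:
      - step:
          name: Test
          script:
            - pip install -r requirements.txt
            - python manage.py test
      - step:
          name: Build and deploy
          services:
            - docker
          script:
            # build the image inside Pipelines
            - docker build -t mysite:latest .
            # ship it to the VPS without going through a registry
            - docker save mysite:latest | gzip | ssh deploy@my-vps "gunzip | docker load"
            # restart the containers with the new image
            - ssh deploy@my-vps "docker-compose -f /srv/mysite/docker-compose.yml up -d"
```

I am not sure whether the `docker save | ssh docker load` part is the sensible way to avoid Docker Hub, which is part of my question below.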
I find step 3 a bit unclear when it comes to Bitbucket Pipelines and Docker.
Given that I would not like to publish my Docker image on Docker Hub, would you first build the Docker image in Pipelines and then push it to the VPS, or would you push the code to the VPS and then have a script that rebuilds the Docker image?
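For the second option, I picture a small deploy script living on the VPS, something like the sketch below. The paths, image name, and the assumption that the server can pull from the repository are all made up on my end:

```shell
#!/bin/sh
# deploy.sh -- hypothetical rebuild-on-the-VPS script; paths and names are placeholders
set -e

APP_DIR=/srv/mysite

cd "$APP_DIR"
git pull origin master           # fetch the code that was just pushed
docker build -t mysite:latest .  # rebuild the image locally on the VPS
docker-compose up -d             # recreate the containers with the new image
```

Pipelines would then only run the tests and trigger this script over SSH, rather than building the image itself.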
Furthermore, where do I put the docker-compose file, or is it replaced by bitbucket-pipelines.yml?
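For context, my current understanding is that the two files do different jobs: bitbucket-pipelines.yml drives CI on Bitbucket's side, while docker-compose.yml describes how the containers run on the VPS, so I would keep it in the repository root next to the Dockerfile. My draft for the Django + Postgres part looks roughly like this (service names, ports, and credentials are placeholders):

```yaml
# docker-compose.yml -- draft sketch for a Django + Postgres site
version: "3"
services:
  web:
    build: .
    ports:
      - "8000:8000"
    depends_on:
      - db
    environment:
      - DATABASE_URL=postgres://app:secret@db:5432/app
  db:
    image: postgres:13
    environment:
      - POSTGRES_USER=app
      - POSTGRES_PASSWORD=secret
      - POSTGRES_DB=app
    volumes:
      - pgdata:/var/lib/postgresql/data
volumes:
  pgdata:
```

Please correct me if this split between the two files is wrong.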
Lastly, is the above an advisable workflow for a private project, or is there a standard way of doing it?
I have tried to find a step-by-step guide, but the closest I can find is for a Heroku application.
My environment is Ubuntu, Apache, Postgres, and Django, running a website on a VPS.
Much appreciation for some guidance on this point. I am looking forward to making the most of Bitbucket Pipelines, even if I am new to it.