I have a Pipelines setup using my own Docker container, configured with the exact same LAMP stack I use in production. However, I simply cannot get a basic sanity check for the presence of the web page to pass, and it appears that Apache has not started.
Apache is supposed to be started by a script that is executed as part of the Docker container, and this all works fine when running locally: the test that fails in Pipelines passes there. Is there something special I need to do to execute Dockerfile CMD statements as part of the pipeline?
Pipelines overrides the CMD in order to run the script in your bitbucket-pipelines.yml file.
To emulate our docker run command, see: https://confluence.atlassian.com/display/BITBUCKET/Debug+your+pipelines+locally+with+Docker#DebugyourpipelineslocallywithDocker-Step2:LogintoyourDockercontainer
If you follow the command in that link, you should see Apache fail to start locally as well.
As a workaround, you can take the CMD statement and make it the first command of your script.
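A minimal sketch of that workaround, assuming your image's CMD starts Apache via apachectl and your sanity check is a curl against localhost (both the image name and the check are placeholders; adjust them to match your actual Dockerfile and test):

```yaml
# bitbucket-pipelines.yml (hypothetical example)
image: myaccount/my-lamp-image:latest  # your custom LAMP image

pipelines:
  default:
    - step:
        script:
          # Start Apache ourselves, since Pipelines overrides the image's CMD.
          # Plain "apachectl start" daemonizes, so the script can continue.
          - apachectl start
          # Give Apache a moment to bind to port 80 before testing.
          - sleep 2
          # Sanity check: --fail makes curl (and the step) fail on an error page.
          - curl --fail http://localhost/
```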
You may need to run the command as a background process, and it's recommended you use supervisord if you want to run daemons in your container (to make sure things don't get flaky). https://docs.docker.com/engine/admin/using_supervisord/
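If you go the supervisord route, a minimal configuration might look like the following sketch (the file path, program name, and apachectl invocation are assumptions; check how your image installs Apache):

```ini
; /etc/supervisor/conf.d/apache.conf (hypothetical)
[supervisord]
nodaemon=true                       ; keep supervisord itself in the foreground

[program:apache2]
command=apachectl -D FOREGROUND     ; run Apache un-daemonized so supervisord can manage it
autorestart=true                    ; restart Apache if it crashes
stdout_logfile=/dev/stdout          ; surface Apache output in the container logs
stdout_logfile_maxbytes=0
```

In the Dockerfile you would then point CMD at supervisord; in a Pipelines step, where CMD is overridden, you would invoke supervisord (or the apachectl command) from your script instead.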