Background:
I am working on a pipeline where I have to build 3 images in a parallel step. All three images belong to the same project: one is the main API and two are worker images. I have 3 Dockerfiles, one for each. All 3 images fetch the same PHP version, install the same packages from the Ubuntu repo, and then install Composer dependencies. After that, each image is set up separately. For example, the API image exposes an endpoint on port 80, while the workers run under supervisor.
What I am trying to achieve:
My first thought was to build an image, upload it to ECR, and use that image as the base for each of these images. But this doesn't guarantee that my dependencies will be up to date. I am looking for a way to build one image in the same pipeline and use it in the next step as the base image, so I don't have to install dependencies separately for all 3 containers. Is this possible, or is there a better solution I can try?
Hello @Naveed Zahoor and welcome to the Community!
You could build the base image in a "normal" step, use docker save to export the image to a file within the build, and use artifacts to make this file available in the following parallel steps where you build the specifics of each image.
Within the parallel steps, you can use the docker load command to load the image from the artifact file, and this image will be available in that step's docker environment. You can then make use of this "local image" in your Dockerfile.
Following is an example YAML using the suggestion above:
options:
  docker: true # enable the Docker service for all steps
image: ubuntu:latest
pipelines:
  default:
    - step:
        name: "Build base image and export"
        script:
          - docker build -t mydockerrepo/imagename .
          - docker save --output base-image.docker mydockerrepo/imagename # save the base image, tagged as mydockerrepo/imagename, to a file named base-image.docker
        artifacts:
          - base-image.docker # export the base image file as an artifact to the next steps
    - parallel:
        steps:
          - step:
              name: "Build Image 1"
              script:
                - docker load --input ./base-image.docker # load the base image from the exported artifact
                - docker images # list available images; you should see the base image
                - docker build -t mydockerrepo/image1 . -f dev.Dockerfile # build a new docker image
          - step:
              name: "Build Image 2"
              script:
                - docker load --input ./base-image.docker # load the base image from the exported artifact
                - docker images # list available images; you should see the base image
                - docker build -t mydockerrepo/image2 . -f prod.Dockerfile
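For reference, the Dockerfile built in the first step would hold only the setup shared by all three images. Here is a minimal sketch, assuming an Ubuntu base with PHP and Composer as described in your question; the PHP version, package list, and Composer flags are illustrative assumptions you would adapt to your project:

FROM ubuntu:22.04
# Shared setup: same PHP version and same packages from the Ubuntu repo for all 3 images
RUN apt-get update && apt-get install -y php8.1-cli php8.1-curl git unzip
# Copy the Composer binary from the official composer image (illustrative)
COPY --from=composer:2 /usr/bin/composer /usr/bin/composer
WORKDIR /app
# Install the shared Composer dependencies once, in the base image
COPY composer.json composer.lock ./
RUN composer install --no-interaction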
In this example, the dev.Dockerfile and prod.Dockerfile use the image built in the first step as the base:
FROM mydockerrepo/imagename
<rest of the commands>
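A worker Dockerfile would then only need to add its own setup on top of that base. As a sketch, assuming the supervisor setup mentioned in your question (the config file name and path are hypothetical):

FROM mydockerrepo/imagename
# Dependencies are inherited from the base image; only worker-specific setup remains
RUN apt-get update && apt-get install -y supervisor
# worker-supervisord.conf is a hypothetical config file in your project
COPY worker-supervisord.conf /etc/supervisor/conf.d/worker.conf
# Run supervisord in the foreground as the container's main process
CMD ["supervisord", "-n"]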
Hope that helps! You can try using that approach and let us know how it goes :)
Thank you, @Naveed Zahoor!
Patrik S