
Optimizing a bitbucket pipeline

Naveed Zahoor
I'm New Here
September 25, 2023


I am working on a pipeline where I have to build 3 images in a parallel step. All three images belong to the same project: one is the main API and two are worker images. I have 3 Dockerfiles, one for each. All 3 images fetch the same PHP version, install the same packages from the Ubuntu repo, and then install Composer dependencies. After that, each image is set up separately; for example, the API image exposes port 80 while the workers run under Supervisor.

What I am trying to achieve:

My first thought was to build an image, upload it to ECR, and use it as the base for each of these images. But that doesn't guarantee my dependencies will be up to date. I am looking for a way to build one image in the same pipeline and use it in the next step as the base image, so I don't have to install dependencies separately for all 3 containers. Is this possible, or is there a better solution I can try?
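For context, the shared part of the three Dockerfiles looks roughly like this (the PHP version, package names, and paths below are placeholders, not the project's actual values):

```dockerfile
# Hypothetical shared base - every value here is illustrative.
FROM php:8.2-fpm

# Same Ubuntu/Debian packages all three images currently install
RUN apt-get update && apt-get install -y --no-install-recommends \
        git unzip \
    && rm -rf /var/lib/apt/lists/*

# Composer plus the shared PHP dependencies
COPY --from=composer:2 /usr/bin/composer /usr/bin/composer
WORKDIR /app
COPY composer.json composer.lock ./
RUN composer install --no-dev --no-interaction
```

Everything above is identical across the API and worker images; only what comes after differs.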

1 answer

0 votes
Patrik S
Atlassian Team
September 26, 2023

Hello @Naveed Zahoor and welcome to the Community!

You could build the base image in a "normal" step, use docker save to export the image to a file within the build, and use artifacts to make this file available in the following parallel steps where you build the specifics of each image.

Within the parallel steps, you can use the docker load command to load the image from the artifact file, and this image will be available in that step's docker environment. You can then make use of this "local image" in your Dockerfile.

Following is an example bitbucket-pipelines.yml using the suggestion above:

image: ubuntu:latest

options:
  docker: true # enable docker service for all the steps

pipelines:
  default:
    - step:
        name: "Build base image and export"
        script:
          - docker build -t mydockerrepo/imagename .
          - docker save --output base-image.docker mydockerrepo/imagename # save the base image, tagged as mydockerrepo/imagename, to a file named base-image.docker
        artifacts:
          - base-image.docker # export base image file as artifact to next steps
    - parallel:
        - step:
            name: "Build Image 1"
            script:
              - docker load --input ./base-image.docker # load the base image from the exported artifact
              - docker images # list available images, you should see the base image
              - docker build -t mydockerrepo/image1 . -f dev.Dockerfile # build a new docker image
        - step:
            name: "Build Image 2"
            script:
              - docker load --input ./base-image.docker # load the base image from the exported artifact
              - docker images # list available images, you should see the base image
              - docker build -t mydockerrepo/image2 . -f prod.Dockerfile

In this example, dev.Dockerfile and prod.Dockerfile use the image built in the first step as their base:

FROM mydockerrepo/imagename
<rest of the commands>
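As a concrete sketch of what those per-image Dockerfiles could contain (the exposed port, Supervisor config path, and commands are illustrative placeholders, not values from your project):

```dockerfile
# dev.Dockerfile - API image, building on the locally loaded base
FROM mydockerrepo/imagename

# API-specific setup: serve HTTP on port 80 (placeholder command)
EXPOSE 80
CMD ["php-fpm"]
```

```dockerfile
# prod.Dockerfile - worker image, building on the same base
FROM mydockerrepo/imagename

# Worker-specific setup: run jobs under Supervisor (placeholder paths)
COPY supervisord.conf /etc/supervisor/conf.d/worker.conf
CMD ["supervisord", "-n"]
```

Since both start from the image loaded via docker load, the shared dependency installation runs only once, in the first step.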

Hope that helps! You can try using that approach and let us know how it goes :)

Thank you, @Naveed Zahoor !

Patrik S
