Optimizing a Bitbucket pipeline

Background:


I am working on a pipeline where I have to build 3 images in a parallel step. All three images refer to the same project: one is the main API and two are worker images. I have 3 Dockerfiles, one for each. All 3 images fetch the same PHP version, install the same packages from the Ubuntu repo, and then install the Composer dependencies. After that, each image is set up separately. For example, the API image exposes an endpoint on port 80, while the workers run under supervisor.
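
For context, the shared part of each Dockerfile looks roughly like this (the PHP base image, package list, and Composer flags below are simplified placeholders, not my exact setup):

FROM php:8.2-fpm    # placeholder PHP version/variant

# common OS packages (illustrative list only)
RUN apt-get update && apt-get install -y git unzip \
    && rm -rf /var/lib/apt/lists/*

# install Composer and the project dependencies
COPY --from=composer:2 /usr/bin/composer /usr/bin/composer
COPY composer.json composer.lock ./
RUN composer install --no-dev --no-interaction --no-scripts

# ...after this point each Dockerfile diverges: the API exposes port 80, the workers run supervisor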

What I am trying to achieve:

My first thought was to build an image, upload it to ECR, and use that image as the base for each of these images. But that doesn't guarantee my dependencies will be up to date. I am looking for a way to build one image in the same pipeline and use it in the next step as the base image, so I don't have to install the dependencies separately for all 3 containers. Is this possible, or is there a better solution I can try?

1 answer

Patrik S
Atlassian Team
Sep 26, 2023

Hello @Naveed Zahoor and welcome to the Community!

You could build the base image in a "normal" step, use docker save to export the image to a file within the build, and use artifacts to make this file available in the following parallel steps where you build the specifics of each image.

Within the parallel steps, you can use the docker load command to load the image from the artifact file, and this image will be available in that step's docker environment. You can then make use of this "local image" in your Dockerfile.

Following is an example YAML configuration using the suggestion above:

options:
  docker: true # enable the Docker service for all steps

image: ubuntu:latest
pipelines:
  default:
    - step:
        name: "Build base image and export"
        script:
          - docker build -t mydockerrepo/imagename .
          - docker save --output base-image.docker mydockerrepo/imagename # save the base image, tagged as mydockerrepo/imagename, to a file named base-image.docker
        artifacts:
          - base-image.docker # export the base image file as an artifact to the next steps
    - parallel:
        steps:
          - step:
              name: "Build Image 1"
              script:
                - docker load --input ./base-image.docker # load the base image from the exported artifact
                - docker images # list available images; you should see the base image
                - docker build -t mydockerrepo/image1 . -f dev.Dockerfile # build a new Docker image
          - step:
              name: "Build Image 2"
              script:
                - docker load --input ./base-image.docker # load the base image from the exported artifact
                - docker images # list available images; you should see the base image
                - docker build -t mydockerrepo/image2 . -f prod.Dockerfile

In this example, dev.Dockerfile and prod.Dockerfile use the image built in the first step as their base:

FROM mydockerrepo/imagename
<rest of the commands>
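
For illustration, a worker Dockerfile could build on that base along these lines (the supervisor install, config path, and CMD here are just placeholders; adapt them to your actual worker setup):

# dev.Dockerfile (illustrative sketch)
FROM mydockerrepo/imagename

# worker-specific setup only; PHP, OS packages, and Composer dependencies
# already come from the base image built in the previous step
RUN apt-get update && apt-get install -y supervisor \
    && rm -rf /var/lib/apt/lists/*
COPY docker/supervisord.conf /etc/supervisor/conf.d/supervisord.conf

# run supervisor in the foreground so the container stays alive
CMD ["supervisord", "-n", "-c", "/etc/supervisor/supervisord.conf"]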

Hope that helps! You can try using that approach and let us know how it goes :)

Thank you, @Naveed Zahoor!

Patrik S
