Docker build in pipelines doesn't cache build process?

Is there a way to cache build process for docker build?

Right now, every time the build starts, it downloads the parent image again and runs through every Dockerfile command all over again, so the whole build stretches up to 6 minutes.


The Docker engine is capable of caching images and build commands that haven't changed.

I couldn't find any documentation on Docker caching (I found the docs for npm, for example). Is this supported but undocumented? Or, if not, is it somewhere on the Pipelines roadmap?


Thanks a lot!


2 answers

I got my Docker builds caching, but after doing a merge a week later, it looks like the caches don't last longer than a week.

[Attachment: Screenshot 2021-02-03 at 16.00.04.png]

SebC Atlassian Team Nov 09, 2017

Hey @msvoren,

Not at this time; follow to get updates as we implement this feature.



This feature has now shipped for customers to use. Here are the details on how to opt in to the Docker layer cache:

      services:
        - docker
      caches:
        - docker
      script:
        - # .. do cool stuff with Docker ..
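In context, that opt-in sits inside a step definition in bitbucket-pipelines.yml; a minimal sketch, where the build image and script commands are placeholders:

```yaml
# Sketch: a single step with the Docker service enabled and the
# predefined docker layer cache opted in.
image: atlassian/default-image:2

pipelines:
  default:
    - step:
        services:
          - docker
        caches:
          - docker
        script:
          - docker build -t my-app .
```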



Hi @Trevor Mack,

This feature probably doesn't work correctly; for example, in my pipeline it caches only two layers:

Status: Downloaded newer image for ruby:2.5.0
 ---> 213a086149f6
 ---> Using cache
 ---> 429c813b6abf
Step 3/18 : ENV RACK_ENV production
 ---> Using cache
 ---> c0e98fc7cea0
Step 4/18 : ENV RAILS_ENV production
 ---> Running in 62f18ad58da3
Removing intermediate container 62f18ad58da3
 ---> a7f6146135a4
 ---> Running in d0f15c5589fc
Removing intermediate container d0f15c5589fc




Did you get this to work?


I'm getting this same issue now.

Though the step that seems to never be cached is one where I run `apt-get update && apt-get install <stuff>`

Sending build context to Docker daemon 1.485MB
Step 1/24 : FROM php:7.3.9-apache
7.3.9-apache: Pulling from library/php
Digest: sha256:2f3a147ca94e87210d596ecb31533c127928bdb9cc4f2dc3364d80a3dd6024e4
Status: Downloaded newer image for php:7.3.9-apache
---> 6f7c5e29a126
Step 2/24 : ENV APACHE_DOCUMENT_ROOT /src/public
---> Using cache
---> 2594124b1607
Step 3/24 : RUN mkdir /src
---> Using cache
---> 4534159c5c9c
Step 4/24 : WORKDIR /src
---> Using cache
---> a4c50b69d084
Step 5/24 : RUN apt-get update && apt-get install -y unzip libpng-dev libjpeg62-turbo-dev locales apt-transport-https gnupg2
---> Running in 1c394017dd4a
Trevor Mack Atlassian Team Apr 06, 2020

Steps that run `apt-get update` before installing will likely never be cached, as they fetch the metadata of the latest released packages before continuing.

@Trevor Mack but this is standard Docker stuff to do `apt-get update && apt-get install ...`, and caching of it has always worked fine when doing this locally in Docker outside of Bitbucket Pipelines.

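For reference, the pattern under discussion is the standard single-layer form; a sketch with example package names:

```dockerfile
# Combine update and install in one RUN so they share a single cache
# layer, and clean the apt lists afterwards to keep the image small.
RUN apt-get update \
    && apt-get install -y --no-install-recommends unzip locales \
    && rm -rf /var/lib/apt/lists/*
```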

It's a horrible idea :( but I'm considering having "base" images that I then apply the dynamic changes to, since the build process takes many minutes when, with caching, it would take seconds.


FROM my-base-build-image

RUN do some dynamic stuff here


But ultimately this is a super poor way to do it and isn't best practice. Pipelines take so much longer than they should.


I guess the potential overheads of caching everyone's stuff would be massive.
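A minimal sketch of that workaround, with all image and file names as placeholders, could look like this in bitbucket-pipelines.yml:

```yaml
# Sketch: a manually triggered pipeline that rebuilds and pushes the base
# image, so regular builds only run the fast dynamic steps on top of it.
pipelines:
  custom:
    rebuild-base:
      - step:
          services:
            - docker
          script:
            - docker build -t my-team/my-base-build-image -f Dockerfile.base .
            - docker push my-team/my-base-build-image
```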

We're having the same issue. All layers after a line with `apt-get update` and `apt-get install` are always fully rebuilt. That's not the default Docker behavior, and combining the two on one line is even the recommended practice.


Well, this is ridiculous: a local docker build fully caches, yet in my pipeline with caches: docker it seems to do almost nothing, resulting in a full build that takes minutes :(


This is still happening. It will cache the `ADD yarn.lock` but then won't cache the `ADD Gemfile.lock`.


Any update? I think Bitbucket's Docker cache is a very limited feature: all you can specify is `caches: docker`, and everything else is left to the Atlassian team's decision.

So it's not working for `RUN bundle install`.

Why not let us choose which folders should be cached, like a normal pipeline cache? At the very least, please let us choose a folder that we want to cache.
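For files on the build container (as opposed to Docker image layers), Pipelines does support custom cache directories; a sketch for a Ruby project, with the cache name and path as examples:

```yaml
# Sketch: define a custom cache for installed gems so `bundle install`
# can reuse them between builds.
definitions:
  caches:
    bundler: vendor/bundle

pipelines:
  default:
    - step:
        caches:
          - bundler
        script:
          - bundle install --path vendor/bundle
```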



What's the status on this? @SebC @Trevor Mack 

Only a few arbitrary layers seem to be cached, even with `caches: docker`.

A simple `COPY package.json ./` fails the cache, even when building the same commit in a new build right after the first one.

IMHO this is unworthy of a professional building platform.

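For context, the layer ordering these builds rely on is the usual dependency-first pattern; the file names here are just examples:

```dockerfile
# Copy only the manifests first, so the install layer stays cached until
# dependencies actually change; the rest of the source is copied last.
COPY package.json yarn.lock ./
RUN yarn install --frozen-lockfile
COPY . .
```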

Is it not updating the cache on every single pipeline build?

Is it creating a cache on the first build and not repopulating it until the cache expires?

I just looked at my caches in the top right of the pipelines build and the cache was created on the 23rd of September. We run builds every day, excluding the weekends, and I just now completed a build and the build cache has the same date on it.

That would definitely explain the behavior everyone is seeing.

If so that is definitely not how any other build cache works, and has to cost bitbucket more in build minutes than it does in replacing cache data.

and now I just saw this in the logs...

Skipping assembly of docker cache as one is already present
Cache "docker": Skipping upload for existing cache

That has to be what's going on here. Why wouldn't Atlassian update the cache on every single successful build? What is the point of a week old cache? And why would the arbitrary build that happens to run right after the invalid cache be any more valid than any other? Currently it's being used like it's a thoughtfully curated cache.

Is there any way to force update the cache as part of the build?


Thank you @Jordan Davidson for putting this into plain words. That cleared some things up. That is exactly the reason why layer caching does not work for builds.

By the way, nine months passed since you wrote that – no improvement here still.

That means that this is a feature that we are all expecting to have, but it is simply not implemented by the bitbucket team.

Let's hope that this gets fixed in the near future somehow.

As for now, I think the only upside of Docker caching is not having base images pulled in every time you do a `docker pull` or `docker build`, and that's only if your build configuration didn't change :) If you wanted to include another image in your build, you'd have to invalidate the cache to make it work.
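One workaround that sidesteps the Pipelines cache entirely is seeding `docker build` from a previously pushed image via `--cache-from`; a sketch with a placeholder image name:

```yaml
# Sketch: pull the last pushed image (ignoring failure on the first run)
# and let docker build reuse its layers instead of relying on caches: docker.
script:
  - docker pull my-team/my-app:latest || true
  - docker build --cache-from my-team/my-app:latest -t my-team/my-app:latest .
  - docker push my-team/my-app:latest
```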

I've had this same problem today and can confirm that deleting the Docker Cache forced Pipelines to cache new image layers for my new build.

I'm sure that this was working in the past and my Pipeline would build a Docker image using the cache even when I'd changed my Dockerfile such that it invalidated layers.

If you are using a custom runner, this might help, though I'm not sure how much:

This is still an open bug, and nothing is being done about it.

It seems like development on Pipelines has largely been abandoned by Atlassian.

