Is there a way to cache the build process for docker build?
Right now, every time the build starts, it downloads the parent image again and runs through every Dockerfile command all over again, so the whole build stretches to as much as 6 minutes.
The Docker engine is capable of caching images and the results of build steps that haven't changed.
I couldn't find any documentation on Docker caching (I found the docs for npm, for example). Is this supported but undocumented? Or, if not, is it somewhere on the Pipelines roadmap?
Thanks a lot!
I got my docker builds caching, but after doing a merge a week later, it looks like the caches don't last longer than a week: https://support.atlassian.com/bitbucket-cloud/docs/cache-dependencies/
Hey @msvoren,
Not at this time; follow https://bitbucket.org/site/master/issues/14144/cache-docker-layers-between-builds to get updates as we implement this feature.
-Seb
This feature has shipped for customers to use. Here are the details on how to opt in to the Docker layer cache:
```yaml
pipelines:
  default:
    - step:
        services:
          - docker
        caches:
          - docker
        script:
          - # .. do cool stuff with Docker ..
```
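For a concrete (and hypothetical) example, a step that builds an image with the layer cache enabled might look like this; the image name is just a placeholder:

```yaml
pipelines:
  default:
    - step:
        services:
          - docker   # run the Docker daemon alongside this step
        caches:
          - docker   # persist image layers between pipeline runs
        script:
          # Layers already present in the cache are reused instead of rebuilt.
          - docker build -t myorg/my-app .
```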
Hi @Trevor Mack,
This feature probably isn't working correctly; for example, in my pipeline it caches only two layers:
```
Status: Downloaded newer image for ruby:2.5.0
---> 213a086149f6
Step 2/18 : ARG DATABASE_URL
---> Using cache
---> 429c813b6abf
Step 3/18 : ENV RACK_ENV production
---> Using cache
---> c0e98fc7cea0
Step 4/18 : ENV RAILS_ENV production
---> Running in 62f18ad58da3
Removing intermediate container 62f18ad58da3
---> a7f6146135a4
Step 5/18 : ENV DATABASE_URL $DATABASE_URL
---> Running in d0f15c5589fc
Removing intermediate container d0f15c5589fc
```
Bump!
Did you get this to work?
I'm getting this same issue now.
The step that never seems to be cached, though, is the one where I run `apt-get update && apt-get install <stuff>`:
```
Sending build context to Docker daemon 1.485MB
Step 1/24 : FROM php:7.3.9-apache
7.3.9-apache: Pulling from library/php
Digest: sha256:2f3a147ca94e87210d596ecb31533c127928bdb9cc4f2dc3364d80a3dd6024e4
Status: Downloaded newer image for php:7.3.9-apache
---> 6f7c5e29a126
Step 2/24 : ENV APACHE_DOCUMENT_ROOT /src/public
---> Using cache
---> 2594124b1607
Step 3/24 : RUN mkdir /src
---> Using cache
---> 4534159c5c9c
Step 4/24 : WORKDIR /src
---> Using cache
---> a4c50b69d084
Step 5/24 : RUN apt-get update && apt-get install -y unzip libpng-dev libjpeg62-turbo-dev locales apt-transport-https gnupg2
---> Running in 1c394017dd4a
```
When you do an `apt-get update` as part of an install, these steps will likely never be cached, as they fetch the metadata of the latest released packages before continuing.
@Trevor Mack but `apt-get update && apt-get install ...` is standard Docker practice, and caching of it has always worked fine when run locally in Docker outside of Bitbucket Pipelines.
It's a horrible idea :( but I'm considering maintaining "base" images that I then apply the dynamic changes on top of, since the build process takes many minutes when, with caching, it would take seconds.
e.g.
```dockerfile
FROM my-base-build-image
RUN do some dynamic stuff here
```
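For completeness, the base image itself could be rebuilt by a separate, manually triggered pipeline; a rough sketch (the image name, `Dockerfile.base`, and the `$DOCKER_HUB_*` variables are all placeholders, not from this thread):

```yaml
pipelines:
  custom:
    rebuild-base-image:
      - step:
          services:
            - docker
          script:
            # Bake the slow-changing layers (OS packages, etc.) into a base image.
            - docker build -f Dockerfile.base -t myorg/my-base-build-image:latest .
            # Push it to a registry so regular builds can use it as their FROM image.
            - docker login -u $DOCKER_HUB_USER -p $DOCKER_HUB_PASSWORD
            - docker push myorg/my-base-build-image:latest
```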
But ultimately this is a super poor way to do it and isn't best practice. Pipelines take so much longer than they should.
I guess the potential overhead of caching everyone's stuff would be massive.
We're having the same issue. All layers after a line with `apt-get update` and `apt-get install` are always being fully rebuilt. That's not the default Docker behavior, and combining the two on one line is even the recommended practice.
Well, this is ridiculous: a local docker build fully caches, yet in my pipeline with `caches: docker` it seems to do almost nothing, resulting in a full build that takes minutes :(
This is still happening. It will cache the `ADD yarn.lock` but then won't cache the `ADD Gemfile.lock`.
Any update? I think the Bitbucket Docker cache is a very limited feature: all it allows is `caches: docker`, leaving every other decision to the Atlassian team. So it doesn't work for `RUN bundle install`.
Why not let us specify which folders should be cached, like a normal pipeline cache does? At the very least, please let us choose a folder that we want to cache.
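For comparison, the "normal" pipeline caches referred to above do let you pick an arbitrary folder; a minimal sketch (the cache name and path are illustrative). Note this only caches directories in the build workspace, so it won't reach inside a `docker build`:

```yaml
definitions:
  caches:
    bundler: vendor/bundle   # declare any directory as a custom cache

pipelines:
  default:
    - step:
        caches:
          - bundler          # reuse vendor/bundle between runs
        script:
          # Install gems into the cached directory so later runs can skip the download.
          - bundle install --path vendor/bundle
```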
What's the status on this? @SebC @Trevor Mack
Only a few arbitrary layers seem to be cached, even with `caches: docker`.
A simple `COPY package.json ./` misses the cache, even when building the same commit in a new build right after the first one.
IMHO this is unworthy of a professional build platform.
Is it not updating the cache on every single pipeline build?
Is it creating a cache on the first build and not repopulating it until that cache expires?
I just looked at my caches in the top right of the Pipelines build page, and the cache was created on the 23rd of September. We run builds every day, excluding weekends, and I just now completed a build and the build cache still has the same date on it.
That would definitely explain the behavior everyone is seeing.
If so, that is definitely not how any other build cache works, and it has to cost Bitbucket more in build minutes than it would to replace the cache data.
And now I just saw this in the logs...
```
Skipping assembly of docker cache as one is already present
Cache "docker": Skipping upload for existing cache
```
That has to be what's going on here. Why wouldn't Atlassian update the cache on every single successful build? What is the point of a week-old cache? And why would the arbitrary build that happens to run right after the cache expires be any more valid than any other? Currently it's being used like it's a thoughtfully curated cache.
Is there any way to force update the cache as part of the build?
Thank you @Jordan Davidson for putting this into plain words. That cleared some things up: it is exactly the reason why layer caching does not work for builds.
By the way, nine months have passed since you wrote that, and there's still no improvement.
That means this is a feature we are all expecting to have, but it is simply not implemented by the Bitbucket team.
Let's hope this gets fixed in the near future somehow.
For now, I think the only upside of the Docker cache is not having base images pulled every time you do a `docker pull` or `docker build`. And that's only if your build configuration did not change :) If you wanted to include another image in your build, you'd have to invalidate the cache to make it work.
I've had this same problem today and can confirm that deleting the Docker cache forced Pipelines to cache the new image layers for my new build.
I'm sure this was working in the past: my pipeline would build a Docker image using the cache even when I'd changed my Dockerfile in a way that invalidated layers.
If you are using a custom runner, this might help, though I'm not sure by how much:
https://bitbucket.org/blog/faster-ci-builds-with-docker-remote-caching
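The core of that approach is to use a registry, rather than the Pipelines cache, as the cache source. A rough sketch, assuming an image repository you can push to (the image name and `$DOCKER_HUB_*` variables are placeholders):

```yaml
pipelines:
  default:
    - step:
        services:
          - docker
        script:
          # Pull the previously pushed image; tolerate failure on the very first run.
          - docker pull myorg/my-app:latest || true
          # Let docker build reuse layers from the pulled image.
          - docker build --cache-from myorg/my-app:latest -t myorg/my-app:latest .
          # Push the fresh image so the next build can use it as its cache source.
          - docker login -u $DOCKER_HUB_USER -p $DOCKER_HUB_PASSWORD
          - docker push myorg/my-app:latest
```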
This is still an open bug, and nothing is being done about it.
It seems like development on Pipelines has largely been abandoned by Atlassian.