Caching a public Docker Hub image

I have a bitbucket-pipelines.yml file that looks something like this:

image: atlassian/default-image:2

pipelines:
  default:
    - step:
        services:
          - docker
        caches:
          - docker
        script:
          - docker run postman/newman_alpine33:3.9.3

Pulling the Docker Hub image takes around 5 minutes, and I'd love to shave that time off my builds if possible.

Is there a way to pull this image from Bitbucket's Docker cache? The "caches" declaration doesn't seem to help me here.

1 answer

Answer accepted

Hello Chris,

We have two separate Docker caches.

One is the "caches" definition in the YAML. This is primarily used for caching Docker images you are building, but it will also act as a repository-level cache for images you pull. However, this cache is limited to 1 GB, so it may not be large enough for some images.
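For reference, the repository-level cache described above is enabled with the predefined "docker" cache in the step definition. A minimal sketch (the image name is a placeholder):

```yaml
pipelines:
  default:
    - step:
        services:
          - docker
        caches:
          - docker          # predefined Docker cache, limited to 1 GB
        script:
          - docker build -t my-app .
```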

We have another internal Docker Hub cache that caches *public images* from *Docker Hub*. Private images, and public images from other registries, will not be cached.

I checked our internal Docker Hub cache and the image is present. When I tried pulling it, it completed in a few seconds. Perhaps removing the "caches" definition from your YAML will speed things up in this case?

As a side note, it looks like that image is deprecated in favour of postman/newman.



Hi Phil, thanks for taking a look for me.

I've removed the Docker cache recently, and looking at the recent logs it seems the image is being pulled quickly now. So either it was the caches definition, or there was simply a long delay before the first pipeline logs became available and I mistook that for a slow Docker pull. Apologies if it was the latter!

I'm using the old Newman image because more recent versions produce a lot of extraneous output that I don't want in the response. I might look into raising a PR to make that output configurable when I've got time.

I think I'm all good now, thanks for your time :)


@Philip Hodder 

Is there an end-user API call or up-to-date web resource to check which images are in the internal Docker Hub cache?

Based on your response and the documentation, is it possible to cache smaller Docker images using a YAML cache and use them as base images? For example:


  • Define a custom pipeline which builds a docker image named "main-base"
  • Define a YAML cache to store "main-base"
  • Successfully run aforementioned pipeline to build "main-base"
  • Define a different custom pipeline which uses "main-base" to build "final-image"

If this use case/pattern is possible, is there a cost associated with it? Does the repo's pipeline cache get cleared at automated intervals?
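The steps above could be sketched roughly as follows. This is a hedged illustration only: the pipeline names, Dockerfile names, and image tags are hypothetical, and whether the cached "main-base" layers are still available for the second pipeline depends on the 1 GB cache limit and its eviction behaviour.

```yaml
pipelines:
  custom:
    build-base:
      - step:
          services:
            - docker
          caches:
            - docker        # "main-base" layers land in the repo's Docker cache
          script:
            - docker build -t main-base -f Dockerfile.base .
    build-final:
      - step:
          services:
            - docker
          caches:
            - docker        # reuses cached "main-base" layers if still present
          script:
            - docker build -t final-image -f Dockerfile.final .
```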


