I have a bitbucket-pipelines.yml file that looks something like this:
image: atlassian/default-image:2

pipelines:
  custom:
    my_pipeline:
      - step:
          services:
            - docker
          caches:
            - docker
          script:
            - docker run postman/newman_alpine33:3.9.3
Pulling the Docker Hub image takes around 5 minutes, and I'd love to shave that time off my builds if possible.
Is there a way to pull this image from Bitbucket's Docker cache? The "caches" declaration doesn't seem to help me here.
Hello Chris,
We have two separate Docker caches.
One is the "caches" definition in the YAML. This is primarily used for caching layers of Docker images you build, though it also acts as a repository-level cache for images you pull. However, this cache is limited to 1GB, so it may not be large enough for your images.
We also have an internal Docker Hub cache that caches *public images* from *Docker Hub*. Private images, and public images from other registries, will not be cached.
I checked our internal Docker Hub cache and the image is present. When I tried pulling it, it completed in a few seconds. Perhaps removing the caches definition from your YAML will speed things up in this case?
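For reference, here's a minimal sketch of your pipeline above with the caches definition removed, so pulls of public images go through the internal Docker Hub cache:

```yaml
image: atlassian/default-image:2

pipelines:
  custom:
    my_pipeline:
      - step:
          services:
            - docker
          # No "caches: - docker" block here, so public-image pulls
          # hit the internal Docker Hub cache instead of the 1GB
          # repository cache.
          script:
            - docker run postman/newman_alpine33:3.9.3
```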
As a side note, it looks like that image is deprecated in favour of postman/newman.
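If you do migrate, the script step might look something like the sketch below. Note the newer image expects a `run <collection>` command, and the collection filename here is just a placeholder:

```yaml
          script:
            # Mount the clone directory so the container can see the
            # collection file; my-collection.json is a placeholder name.
            - docker run -v $BITBUCKET_CLONE_DIR:/etc/newman postman/newman run my-collection.json
```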
Thanks,
Phil
Hi Phil, thanks for taking a look for me.
I removed the Docker cache definition recently, and the latest logs show the image being pulled quickly now. So either it was the cache definition, or possibly there was just a long delay before the first pipeline logs became available and I mistook that for a slow Docker pull. Apologies if it was the latter!
I'm using the old Newman image because more recent versions produce a lot of extraneous output that I don't want in the response. I might look into raising a PR to make that output configurable when I've got time.
I think I'm all good now, thanks for your time :)
@Philip Hodder
Is there an end-user API call or up-to-date web resource to check which images are in the internal Docker Hub cache?
Based on your response and the documentation, is it possible to cache smaller Docker images using a YAML cache and use them as bases?
Example:
If this use case/pattern is possible, is there a cost associated with it? Does the repo's pipeline cache get cleared at automated intervals?
cheers