Greetings, I'm having trouble getting my pipelines to make effective use of the Docker layer cache to reduce build times.
This is a snippet of the image build step in my pipeline definition:
```yaml
image: atlassian/default-image:latest

pipelines:
  branches:
    test:
      - step:
          name: Build Image
          script:
            - IMAGE_NAME=${DOCKERHUB_USER}/${TAG}:TEST
            - docker build --secret id=REPO_USERNAME,env=REPO_USERNAME --secret id=REPO_PASSWORD,env=REPO_PASSWORD . --file Dockerfile --tag ${IMAGE_NAME}
            - docker save ${IMAGE_NAME} --output "docker-image.tar"
          services:
            - docker
          artifacts:
            - "docker-image.tar"
          caches:
            - docker
```
The idea is that when this step is re-executed, it should reuse the Docker layers that did not change, rebuild only those that did, and then have the updated image become the new cache, just as it would when building locally on any machine.
What actually happens is that no cache is used at all: the `docker build ...` step rebuilds every layer from scratch on each run.
G'day, @LI
Welcome to the community!
We'll need to observe your pipeline build to understand why it isn't using the cache. This issue often arises when the Docker cache exceeds the 1GB limit, which prevents new cache entries from being saved, or when there's a problem with the cache itself. You can verify this by checking the cache upload messages in your build teardown.
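As a side note, the `--secret` flag requires BuildKit, and layers produced by BuildKit builds are not stored in the predefined `docker` cache. A common workaround is to push the built image to a registry and seed subsequent builds from it with `--cache-from`. This is only a sketch, reusing the variables from your snippet and assuming a hypothetical `cache` tag in your Docker Hub repository:

```yaml
image: atlassian/default-image:latest

pipelines:
  branches:
    test:
      - step:
          name: Build Image
          script:
            - IMAGE_NAME=${DOCKERHUB_USER}/${TAG}:TEST
            - CACHE_IMAGE=${DOCKERHUB_USER}/${TAG}:cache
            # Pull the previous image to use as a layer cache;
            # "|| true" lets the very first build proceed with no cache image yet.
            - docker pull ${CACHE_IMAGE} || true
            # BUILDKIT_INLINE_CACHE=1 embeds cache metadata in the image so
            # later builds can consume it via --cache-from.
            - docker build --build-arg BUILDKIT_INLINE_CACHE=1 --cache-from ${CACHE_IMAGE} --secret id=REPO_USERNAME,env=REPO_USERNAME --secret id=REPO_PASSWORD,env=REPO_PASSWORD . --file Dockerfile --tag ${IMAGE_NAME}
            # Publish the fresh image as the cache source for the next run.
            - docker tag ${IMAGE_NAME} ${CACHE_IMAGE}
            - docker push ${CACHE_IMAGE}
          services:
            - docker
```

This trades the pipeline cache for registry storage, so cache reuse no longer depends on the 1GB cache limit; it does assume the step is logged in to Docker Hub before pushing.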
If you wish to further investigate, please submit a support ticket through our support portal.
Regards,
Syahrul