I'm new to Pipelines.
I'm building C++ projects using a single Docker image from Docker Hub, which is about 1 GB. My bitbucket-pipelines.yml looks something like this:
```
image: whatever/whatever:latest
pipelines:
  custom:
    default:
      - step:
          script:
            - echo step1
      - step:
          script:
            - echo step2
      # many more steps
```
Every time the pipeline runs, each step spends 20-40 seconds pulling the whatever/whatever Docker image!
Is there a way to cache or reuse the image between steps? Is Atlassian really downloading 1 GB at every step? I thought I could use "caches: docker" to keep the image around, but it turns out that keyword is for pipelines that build and push Docker images.
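For reference, this is how that cache seems to be meant to be used, as far as I can tell from the docs (a sketch; "myimage" is just a placeholder). It caches layers for images you build *inside* a step, not the step's own build image:
```
pipelines:
  default:
    - step:
        # the docker service and cache only help "docker build" inside the script
        services:
          - docker
        caches:
          - docker
        script:
          - docker build -t myimage .
```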
I feel like I'm doing something wrong. I can't imagine Atlassian wanting to download 10 GB every time I push a commit just because I have 10 steps defined.
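The only workaround I've come up with so far is collapsing everything into a single step, so the image is pulled once per run, something like:
```
image: whatever/whatever:latest
pipelines:
  custom:
    default:
      - step:
          # one step = one image pull, but all commands share it
          script:
            - echo step1
            - echo step2
            # many more commands
```
But then I lose the per-step breakdown in the Pipelines UI, which is why I split things into steps in the first place.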