I'm new to Pipelines.
I'm building C++ projects with a single ~1 GB Docker image from Docker Hub. My bitbucket-pipelines.yml looks something like:

```yaml
- echo step1
- echo step2
# many more steps
```
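For reference, a fuller sketch of the layout I mean (step names are placeholders, and `whatever/whatever` stands in for the real image):

```yaml
image: whatever/whatever  # ~1 GB image from Docker Hub, shared by every step

pipelines:
  default:
    - step:
        name: Step 1
        script:
          - echo step1
    - step:
        name: Step 2
        script:
          - echo step2
    # many more steps, each running on a fresh container
```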
Every time the pipeline runs, each step spends 20-40 seconds pulling the whatever/whatever Docker image!
Is there a way to cache or reuse the image across steps? Is Atlassian really downloading 1 GB at every step? I thought I could use `caches: docker` to keep the image around, but it turns out that keyword is for pipelines that build and push Docker images.
I feel like I'm doing something wrong. I can't imagine Atlassian wants to download 10 GB every time I make a commit, just because I have 10 steps defined.