I'm new to Pipelines.
I'm building C++ projects in a single Docker image pulled from Docker Hub, which is about 1 GB. My bitbucket-pipelines.yml looks something like:
- echo step1
- echo step2
# many more steps
Every time the pipeline runs, each step spends 20-40 seconds pulling the whatever/whatever Docker image!
Is there a way to cache or reuse the image? Is Atlassian actually downloading 1 GB at every step? I thought I could use "caches: docker" to keep the image around, but it turns out that keyword is for pipelines that build and push Docker images.
I feel like I'm doing something wrong. I can't imagine Atlassian wanting to download 10GB every time I make a commit, just because I have 10 steps defined.
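For context, here is a minimal sketch of the kind of configuration described above (the image name, step names, and commands are placeholders, not the actual project's config):

```
# bitbucket-pipelines.yml -- illustrative only
image: whatever/whatever:latest   # the ~1 GB build image from Docker Hub

pipelines:
  default:
    - step:
        name: Step 1
        script:
          - echo step1
    - step:
        name: Step 2
        script:
          - echo step2
    # many more steps
```

Each step in Bitbucket Pipelines runs in its own fresh container, which is why the image pull happens once per step rather than once per pipeline; consolidating commands into fewer steps is one way to reduce the repeated pulls.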
We are excited to announce the open beta program for self-hosted runners. Bitbucket Pipelines runners are available to everyone. Please try them out and let us know your feedback or any issues you run into.
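A self-hosted runner can help here because the runner host keeps pulled Docker images in its local cache between steps. A step is directed to a self-hosted runner with the "runs-on" keyword and matching labels; a minimal sketch (labels beyond the required "self.hosted" are whatever you assigned when registering the runner):

```
# bitbucket-pipelines.yml -- routing a step to a self-hosted runner
pipelines:
  default:
    - step:
        runs-on:
          - self.hosted
          - linux
        script:
          - echo "runs on the self-hosted runner"
```

Once the image has been pulled on the runner host, subsequent steps scheduled there can reuse it instead of downloading it again from Docker Hub.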