Pipelines experts,
I have a few pipelines that share large data sets. I currently have this set up to use a named data volume container, as the Docker docs describe. But I haven't seen any way to tell Pipelines to start my Docker image with that named data container attached. This seems to be the "Docker Way," and I'd have to resort to some awkward workarounds if it isn't available.
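For context, here's a minimal sketch of the data-volume-container pattern I'm using locally (the names `shared-data`, `my-build-image`, and the paths are just illustrative, not my actual setup):

```shell
# Create a data volume container that holds the shared data set at /data.
# It never runs; it only exists so other containers can reference its volume.
docker create -v /data --name shared-data busybox /bin/true

# Each pipeline's build container mounts the shared data via --volumes-from:
docker run --rm --volumes-from shared-data my-build-image ls /data
```

What I'm missing is any equivalent of that `--volumes-from` (or `-v name:/path`) flag in the Pipelines image configuration.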
Any pointers for me?
Thanks!
Not sure if there's another way to bump this thread, but I still haven't figured out a workaround and it's now hampering my ability to use Pipelines. Does anyone have any thoughts on this? Anything for me to try?