I know that each step runs in a separate Docker container, but I am unsure of the architecture of pipelines beyond that.
The reason I ask is that I have multiple parallel steps that are reasonably 'heavy', in that they each need 2x compute capability. I want to know whether I can keep adding more of these parallel steps over time, or whether I will eventually hit a ceiling.
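For reference, this is the shape of config I mean. I'm assuming Bitbucket Pipelines syntax here (where `size: 2x` doubles a step's resources); the step names and scripts are placeholders:

```yaml
pipelines:
  default:
    - parallel:
        - step:
            name: heavy-task-1     # placeholder name
            size: 2x               # doubled resources for this step
            script:
              - ./run-heavy-task.sh part1
        - step:
            name: heavy-task-2     # placeholder name
            size: 2x
            script:
              - ./run-heavy-task.sh part2
```

My question is what happens as I add a third, fourth, fifth step to that `parallel:` block.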
i.e. does a pipeline run on one machine, with all parallel steps running on that same machine? Or does each step run on a different machine?