Looking at the documentation for Bamboo, I see that the SCP job type is available for copying files from the agent to somewhere else.
But it seems, based on my understanding of Bamboo, that this is completely useless, because I have no way to ensure that the files I want to copy will exist.
What am I missing?
Thank you for explaining your pain point with a clear and structured message =]
We shouldn't try to use the result of another job in the same stage. As you correctly mentioned, we have no guarantee that jobs will finish in a certain order. Jobs are intended to run in parallel to reduce build time for tasks that have no dependencies on each other. That is why we also have stages.
Stages run in sequence and should be used when we need two different jobs to run one after another.
The best way to share data from one job to another is by using artifacts. A job in stage X publishes the artifact as shared; a job in stage Y, which runs after X (right after, or multiple stages later), downloads it. This ensures that a job can run on any agent and still have the specific files it needs from a previous job in order to deploy.
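As a rough sketch of how this can look in Bamboo Specs YAML (the project/plan keys, job names, script contents, and artifact paths below are all placeholders, and exact keys may vary by Bamboo version, so please check them against the Bamboo Specs reference):

```yaml
---
version: 2
plan:
  project-key: PROJ    # placeholder project key
  key: EXAMPLE         # placeholder plan key
  name: Example plan

stages:
  - Build stage:
      jobs:
        - Build
  - Deploy stage:       # runs only after the Build stage completes
      jobs:
        - Deploy

Build:
  tasks:
    - script:
        - ./build.sh    # placeholder build step that produces out/*.tar.gz
  artifacts:
    - name: app-package
      location: out
      pattern: '*.tar.gz'
      shared: true      # only shared artifacts are visible to later stages

Deploy:
  tasks:
    # Declare the shared artifact as a dependency so it is downloaded
    # to whichever agent runs this job, before the SCP task would run.
    - artifact-download:
        artifacts:
          - name: app-package
```

The key point is `shared: true` on the artifact definition: a non-shared artifact stays local to its own job, while a shared one can be pulled into any job of a later stage, regardless of which agent that job lands on.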
I hope I could clarify the scenario for you. If not, please let me know what questions you still have.
I'll share a link to a video which I believe is extremely useful (although it is already old) for understanding the Bamboo building blocks.
This is the one:
In summary, I recommend using stages and artifacts to make sure you have the files you need to deploy via the SCP task.