How can I run a custom command before running a provided pipe?
Specifically, I want to run:
poetry export -f requirements.txt --output requirements.txt
and then run the google-app-engine-deploy pipe.
I understand the steps run in different Docker containers, so I can't run this in a previous step, correct?
After seeing this pattern used in the ssh-run pipe for envsubst, I found out you can simply combine regular script commands and a pipe in the same step:
- step:
    script:
      - pip install poetry
      - poetry export --without-hashes -f requirements.txt --output requirements.txt
      - pipe: atlassian/google-app-engine-deploy:0.2.1
Most examples of pipes show only a single pipe: line, and the step/script terminology was a bit confusing at first, but this works fine.
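For reference, a complete bitbucket-pipelines.yml using this single-step approach might look like the sketch below. The image tag, step name, and the variable values are illustrative assumptions, not from the thread; check the pipe's documentation for the exact variables it requires.

```yaml
# bitbucket-pipelines.yml (sketch; image tag and variable values are assumptions)
image: python:3.11

pipelines:
  default:
    - step:
        name: Export requirements and deploy
        script:
          - pip install poetry
          - poetry export --without-hashes -f requirements.txt --output requirements.txt
          # The pipe runs in the same working directory, so it sees requirements.txt
          - pipe: atlassian/google-app-engine-deploy:0.2.1
            variables:
              KEY_FILE: $KEY_FILE
              PROJECT: $GCP_PROJECT
```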
@Sander Land, you can share your output files via the artifacts section.
As for the google-app-engine-deploy pipe: we pass the working directory as a volume to the pipe's Docker container, so you can declare requirements.txt in the artifacts section, and in the next step simply run the pipe in the directory where requirements.txt is located.
Whether Google App Engine will accept a requirements file generated this way is a question for Google Cloud: if they accept the file in that form, then yes, it should work.
Keep in touch and let us know if more questions arise; we are happy to help.
Regards, Galyna
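A sketch of the two-step variant described above, passing requirements.txt between steps via the artifacts section (step names and variable values are illustrative assumptions):

```yaml
# Sketch of a two-step pipeline; variable values are assumptions
pipelines:
  default:
    - step:
        name: Export requirements
        script:
          - pip install poetry
          - poetry export --without-hashes -f requirements.txt --output requirements.txt
        artifacts:
          - requirements.txt   # shared with the following step
    - step:
        name: Deploy to App Engine
        script:
          # The artifact is restored into the working directory, where the pipe runs
          - pipe: atlassian/google-app-engine-deploy:0.2.1
            variables:
              KEY_FILE: $KEY_FILE
              PROJECT: $GCP_PROJECT
```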