How can I run a custom command before running a provided pipeline?
Specifically I want to:
poetry export -f requirements.txt --output requirements.txt
then run the google-app-engine-deploy pipeline.
I understand the 'steps' run in separate Docker containers, so I can't run this in a previous step, correct?
After seeing this used in ssh-run for envsubst, I found out you can simply:
- pip install poetry
- poetry export --without-hashes -f requirements.txt --output requirements.txt
- pipe: atlassian/google-app-engine-deploy:0.2.1
Most examples with pipes show only a single line, and the step/script terminology was a bit confusing, but this works fine.
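For anyone landing here later: a minimal bitbucket-pipelines.yml built around this approach might look like the sketch below. The base image, project id, and the pipe's variable names are assumptions taken from my setup; check the google-app-engine-deploy pipe's README for the current required variables.

```yaml
# Sketch: export requirements with Poetry, then deploy in the same step.
image: python:3.11  # assumed base image; any image with pip works

pipelines:
  default:
    - step:
        name: Export requirements and deploy
        script:
          - pip install poetry
          - poetry export --without-hashes -f requirements.txt --output requirements.txt
          - pipe: atlassian/google-app-engine-deploy:0.2.1
            variables:
              KEY_FILE: $KEY_FILE        # base64-encoded service account key, set in repository variables
              PROJECT: 'my-gcp-project'  # hypothetical project id
```

Running the export commands and the pipe inside one step sidesteps the container-per-step issue entirely, since everything shares the same working directory.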
@Sander Land you can share your output files via the artifacts section.
As for the Google App Engine pipe: we pass the working directory as a volume into the pipe's Docker container, so you can share requirements.txt via the artifacts section, and in the next step run the Google App Engine pipe in the directory that contains requirements.txt.
Whether it will be accepted by Google App Engine is a question for Google Cloud: if they accept a requirements file provided that way, then yes, it should work.
Keep in touch and reach out if more questions arise; we are happy to help.
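To illustrate the two-step variant with artifacts: a sketch of what that could look like, assuming the step names and project id are placeholders and the pipe's variables match its README.

```yaml
# Sketch: export in one step, pass requirements.txt forward, deploy in the next.
pipelines:
  default:
    - step:
        name: Export requirements
        image: python:3.11  # assumed image with pip available
        script:
          - pip install poetry
          - poetry export --without-hashes -f requirements.txt --output requirements.txt
        artifacts:
          - requirements.txt  # copied into the next step's working directory
    - step:
        name: Deploy to App Engine
        script:
          - pipe: atlassian/google-app-engine-deploy:0.2.1
            variables:
              KEY_FILE: $KEY_FILE        # base64-encoded service account key
              PROJECT: 'my-gcp-project'  # hypothetical project id
```

Because the pipe runs in the step's working directory, the requirements.txt restored from artifacts sits next to your app code when the deploy happens.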