I created a Bitbucket pipeline that looks like this. The first step simply does:
- /bin/sh scripts/createDeployConfigFile.sh
- terraform init -backend-config=./deploy.conf
- terraform validate
- terraform plan -out=tfplan
The second step is triggered manually and should simply do:
- terraform apply tfplan
artifacts:
- tfplan
- terraform.tfstate
- .terraform/**
- .terraform.lock.hcl
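For reference, the setup described above might be laid out like this in `bitbucket-pipelines.yml` (a sketch based on the description; the step names and `trigger: manual` keyword placement are assumptions):

```yaml
pipelines:
  default:
    - step:
        name: Plan
        script:
          - /bin/sh scripts/createDeployConfigFile.sh
          - terraform init -backend-config=./deploy.conf
          - terraform validate
          - terraform plan -out=tfplan
        # Files listed here are carried forward to later steps.
        artifacts:
          - tfplan
          - terraform.tfstate
          - .terraform/**
          - .terraform.lock.hcl
    - step:
        name: Apply
        trigger: manual
        script:
          - terraform apply tfplan
```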
Hi @mc84 and welcome to the community!
If you run this Pipelines build on Atlassian's infrastructure, every step of the build runs in its own Docker container: a fresh container starts, the repo is cloned into it (unless you have disabled cloning), the commands in that step's script are executed, and the container is destroyed when the step finishes.
I'm not familiar with terraform, but is it possible that the commands in the first step are doing some additional tasks other than generating the files you have defined as artifacts?
When you run these commands on your workstation, I'm assuming that you execute them in the same environment. A second step in a Pipelines build runs in a separate Docker container, so in a different environment than the first step. I suggest looking into the exact tasks performed by the command you need to rerun in the second step, and perhaps reaching out to Terraform's community or support team.
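One common pattern worth trying (an assumption here, not something verified against this setup) is to re-run `terraform init` at the start of the manual step, since the second container starts fresh and only the declared artifacts are carried over:

```yaml
- step:
    name: Apply
    trigger: manual
    script:
      # The second step runs in a new container, so re-initialize the
      # backend and providers before applying the saved plan. Note that
      # deploy.conf is generated in the first step, so it would also
      # need to be listed as an artifact (or regenerated here).
      - terraform init -backend-config=./deploy.conf
      - terraform apply tfplan
```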
Kind regards,
Theodora