In my repo I have a `keys.py` file which I don't commit to Bitbucket. It holds AWS_SECRET and other keys.
I then access them with `keys.AWS_SECRET`.
How can I keep doing that, yet still test with Pipelines, without changing my codebase much?
You can use a secured environment variable to achieve this.
If you mark a variable as "secured" when you create it, its value is masked in all log output.
By the sounds of it, the only change you'll need to make is to read these values from environment variables instead of from a file.
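For example, if the pipeline defines a secured variable named AWS_SECRET, your code can read it with `os.environ`. A minimal sketch, assuming that variable name; the `get_secret` helper is purely illustrative, not a Bitbucket API:

```python
import os

def get_secret(name: str) -> str:
    """Read a secured pipeline variable from the environment.

    In Bitbucket Pipelines, secured repository variables are exposed
    to the build as ordinary environment variables.
    """
    value = os.environ.get(name)
    if value is None:
        raise RuntimeError(
            f"{name} is not set; define it as a secured repository "
            "variable in your Bitbucket Pipelines settings"
        )
    return value

# In the pipeline this would pick up the secured variable:
# AWS_SECRET = get_secret("AWS_SECRET")
```

Locally you can keep a `keys.py` that sets the same environment variables (or keep reading the file when the variable is absent), so the rest of the codebase stays unchanged.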
I had added this as a comment to Philip's answer, but code sections don't display correctly in comments.
If you really want to minimise the changes to your code, you could try putting your entire keys.py file into a single secured variable by base64-encoding it:
base64 < keys.py
Paste the output of that command into the value of a secured variable named KEYS_PY, then decode it in your bitbucket-pipelines.yml script:
echo $KEYS_PY | base64 --decode > keys.py
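In context, a minimal bitbucket-pipelines.yml might look like this (the image and the test command are placeholders for whatever your build actually uses):

```yaml
image: python:3.12

pipelines:
  default:
    - step:
        name: Test
        script:
          # KEYS_PY is the secured variable holding the base64-encoded file
          - echo $KEYS_PY | base64 --decode > keys.py
          - python -m pytest
```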
This may or may not work depending on how large "keys.py" is.
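You can sanity-check the round trip locally before relying on it in the pipeline. A sketch assuming GNU coreutils base64 (on macOS the decode flag may be -D rather than --decode), with a stand-in keys.py:

```shell
# Create a stand-in for your real keys.py
printf 'AWS_SECRET = "example"\n' > keys.py

# Encode it; this is the value you would paste into the KEYS_PY variable
ENCODED=$(base64 < keys.py)

# Decode it back, the same way the pipeline script does
echo "$ENCODED" | base64 --decode > keys_restored.py

# The restored file should match the original exactly
diff keys.py keys_restored.py && echo "round trip OK"
```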