SSH Key Management

Daniel Fuentes January 4, 2024

I'm working with Bitbucket Pipelines.

In the deployment step, I connect to two servers (Integration and QA) using an SSH key.

What is the best practice regarding SSH key management?
1 SSH key per repo-server?
1 SSH key ("MASTER") for all repos and servers?

 


1 answer

Theodora Boudale
Atlassian Team
January 8, 2024

Hi Daniel!

Using one SSH key for all repos and servers may be easier to set up. However, if the key is compromised (e.g. someone leaks it intentionally or unintentionally), a bad actor may get access to both servers. You would then need to replace the SSH key in all repos and on both servers.

You could use one key per server for all repos (so two SSH key pairs in total). If a key becomes compromised, a bad actor may get access to one server only and you would need to replace it in all four repos and in the server where it belonged.

Using one key per repo-server (eight SSH key pairs) is a bit more work. If a key becomes compromised, a bad actor may get access to only one server. The key would need to be replaced in one repo and one server.

The second and third options may be best to minimize the impact of a potential leak.

Kind regards,
Theodora

Daniel Fuentes January 8, 2024

Thank you very much, Theodora. I get it now.

I was reading in the Atlassian documentation about how to do the one-key-per-repo-server approach using the my_known_hosts file, where I would need to add that file to each repo with all the private keys.

That approach also seems too risky, especially if the repository is public.

Is there another way to accomplish the one key per repo-server approach?

Once again, thank you very much for your time and support. I really appreciate it.

Theodora Boudale
Atlassian Team
January 9, 2024

Hi Daniel,

I checked the documentation, and adding a my_known_hosts file to the repo shouldn't be necessary. You can add the servers' domain names or IP addresses to the repo from Repository settings > SSH keys, in the Known hosts section, and fetch their fingerprints. If you do this, the fingerprints will be added to the ~/.ssh/known_hosts file that is created in the Docker container where the build runs. I'm not sure why this section is there in the documentation; I will need to reach out internally to find out and update it.
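To illustrate, here is a minimal sketch of a deployment step that relies on that setup (the host name, user, and script name are placeholders I made up, not something from your setup):

pipelines:
  branches:
    main:
      - step:
          name: Deploy to Integration
          script:
            # The repository SSH key and the Known hosts fingerprints configured in
            # Repository settings > SSH keys are made available in the build container,
            # so ssh can connect without a my_known_hosts file committed to the repo.
            - ssh deploy@integration.example.com "./deploy.sh"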

If you use only one SSH key pair, this can be added to Repository settings > SSH keys on the UI. If you use more than one, then at least one of them needs to be added as a variable after you base64 encode it (the private key) and then decode it during the build. There are no other options.
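As a rough sketch of the variable approach (the variable name QA_SSH_KEY_B64, the key file name, and the server address are examples, not names Bitbucket defines for you): base64-encode the private key locally, store the output in a secured repository variable, then decode it in the step:

      - step:
          name: Deploy to QA
          script:
            # QA_SSH_KEY_B64 is a secured repository variable containing the
            # base64-encoded private key (e.g. the output of "base64 -w 0 qa_deploy_key"
            # run locally on Linux).
            - mkdir -p ~/.ssh
            - echo "$QA_SSH_KEY_B64" | base64 -d > ~/.ssh/qa_deploy_key
            - chmod 600 ~/.ssh/qa_deploy_key
            - ssh -i ~/.ssh/qa_deploy_key deploy@qa.example.com "./deploy.sh"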

The risk of the my_known_hosts file being public can be prevented by not adding the file to the repo and instead following the steps I mentioned earlier.

Please be mindful that whether the SSH key is added in Repository settings or as a variable, it can still be accessed by any user with write access to the repo if they add a cat command to the yml file.
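For example, a hypothetical step like the one below (the exact path of the injected key file may differ) would print the key in the build log:

      - step:
          script:
            # Anyone with write access to the repo could add something like this:
            - cat ~/.ssh/id_rsa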

If the steps that deploy have been defined as deployment steps as per this doc and you're on the Premium plan, you can use deployment permissions and enable the option Only allow admins to deploy to this environment, so that an admin can review the yml file before starting the deployment.
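For reference, a step counts as a deployment step when it has the deployment keyword, for example (the environment name here is just an example; environments are configured under Repository settings > Deployments):

      - step:
          name: Deploy to QA
          deployment: qa   # marks this as a deployment step, so deployment permissions apply
          script:
            - ./deploy.sh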

Please feel free to reach out if you have any questions.

Kind regards,
Theodora
