
self hosted runner mount ssh key

Deleted user December 16, 2021

I am trying to move my Bitbucket Pipelines to self-hosted runners, and I need SSH keys. On hosted runners the SSH key is mounted automatically. With self-hosted runners I did not find my id_rsa, so I added my private key and mounted it into my dind container and runner container (I can see the keys when I ssh into the pods). However, when I run a pipeline, that volume mount does not exist. How do I mount a Kubernetes secret into a self-hosted pipeline runner?

2 answers

2 votes
Edwin Knese
June 21, 2022

If somebody comes here with the same problem: I recently found a bug that still exists in v1.326.


We were getting an error while accessing our private repo during the pipeline build, but only on local runners; cloud runners were working.
The error was something like this:


Load key /tmp/<uuid>/ssh/id_rsa: invalid format


The answer is that the SSH key added under settings needs to have a newline at the end; without one, the local runners aren't able to load the SSH key, as mentioned here.
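
If you want to double-check a key file before pasting it into the settings page, something like this works on the machine where the key file lives (just a sketch, assuming the key is saved as id_rsa):

tail -c 1 id_rsa | od -An -c                     # prints \n if the file already ends with a newline
[ -n "$(tail -c 1 id_rsa)" ] && echo >> id_rsa   # appends a newline only if it is missing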
 
Hopefully I helped someone with this tip :)

Przemyslaw Nowak
September 16, 2022

@Edwin Knese we had exactly the same issue as you. I've tried both the Kubernetes and Linux VM runners, and in both cases the problem was there.

On public runners the issue doesn't exist with exactly the same SSH key, so Bitbucket must be doing some adjustment on their machines.

The problem still exists on the v1.363 runner.

@Caroline R please open an internal ticket to fix it, or at least add a note to the runner configuration docs, because a lot of time gets wasted looking for a solution to this (and from @Edwin Knese's comments I assume it is a known issue).

Thanks

Theodora Boudale
Atlassian Team
September 19, 2022

Hi @Edwin Knese and @Przemyslaw Nowak,

I am unable to reproduce the issue you are reporting. If you're still experiencing this, please create a new question via https://community.atlassian.com/t5/forums/postpage/board-id/bitbucket-questions and provide some more details like the following, and we can look into this:

- what type of runners you are using (Linux Docker or Linux Shell) and what version
- whether you are adding SSH keys via Repository settings > SSH keys or if you're using Repository variables
- your bitbucket-pipelines.yml configuration

Kind regards,
Theodora

lukas
November 13, 2022

@Edwin Knese Wow, you really helped me. I added the newline to the key and suddenly it is working!

Marcus March 12, 2024

I have the same problem with runner version: 1.561.

We want to make a git submodule update and get this error:

Load key "/tmp/bitbucket-runner-72/028f872c-e360-5430-9ee8-af9bb1904f7b/ssh/id_rsa": invalid format

 

I looked at the private key that is located there and it seems like the line break at the end is missing.

So how did you add that newline at the end? In the Bitbucket UI I'm not able to change the private key, and in the pipeline I'm not able to access that file.

Marcus March 12, 2024

Okay, I just had to delete the old SSH key in the Bitbucket UI and create a new one. The new public key has a different format; it seems like Bitbucket internally generates a new type of key now.

0 votes
Caroline R
Atlassian Team
December 17, 2021

Hi, @[deleted]! 

Thank you for reaching out to Atlassian Community!

About the first part of your question, if you are referring to the SSH key from Repository settings > SSH keys under the Pipelines section, then I would like to ask you to confirm that your runner is up to date, as we have released support for SSH keys and known hosts on runners in the latest versions.

In order to upgrade it, you can stop the runner and use the command below to get the latest version:

docker image pull docker-public.packages.atlassian.com/sox/atlassian/bitbucket-pipelines-runner:1
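
In case it helps, the whole upgrade cycle on the machine where a Docker-based runner is running looks roughly like this (the container name runner-<uuid> is only a placeholder, and the exact run command is the one shown on the runner setup page in your repository settings):

docker ps                     # find the runner container and note its name
docker stop runner-<uuid>     # stop the old runner
docker rm runner-<uuid>
docker image pull docker-public.packages.atlassian.com/sox/atlassian/bitbucket-pipelines-runner:1
# then start it again with the original "docker container run ..." command from the setup page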

Then, about the second part of your question, when you say you ran a pipeline and that volume mount didn’t exist, and how to mount Kubernetes secret into a self-hosted pipeline runner, could you please give us more details about your request? You can share your configuration with us, so we can properly assist you.

Looking forward to hearing from you.

Kind regards,
Caroline

Deleted user December 19, 2021

Thanks for the reply. So yes, I am referring to the SSH key in the Pipelines SSH keys settings.

This is the image I used:

docker-public.packages.atlassian.com/sox/atlassian/bitbucket-pipelines-runner

I imagine that is the latest.


If I run the pipeline via Bitbucket, I see the SSH key in

/opt/atlassian/pipelines/agent/ssh/id_rsa.

If I run it through the self-hosted runner, I do not see it there.
I have mounted the k8s secret with the id_rsa, but I would prefer to get it from Bitbucket directly. So I do see my id_rsa in /tmp/ssh, but that is from me mounting it outside of Bitbucket.

So my goal is to run the job in my own network using the SSH key passed on from the Bitbucket pipeline (one location for key pair maintenance instead of two). My pipeline also does a docker build, so I would love to see the id_rsa in the container that does the docker build, so I can pull the library repo we manage.
My very insecure way to manage that today is to pass the id_rsa to the runner as an artifact, but that is not what I want long term. This is only a POC at this point.
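
One option I'm looking at for the docker build part is BuildKit's ssh mount, so the key never ends up in an image layer. Just a sketch, assuming the dind service supports BuildKit, that the base image has git and openssh-client, and that the key was written to id_rsa by the GetRSA step (the base image and repo URL are placeholders):

# Dockerfile ("# syntax" must be the very first line)
# syntax=docker/dockerfile:1
FROM atlassian/default-image:3
RUN mkdir -p -m 0700 ~/.ssh && ssh-keyscan bitbucket.org >> ~/.ssh/known_hosts
RUN --mount=type=ssh git clone git@bitbucket.org:myteam/mylib.git

# bitbucket-pipelines.yml script lines in the build step
- eval $(ssh-agent) && ssh-add id_rsa
- export DOCKER_BUILDKIT=1
- docker build --ssh default -t $image_name .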

Below is my runner k8s yaml and the pipeline

###
Pipeline

image: atlassian/default-image:3

definitions:
  steps:
    - step: &QueryECR
        name: query ecr
        runs-on:
          - 'LOCAL RUNNER'
        image: atlassian/pipelines-awscli
        services:
          - docker
        script:
          # aws login
          - echo "YES" > does_exist
          - eval $(aws ecr get-login --region ${AWS_DEFAULT_REGION} --no-include-email)
          - aws ecr describe-images --repository-name $image_name --image-ids imageTag=${BITBUCKET_COMMIT} || echo "NO" > does_exist
        artifacts:
          - does_exist
    - step: &GetRSA
        name: get rsa
        script:
          - cat /opt/atlassian/pipelines/agent/ssh/id_rsa > id_rsa
          - chmod 400 id_rsa
        artifacts:
          - id_rsa
    - step: &BuildAndPush
        name: Push image to ECR
        runs-on:
          - 'LOCAL RUNNER'
        deployment: Development
        script:
          - if [ `cat does_exist` == "YES" ]; then exit 0 ; fi
          # build the image
          - docker build -t $image_name .
          # use the pipe to push the image to AWS ECR
          - pipe: atlassian/aws-ecr-push-image:1.4.2
            variables:
              AWS_ACCESS_KEY_ID: $AWS_ACCESS_KEY_ID
              AWS_SECRET_ACCESS_KEY: $AWS_SECRET_ACCESS_KEY
              AWS_DEFAULT_REGION: $AWS_DEFAULT_REGION
              IMAGE_NAME: $image_name
              TAGS: '${BITBUCKET_COMMIT}'
        services:
          - docker

pipelines:
  tags:
    dev3:
      - step: *QueryECR
      - step: *GetRSA
      - step: *BuildAndPush
    perf:
      - step: *QueryECR
      - step: *GetRSA
      - step: *BuildAndPush


###K8s

apiVersion: batch/v1
kind: Job
metadata:
  name: runner
spec:
  template:
    metadata:
      labels:
        accountUuid: FOO
        runnerUuid: FOO
        repositoryUuid: FOO
    spec:
      containers:
        - name: bitbucket-k8s-runner
          image: docker-public.packages.atlassian.com/sox/atlassian/bitbucket-pipelines-runner
          env:
            - name: ACCOUNT_UUID
              value: "FOO"
            - name: RUNNER_UUID
              value: "FOO"
            - name: REPOSITORY_UUID
              value: "FOO"
            - name: OAUTH_CLIENT_ID
              valueFrom:
                secretKeyRef:
                  name: runner-oauth-credentials
                  key: oauthClientId
            - name: OAUTH_CLIENT_SECRET
              valueFrom:
                secretKeyRef:
                  name: runner-oauth-credentials
                  key: oauthClientSecret
            - name: WORKING_DIRECTORY
              value: "/tmp"
          volumeMounts:
            - name: tmp
              mountPath: /tmp
            - name: docker-containers
              mountPath: /var/lib/docker/containers
              readOnly: true
            - name: var-run
              mountPath: /var/run
            - name: ssh-key
              readOnly: true
              mountPath: /tmp/ssh
        - name: docker-in-docker
          image: docker:20.10.7-dind
          securityContext:
            privileged: true
          volumeMounts:
            - name: tmp
              mountPath: /tmp
            - name: docker-containers
              mountPath: /var/lib/docker/containers
            - name: var-run
              mountPath: /var/run
            - name: ssh-key
              readOnly: true
              mountPath: /tmp/ssh
      restartPolicy: OnFailure
      volumes:
        - name: tmp
        - name: docker-containers
        - name: var-run
        - name: ssh-key
          secret:
            secretName: ssh-key
  backoffLimit: 6
  completions: 1
  parallelism: 1




Theodora Boudale
Atlassian Team
December 20, 2021

Hi Olivier,

Please allow me to step in as Caroline is out of office.

The image you are using for the runner is the correct one; however, you may need to update it to the latest version. If you open the page of a Pipelines build on the Bitbucket website, select a step that is running on your runner, and then select the word 'Runner' at the beginning of the log, you will see info about the runner version. The latest one at the moment is 1.262.

If you are running an older version, you can upgrade it following the steps that Caroline mentioned in her previous reply. The latest version of the runner supports SSH keys that are added to a repo's Repository settings > SSH keys under the Pipelines section.

Please note that when you run a Pipelines build on your runner, the SSH key will not be located at /opt/atlassian/pipelines/agent/ssh/.

The path where the private SSH key is located includes the runner uuid, which I'm afraid is not available during the build, so it will need to be hardcoded in your yml file. You can find it the following way:

- Run the command docker ps on the machine where the runner is running
- Find the container for the runner in the output and note its name; it should be something like runner-76b247e7-b925-5e7b-9da2-1cda14c4ff2c

For the example above, if the name of the runner is runner-76b247e7-b925-5e7b-9da2-1cda14c4ff2c and you have defined SSH keys in your Repository's settings and you are running the latest version of the runner, the private SSH key should be located at

/tmp/76b247e7-b925-5e7b-9da2-1cda14c4ff2c/ssh/id_rsa
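
I haven't tested this exact yml, but as a rough sketch, using the example uuid above (replace it with your own runner's uuid), the GetRSA step from your configuration would then have script lines like:

- cat /tmp/76b247e7-b925-5e7b-9da2-1cda14c4ff2c/ssh/id_rsa > id_rsa
- chmod 400 id_rsa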

If you have any questions, please feel free to let me know.

Kind regards,
Theodora

