
Bitbucket Runner - self-hosted on Docker - non-root

Hey,

I want to run the Bitbucket runner with a non-root account by adding the user flag on start.

My docker-compose file looks similar to this:

version: '3.7'

services:
  bitbucket-runner-igon:
    image: docker-public.packages.atlassian.com/sox/atlassian/bitbucket-pipelines-runner:1
    container_name: ${RUNNER1_NAME}
    user: "1001:999"   # non-root UID and the docker group's GID
    restart: always
    network_mode: "host"
    security_opt:
      - no-new-privileges:true
    environment:
      - RUNTIME_PREREQUISITES_ENABLED=true
      - WORKING_DIRECTORY=/tmp
      - ACCOUNT_UUID=${RUNNER1_ACCOUNT_UUID}
      - RUNNER_UUID=${RUNNER1_RUNNER_UUID}
      - OAUTH_CLIENT_ID=${RUNNER1_OAUTH_CLIENT_ID}
      - OAUTH_CLIENT_SECRET=${RUNNER1_OAUTH_CLIENT_SECRET}
    volumes:
      - ${RUNNER_WORKING_DIRECTORY}/${RUNNER1_NAME}:/tmp
      - /var/run/docker.sock:/var/run/docker.sock   # host Docker socket, used to spawn step containers
      - /var/lib/docker/containers:/var/lib/docker/containers:ro   # read-only access to container logs
My .env file looks like the following:

RUNNER_WORKING_DIRECTORY=/srv/bitbucket-runner-docker/workingdirectory
RUNNER1_NAME=runner-RUNNER1
RUNNER1_ACCOUNT_UUID={00000000-0000-0000-0000-000000000000}
RUNNER1_RUNNER_UUID={00000000-0000-0000-0000-000000000000}
RUNNER1_OAUTH_CLIENT_ID=<SomeValidOauthClientId>
RUNNER1_OAUTH_CLIENT_SECRET=<SomeValidOauthClientSecret>
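I start the stack from the directory that contains both files, so Compose picks up the .env file automatically:

docker compose up -d
# or, with the standalone binary: docker-compose up -d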

On the Linux VM I did the following (roughly the commands sketched below):

I added a new user with UID 1001.
I added that user to the docker group (GID 999).
I changed the group of /var/lib/docker/containers and its subfolders from root to docker.
I changed the owner of the runner working directory and its subfolders to the user with UID 1001.
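A sketch of that host-side setup; the user name runner is hypothetical, everything else matches the values above:

# create the user with UID 1001 and add it to the docker group (GID 999)
sudo useradd --uid 1001 runner
sudo usermod -aG docker runner

# let the docker group read the container log directory mounted into the runner
sudo chgrp -R docker /var/lib/docker/containers

# hand the runner working directory to UID 1001
sudo chown -R 1001 /srv/bitbucket-runner-docker/workingdirectory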
The runner container starts, but during a pipeline run I get the following error: 

[2023-02-10 12:03:04,828] Pulling image bitbucketpipelines/azure-aks-deploy:latest.
[2023-02-10 12:03:04,828] Looking for auth in config for image Image{name=bitbucketpipelines/azure-aks-deploy:latest, runAsUser=None, auth=None} and found auth null
[2023-02-10 12:03:05,765] Pulling image docker-public.packages.atlassian.com/sox/atlassian/bitbucket-pipelines-docker-daemon:v20.10.18-prod-stable.
[2023-02-10 12:03:05,765] Looking for auth in config for image Image{name=docker-public.packages.atlassian.com/sox/atlassian/bitbucket-pipelines-docker-daemon:v20.10.18-prod-stable, runAsUser=None, auth=None} and found auth null
[2023-02-10 12:03:06,308] Pulling image docker-public.packages.atlassian.com/sox/atlassian/bitbucket-pipelines-auth-proxy:prod-stable.
[2023-02-10 12:03:06,309] Looking for auth in config for image Image{name=docker-public.packages.atlassian.com/sox/atlassian/bitbucket-pipelines-auth-proxy:prod-stable, runAsUser=None, auth=None} and found auth null
[2023-02-10 12:03:06,965] Pulling image k8s-docker.packages.atlassian.com/pause:3.8.
[2023-02-10 12:03:06,965] Looking for auth in config for image Image{name=k8s-docker.packages.atlassian.com/pause:3.8, runAsUser=None, auth=None} and found auth null
[2023-02-10 12:03:07,722] Removing container 9d53fcf6-f0b2-5972-bce6-19881f318a26_7fefd5ba-75cc-4672-afdf-09de73251b18_pause
[2023-02-10 12:03:07,726] Updating step progress to CLONING.
[2023-02-10 12:03:07,727] Creating container 9d53fcf6-f0b2-5972-bce6-19881f318a26_7fefd5ba-75cc-4672-afdf-09de73251b18_pause.
[2023-02-10 12:03:07,984] Generating clone script.
[2023-02-10 12:03:07,987] Creating container 9d53fcf6-f0b2-5972-bce6-19881f318a26_7fefd5ba-75cc-4672-afdf-09de73251b18_clone.
[2023-02-10 12:03:07,987] Executing clone script in clone container.
[2023-02-10 12:03:20,517] Updating runner state to "ONLINE".
[2023-02-10 12:03:31,055] Starting container.
[2023-02-10 12:03:31,059] Starting container.
[2023-02-10 12:03:49,673] Removing container 9d53fcf6-f0b2-5972-bce6-19881f318a26_7fefd5ba-75cc-4672-afdf-09de73251b18_system_docker
[2023-02-10 12:03:49,682] Creating container 9d53fcf6-f0b2-5972-bce6-19881f318a26_7fefd5ba-75cc-4672-afdf-09de73251b18_system_docker.
[2023-02-10 12:03:49,701] Removing container 9d53fcf6-f0b2-5972-bce6-19881f318a26_7fefd5ba-75cc-4672-afdf-09de73251b18_system_auth-proxy
[2023-02-10 12:03:49,706] Creating container 9d53fcf6-f0b2-5972-bce6-19881f318a26_7fefd5ba-75cc-4672-afdf-09de73251b18_system_auth-proxy.
[2023-02-10 12:03:50,416] Container log not found: /var/lib/docker/containers/4a6669bffce66dfb6b063fe6ae9fce876d86d5b0f2249188b9b5cb76648fd249/4a6669bffce66dfb6b063fe6ae9fce876d86d5b0f2249188b9b5cb76648fd249-json.log
[2023-02-10 12:03:50,416] Waiting on container to exit.
[2023-02-10 12:03:50,417] Creating exec into container.
[2023-02-10 12:03:50,517] Updating runner state to "ONLINE".
[2023-02-10 12:03:52,691] Starting container.
[2023-02-10 12:03:53,185] An error occurred whilst creating container exec.
com.github.dockerjava.api.exception.ConflictException: Status 409: {"message":"Container 4a6669bffce66dfb6b063fe6ae9fce876d86d5b0f2249188b9b5cb76648fd249 is not running"}

at com.github.dockerjava.netty.handler.HttpResponseHandler.channelRead0(HttpResponseHandler.java:101)
at com.github.dockerjava.netty.handler.HttpResponseHandler.channelRead0(HttpResponseHandler.java:32)
at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:99)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
at io.netty.handler.logging.LoggingHandler.channelRead(LoggingHandler.java:280)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
at io.netty.channel.CombinedChannelDuplexHandler$DelegatingChannelHandlerContext.fireChannelRead(CombinedChannelDuplexHandler.java:436)
at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:336)
at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:308)
at io.netty.channel.CombinedChannelDuplexHandler.channelRead(CombinedChannelDuplexHandler.java:251)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919)
at io.netty.channel.epoll.AbstractEpollStreamChannel$EpollStreamUnsafe.epollInReady(AbstractEpollStreamChannel.java:800)
at io.netty.channel.epoll.EpollDomainSocketChannel$EpollDomainUnsafe.epollInReady(EpollDomainSocketChannel.java:138)
at io.netty.channel.epoll.EpollEventLoop.processReady(EpollEventLoop.java:499)
at io.netty.channel.epoll.EpollEventLoop.run(EpollEventLoop.java:397)
at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:997)
at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
at java.base/java.lang.Thread.run(Unknown Source)
[2023-02-10 12:03:53,188] Removing container 9d53fcf6-f0b2-5972-bce6-19881f318a26_7fefd5ba-75cc-4672-afdf-09de73251b18_build
[2023-02-10 12:03:53,191] Not uploading caches. (numberOfCaches: 0, resultOrError: ERROR)
[2023-02-10 12:03:53,201] Not uploading artifacts. (numberOfArtifacts: 0, resultOrError: ERROR)
[2023-02-10 12:03:53,202] Updating step progress to PARSING_TEST_RESULTS.
[2023-02-10 12:03:53,439] Test report processing complete.
[2023-02-10 12:03:53,439] Removing container 9d53fcf6-f0b2-5972-bce6-19881f318a26_7fefd5ba-75cc-4672-afdf-09de73251b18_clone
[2023-02-10 12:03:53,466] Removing container 9d53fcf6-f0b2-5972-bce6-19881f318a26_7fefd5ba-75cc-4672-afdf-09de73251b18_clone
[2023-02-10 12:03:53,469] Removing container 9d53fcf6-f0b2-5972-bce6-19881f318a26_7fefd5ba-75cc-4672-afdf-09de73251b18_build
[2023-02-10 12:03:53,478] Removing container 9d53fcf6-f0b2-5972-bce6-19881f318a26_7fefd5ba-75cc-4672-afdf-09de73251b18_system_docker
[2023-02-10 12:03:53,542] Removing container 9d53fcf6-f0b2-5972-bce6-19881f318a26_7fefd5ba-75cc-4672-afdf-09de73251b18_system_auth-proxy
[2023-02-10 12:03:53,750] Appending log line to main log.
[2023-02-10 12:03:54,813] Removing container 9d53fcf6-f0b2-5972-bce6-19881f318a26_7fefd5ba-75cc-4672-afdf-09de73251b18_pause
[2023-02-10 12:03:56,916] Updating step progress to COMPLETING_LOGS.
[2023-02-10 12:03:57,216] Shutting down log uploader.
[2023-02-10 12:03:57,505] Tearing down directories.
[2023-02-10 12:03:57,506] Cancelling timeout
[2023-02-10 12:03:57,507] Completing step with result Result{status=ERROR, error=Some(Error{key='runner.bitbucket-pipelines.clone-container-failure', message='Status 409: {"message":"Container 4a6669bffce66dfb6b063fe6ae9fce876d86d5b0f2249188b9b5cb76648fd249 is not running"}
', arguments={}})}.
[2023-02-10 12:03:57,773] Setting runner state to not executing step.
[2023-02-10 12:03:57,773] Waiting for next step.

Is there a (documented) way of running the Bitbucket runner within Docker as a non-root user?

0 answers
