Self-hosted Runner failing: An error occurred whilst creating container exec.

Jacky March 5, 2022

Hi,

I'm having issues when starting builds on the self-hosted runner; they fail with the error `An error occurred whilst creating container exec.`.

I've attached the Docker Compose file and the log from the runner.

Thanks!

 

Docker Compose file

```
services:
  runner:
    image: docker-public.packages.atlassian.com/sox/atlassian/bitbucket-pipelines-runner:1
    container_name: runner-b3cc0c7e-8970-5dbc-8163-cb6cb3591fca
    volumes:
      - "/tmp:/volume1/docker/runner/tmp"
      - "/var/run/docker.sock:/var/run/docker.sock"
      - "/var/lib/docker/containers:/var/lib/docker/containers:ro"
    environment:
      ACCOUNT_UUID:
      RUNNER_UUID: "{b3cc0c7e-8970-5dbc-8163-cb6cb3591fca}"
      RUNTIME_PREREQUISITES_ENABLED: "true"
      OAUTH_CLIENT_ID:
      OAUTH_CLIENT_SECRET:
      WORKING_DIRECTORY: "/tmp"
```
Runner Log
```

[2022-03-06 05:33:50,537] Executing clone script in clone container.
[2022-03-06 05:33:51,159] Starting container.
[2022-03-06 05:33:51,595] Starting container.
[2022-03-06 05:33:55,111] Removing container b3cc0c7e-8970-5dbc-8163-cb6cb3591fca_5546bacb-0dc5-4216-9008-b233f88dd457_system_auth-proxy
[2022-03-06 05:33:55,120] Creating container b3cc0c7e-8970-5dbc-8163-cb6cb3591fca_5546bacb-0dc5-4216-9008-b233f88dd457_system_auth-proxy.
[2022-03-06 05:33:55,129] Removing container b3cc0c7e-8970-5dbc-8163-cb6cb3591fca_5546bacb-0dc5-4216-9008-b233f88dd457_system_docker
[2022-03-06 05:33:55,140] Creating container b3cc0c7e-8970-5dbc-8163-cb6cb3591fca_5546bacb-0dc5-4216-9008-b233f88dd457_system_docker.
[2022-03-06 05:33:55,954] Adding container log: /var/lib/docker/containers/92b5aaf254200f6969a3c3ae363396afe48eeec3f4c0e5d1be3d06354793c0fa/92b5aaf254200f6969a3c3ae363396afe48eeec3f4c0e5d1be3d06354793c0fa-json.log
[2022-03-06 05:33:55,973] Waiting on container to exit.
[2022-03-06 05:33:55,983] Creating exec into container.
[2022-03-06 05:33:56,294] Appending log line to main log.
[2022-03-06 05:33:56,523] Starting container.
[2022-03-06 05:33:57,736] Starting container.
[2022-03-06 05:33:58,198] Adding container log: /var/lib/docker/containers/abe317b382d74b25a35f4310395d546c5fc2187e17cf807cddb96c94a851b9f5/abe317b382d74b25a35f4310395d546c5fc2187e17cf807cddb96c94a851b9f5-json.log
[2022-03-06 05:33:58,200] Waiting on container to exit.
[2022-03-06 05:33:58,296] Appending log line to log: {b1fe7f2b-009f-4056-be25-981fe30fc8ef}.
[2022-03-06 05:33:58,651] An error occurred whilst creating container exec.
com.github.dockerjava.api.exception.ConflictException: Status 409: {"message":"Container 92b5aaf254200f6969a3c3ae363396afe48eeec3f4c0e5d1be3d06354793c0fa is not running"}

at com.github.dockerjava.netty.handler.HttpResponseHandler.channelRead0(HttpResponseHandler.java:101)
at com.github.dockerjava.netty.handler.HttpResponseHandler.channelRead0(HttpResponseHandler.java:32)
at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:99)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
at io.netty.handler.logging.LoggingHandler.channelRead(LoggingHandler.java:280)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
at io.netty.channel.CombinedChannelDuplexHandler$DelegatingChannelHandlerContext.fireChannelRead(CombinedChannelDuplexHandler.java:436)
at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:324)
at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:296)
at io.netty.channel.CombinedChannelDuplexHandler.channelRead(CombinedChannelDuplexHandler.java:251)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919)
at io.netty.channel.epoll.AbstractEpollStreamChannel$EpollStreamUnsafe.epollInReady(AbstractEpollStreamChannel.java:795)
at io.netty.channel.epoll.EpollDomainSocketChannel$EpollDomainUnsafe.epollInReady(EpollDomainSocketChannel.java:138)
at io.netty.channel.epoll.EpollEventLoop.processReady(EpollEventLoop.java:480)
at io.netty.channel.epoll.EpollEventLoop.run(EpollEventLoop.java:378)
at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:986)
at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
at java.base/java.lang.Thread.run(Unknown Source)
[2022-03-06 05:33:58,680] Removing container b3cc0c7e-8970-5dbc-8163-cb6cb3591fca_5546bacb-0dc5-4216-9008-b233f88dd457_build
[2022-03-06 05:33:58,698] Not uploading caches. (numberOfCaches: 1, resultOrError: ERROR)
[2022-03-06 05:33:58,703] Not uploading artifacts. (numberOfArtifacts: 1, resultOrError: ERROR)
[2022-03-06 05:33:58,704] Updating step progress to PARSING_TEST_RESULTS.
[2022-03-06 05:33:58,964] Test report processing complete.
[2022-03-06 05:33:58,965] Removing container b3cc0c7e-8970-5dbc-8163-cb6cb3591fca_5546bacb-0dc5-4216-9008-b233f88dd457_clone
[2022-03-06 05:33:59,290] Appending log line to main log.
[2022-03-06 05:33:59,589] Removing container b3cc0c7e-8970-5dbc-8163-cb6cb3591fca_5546bacb-0dc5-4216-9008-b233f88dd457_clone
[2022-03-06 05:33:59,595] Removing container b3cc0c7e-8970-5dbc-8163-cb6cb3591fca_5546bacb-0dc5-4216-9008-b233f88dd457_build
[2022-03-06 05:33:59,604] Removing container b3cc0c7e-8970-5dbc-8163-cb6cb3591fca_5546bacb-0dc5-4216-9008-b233f88dd457_system_auth-proxy
[2022-03-06 05:34:00,442] Removing container b3cc0c7e-8970-5dbc-8163-cb6cb3591fca_5546bacb-0dc5-4216-9008-b233f88dd457_system_docker
[2022-03-06 05:34:02,024] Removing container b3cc0c7e-8970-5dbc-8163-cb6cb3591fca_5546bacb-0dc5-4216-9008-b233f88dd457_pause
[2022-03-06 05:34:04,042] Updating step progress to COMPLETING_LOGS.
[2022-03-06 05:34:04,282] Shutting down log uploader.
[2022-03-06 05:34:04,287] Tearing down directories.
[2022-03-06 05:34:04,294] Cancelling timeout
[2022-03-06 05:34:04,303] Completing step with result Result{status=ERROR, error=Some(Error{key='runner.bitbucket-pipelines.clone-container-failure', message='Status 409: {"message":"Container 92b5aaf254200f6969a3c3ae363396afe48eeec3f4c0e5d1be3d06354793c0fa is not running"}
', arguments={}})}.
[2022-03-06 05:34:04,574] Setting runner state to not executing step.
[2022-03-06 05:34:04,575] Waiting for next step.

```

3 answers

0 votes
Nigel Sim June 6, 2023

I had the same issue. Sadly, the runner does not log enough detail to diagnose this easily, but for me the issue was the WORKING_DIRECTORY being out of whack with the mounted volumes.

From your example:

 

```
    volumes:
      - "/tmp:/volume1/docker/runner/tmp"
    environment:
      WORKING_DIRECTORY: "/tmp"
```

All of these need to be the same. My mental model is that the runner shares the working directory with its child containers, and because it doesn't know what the mount points are, it expects WORKING_DIRECTORY to be the same inside and outside the container. So, this should work:

```
    volumes:
      - "/volume1/docker/runner/tmp:/volume1/docker/runner/tmp"
    environment:
      WORKING_DIRECTORY: "/volume1/docker/runner/tmp"
```

I'm doing something similar on GCP, using /mnt/disks/bitbucket where that is the mount point of my data disk.
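The rule above can be checked mechanically before starting the runner. This is a minimal sketch (not part of the runner itself; the helper name `working_dir_ok` and the list-of-strings volume format are my own assumptions) that verifies WORKING_DIRECTORY matches a volume mounted at the same path on both the host and container side:

```python
def working_dir_ok(volumes, working_directory):
    """Check Nigel's rule: WORKING_DIRECTORY must be mounted at the
    identical path on the host and inside the runner container.

    volumes: list of Compose-style "host:container[:opts]" strings.
    """
    for v in volumes:
        parts = v.split(":")
        host, container = parts[0], parts[1]
        if host == container == working_directory:
            return True
    return False

# Jacky's original config: /tmp is mounted to a different container path
print(working_dir_ok(["/tmp:/volume1/docker/runner/tmp"], "/tmp"))  # False

# The fixed config: identical path on both sides of the mount
print(working_dir_ok(
    ["/volume1/docker/runner/tmp:/volume1/docker/runner/tmp"],
    "/volume1/docker/runner/tmp"))  # True
```

With the original config the check fails, which is consistent with the clone container dying before the runner can exec into it.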
0 votes
Frédéric Nadeau
April 6, 2022

I have the exact same issue. 

Unlike Jacky, I do not use a Compose file; I run the container directly:

```
docker container run -it \
  -v /tmp:/tmp \
  -v /var/run/docker.sock:/var/run/docker.sock \
  -v /var/lib/docker/containers:/var/lib/docker/containers:ro \
  -e ACCOUNT_UUID=redacted \
  -e RUNNER_UUID=redacted \
  -e RUNTIME_PREREQUISITES_ENABLED=true \
  -e OAUTH_CLIENT_ID=redacted \
  -e OAUTH_CLIENT_SECRET=redacted \
  -e WORKING_DIRECTORY=/tmp \
  --name runner-redacted \
  docker-public.packages.atlassian.com/sox/atlassian/bitbucket-pipelines-runner:1
```

And the logs show:

[2022-04-07 01:50:52,660] Waiting on container to exit. 
[2022-04-07 01:50:52,890] An error occurred whilst creating container exec.
com.github.dockerjava.api.exception.ConflictException: Status 409: {"message":"Container 4d84c270a8f5ebfdf1afbda8648a8a432e096ed83fb3da136ce36e405258ba5e is not running"}

       at com.github.dockerjava.netty.handler.HttpResponseHandler.channelRead0(HttpResponseHandler.java:101)
       at com.github.dockerjava.netty.handler.HttpResponseHandler.channelRead0(HttpResponseHandler.java:32)
       at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:99)
       at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
       at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
       at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
       at io.netty.handler.logging.LoggingHandler.channelRead(LoggingHandler.java:280)
       at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
       at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
       at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
       at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103)
       at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
       at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
       at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
       at io.netty.channel.CombinedChannelDuplexHandler$DelegatingChannelHandlerContext.fireChannelRead(CombinedChannelDuplexHandler.java:436)
       at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:327)
       at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:299)
       at io.netty.channel.CombinedChannelDuplexHandler.channelRead(CombinedChannelDuplexHandler.java:251)
       at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
       at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
       at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
       at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410)
       at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
       at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
       at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919)
       at io.netty.channel.epoll.AbstractEpollStreamChannel$EpollStreamUnsafe.epollInReady(AbstractEpollStreamChannel.java:800)
       at io.netty.channel.epoll.EpollDomainSocketChannel$EpollDomainUnsafe.epollInReady(EpollDomainSocketChannel.java:138)
       at io.netty.channel.epoll.EpollEventLoop.processReady(EpollEventLoop.java:480)
       at io.netty.channel.epoll.EpollEventLoop.run(EpollEventLoop.java:378)
       at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:986)
       at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
       at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
       at java.base/java.lang.Thread.run(Unknown Source)

Sometimes it doesn't crash, but the pipeline that triggered the build is stuck in "cloning". Beyond the runner version (current/running) there is nothing, even after 30 minutes.

0 votes
Ankit Gupta
Contributor
March 6, 2022

Hi,

Where is this Compose file located? How did you run the self-hosted runner from your pipeline?

If it's something related to the runner, it makes sense to recreate it (delete it in the settings and then re-run the container command).
