

Problems using pipes on a self-hosted runner when deploying to Beanstalk


I'm trying to run the following pipeline:

This causes an error in Bitbucket Pipelines (docker: sed: write error). I don't know where this error comes from. The configuration works when I run it on Bitbucket's own infrastructure, by removing the runs-on keyword and variables, and it then deploys the right version to Beanstalk.

Where does this error come from and how can I fix it?


test_pipeline:
  - step:
      name: Test
      runs-on:
        - 'self.hosted'
        - 'linux'
        - 'build'
      script:
        - pipe: atlassian/aws-elasticbeanstalk-deploy:1.0.2
          variables:
            AWS_DEFAULT_REGION: '<environment>'
            ENVIRONMENT_NAME: '<environment>'
            ZIP_FILE: '<some zip>'
            S3_BUCKET: '<s3bucket>'
            WAIT: 'true'




[Edit] Other pipelines that don't need a pipe run successfully without problems.

1 answer

1 accepted

0 votes
Answer accepted
Norbert C Atlassian Team Feb 23, 2022

Hi David,

Thank you for contacting Atlassian Support. My name is Norbert and I'm a Bitbucket Cloud Support Engineer; it's nice to meet you! Welcome to the Atlassian Community!

  • Can you let me know whether the configuration you posted is your full bitbucket-pipelines.yml file?
  • Also, would it be possible for you to give us the runner logs from the time period when the build fails?

You can check the logs by running the following command on the host machine:

docker logs -f runner-runnerUUID

Please let us know; we're here to help.

Best Regards,
Atlassian Bitbucket Cloud Support
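To find the exact container name to pass to the docker logs command above, you can list the containers on the runner host. A small sketch, assuming the Docker CLI is available on the host and the runner container's name starts with "runner" (as in the command above):

```shell
# List running containers whose names start with "runner";
# the suffix after "runner-" is the runner UUID.
docker ps --filter "name=runner" --format '{{.Names}}'

# Then follow that container's logs, e.g.:
# docker logs -f runner-<uuid>
```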

Hi @Norbert C

Thank you for responding! Here is the entire bitbucket-pipelines.yml and the docker log of a single execution.



- step:
    name: Test
    runs-on:
      - 'self.hosted'
      - 'linux'
      - 'build'
    script:
      - apt-get update
      - apt-get install zip -y
      - cd /opt/atlassian/pipelines/agent/build/Presentation/WeedIt.Web
      - dotnet publish ./WeedIt.Web.csproj --output ./bin/Release/publish --configuration "Release" --framework "net5.0" --runtime linux-x64 --self-contained false
      - cd ./bin/Release/publish
      - mkdir YodaLogAdapter
      - zip -r *
      - pipe: atlassian/aws-elasticbeanstalk-deploy:1.0.2
        variables:
          ENVIRONMENT_NAME: 'weeditwebportal-dev'
          S3_BUCKET: 'elasticbeanstalk-eu-west-1-891169162099'
          WAIT: 'true'

 docker logs:

[2022-02-23 10:59:31,491] Runner version: 1.299
[2022-02-23 10:59:31,526] Runner runtime: linux-docker
[2022-02-23 10:59:33,849] Copying Docker cli to working directory.
[2022-02-23 10:59:34,216] Starting websocket listening to RUNNER_UPDATED events.
[2022-02-23 10:59:34,330] Updating runner status to "ONLINE" and checking for new steps assigned to the runner after 0 seconds and then every 30 seconds.
[2022-02-23 10:59:34,652] Updating runner state to "ONLINE".
[2022-02-23 10:59:36,012] Setting runner state to executing step.
[2022-02-23 10:59:36,040] Getting step StepId{accountUuid={185b1843-7b9c-4a72-b9cd-039f9a71c2fd}, repositoryUuid={73d0b75b-662c-469d-879d-f300f845acfd}, pipelineUuid={9216ad25-7a8d-4d5b-86bb-31842dcf56a2}, stepUuid={cbc7ff92-35ef-4153-ae0f-d983fd685f8b}}.
[2022-02-23 10:59:36,053] Getting oauth token for step.
[2022-02-23 10:59:36,063] Getting environment variables for step.
[2022-02-23 10:59:36,665] Getting all artifacts for step.
[2022-02-23 10:59:36,682] Getting SSH private key.
[2022-02-23 10:59:36,692] Getting known hosts.
[2022-02-23 10:59:36,936] SSH private key not found
[2022-02-23 10:59:37,008] Setting up directories.
[2022-02-23 10:59:37,018] Starting log uploader.
[2022-02-23 10:59:37,037] Removing container 3e598b2c-cc41-5459-95a1-746449efe490_cbc7ff92-35ef-4153-ae0f-d983fd685f8b_clone
[2022-02-23 10:59:37,059] Removing container 3e598b2c-cc41-5459-95a1-746449efe490_cbc7ff92-35ef-4153-ae0f-d983fd685f8b_build
[2022-02-23 10:59:37,080] Setting up step timeout: PT2H
[2022-02-23 10:59:37,082] Starting websocket listening to STEP_COMPLETED events.
[2022-02-23 10:59:37,084] Checking for step completion every PT30S seconds.
[2022-02-23 10:59:37,301] Updating step progress to PULLING_IMAGES.
[2022-02-23 10:59:37,540] Pulling image
[2022-02-23 10:59:38,045] Pulling image
[2022-02-23 10:59:38,067] Appending log line to main log.
[2022-02-23 10:59:38,205] Pulling image
[2022-02-23 10:59:38,705] Pulling image
[2022-02-23 10:59:39,210] Pulling image
[2022-02-23 10:59:39,937] Removing container 3e598b2c-cc41-5459-95a1-746449efe490_cbc7ff92-35ef-4153-ae0f-d983fd685f8b_pause
[2022-02-23 10:59:39,944] Updating step progress to CLONING.
[2022-02-23 10:59:39,944] Creating container 3e598b2c-cc41-5459-95a1-746449efe490_cbc7ff92-35ef-4153-ae0f-d983fd685f8b_pause.
[2022-02-23 10:59:40,036] Starting container.
[2022-02-23 10:59:40,187] Generating clone script.
[2022-02-23 10:59:40,214] Creating container 3e598b2c-cc41-5459-95a1-746449efe490_cbc7ff92-35ef-4153-ae0f-d983fd685f8b_clone.
[2022-02-23 10:59:40,215] Executing clone script in clone container.
[2022-02-23 10:59:40,265] Starting container.
[2022-02-23 10:59:40,450] Removing container 3e598b2c-cc41-5459-95a1-746449efe490_cbc7ff92-35ef-4153-ae0f-d983fd685f8b_system_auth-proxy
[2022-02-23 10:59:40,455] Removing container 3e598b2c-cc41-5459-95a1-746449efe490_cbc7ff92-35ef-4153-ae0f-d983fd685f8b_system_docker
[2022-02-23 10:59:40,456] Creating container 3e598b2c-cc41-5459-95a1-746449efe490_cbc7ff92-35ef-4153-ae0f-d983fd685f8b_system_auth-proxy.
[2022-02-23 10:59:40,467] Creating container 3e598b2c-cc41-5459-95a1-746449efe490_cbc7ff92-35ef-4153-ae0f-d983fd685f8b_system_docker.
[2022-02-23 10:59:40,573] Starting container.
[2022-02-23 10:59:40,586] Starting container.
[2022-02-23 10:59:40,728] Adding container log: /var/lib/docker/containers/0e8aa4c11d77b48a10a847b791dfeb231b05b8cb68c819edf674640ae5310603/0e8aa4c11d77b48a10a847b791dfeb231b05b8cb68c819edf674640ae5310603-json.log
[2022-02-23 10:59:40,737] Waiting on container to exit.
[2022-02-23 10:59:40,790] Adding container log: /var/lib/docker/containers/9192aae2cc9cb2e2da993d7d6dd9d7c9ac12161eef5321ab6c13e594af88fd8e/9192aae2cc9cb2e2da993d7d6dd9d7c9ac12161eef5321ab6c13e594af88fd8e-json.log
[2022-02-23 10:59:40,796] Waiting on container to exit.
[2022-02-23 10:59:40,804] Adding container log: /var/lib/docker/containers/b8a5a2144407408f2ae65cca4e81818482915c1b96051563f15a9582cdb68455/b8a5a2144407408f2ae65cca4e81818482915c1b96051563f15a9582cdb68455-json.log
[2022-02-23 10:59:40,805] Waiting on container to exit.
[2022-02-23 10:59:40,806] Creating exec into container.
[2022-02-23 10:59:40,930] Starting exec into container and waiting for exec to exit.
[2022-02-23 10:59:41,011] Container has state (exitCode: Some(4), OOMKilled Some(false))
[2022-02-23 10:59:41,039] Removing container 3e598b2c-cc41-5459-95a1-746449efe490_cbc7ff92-35ef-4153-ae0f-d983fd685f8b_build
[2022-02-23 10:59:41,050] Appending log line to log: {34a32cc6-9cec-4c98-95f9-eceec417289b}.
[2022-02-23 10:59:41,050] Not uploading caches. (numberOfCaches: 0, resultOrError: FAILED)
[2022-02-23 10:59:41,053] Updating step progress to UPLOADING_ARTIFACTS.
[2022-02-23 10:59:41,081] Appending log line to log: {dcc487a5-c8fc-4537-819a-45b5f8ee4ec7}.
[2022-02-23 10:59:41,351] Updating step progress to PARSING_TEST_RESULTS.
[2022-02-23 10:59:41,586] Test report processing complete.
[2022-02-23 10:59:41,586] Removing container 3e598b2c-cc41-5459-95a1-746449efe490_cbc7ff92-35ef-4153-ae0f-d983fd685f8b_clone
[2022-02-23 10:59:41,815] Removing container 3e598b2c-cc41-5459-95a1-746449efe490_cbc7ff92-35ef-4153-ae0f-d983fd685f8b_clone
[2022-02-23 10:59:41,822] Removing container 3e598b2c-cc41-5459-95a1-746449efe490_cbc7ff92-35ef-4153-ae0f-d983fd685f8b_build
[2022-02-23 10:59:41,828] Removing container 3e598b2c-cc41-5459-95a1-746449efe490_cbc7ff92-35ef-4153-ae0f-d983fd685f8b_system_auth-proxy
[2022-02-23 10:59:41,993] Removing container 3e598b2c-cc41-5459-95a1-746449efe490_cbc7ff92-35ef-4153-ae0f-d983fd685f8b_system_docker
[2022-02-23 10:59:42,006] Removing container 3e598b2c-cc41-5459-95a1-746449efe490_cbc7ff92-35ef-4153-ae0f-d983fd685f8b_pause
[2022-02-23 10:59:42,058] Appending log line to main log.
[2022-02-23 10:59:42,227] Updating step progress to COMPLETING_LOGS.
[2022-02-23 10:59:42,478] Shutting down log uploader.
[2022-02-23 10:59:42,484] Tearing down directories.
[2022-02-23 10:59:42,488] Cancelling timeout
[2022-02-23 10:59:42,493] Completing step with result Result{status=FAILED, error=None}.
[2022-02-23 10:59:42,766] Setting runner state to not executing step.
[2022-02-23 10:59:42,771] Waiting for next step.


I hope this information helps; if you need more information, I will supply it.

Hi @David Bergevoet ,

Is your runner's host Debian 11?

I had the same error and this workaround fixed it for me: 

Best regards
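For background on this class of failure: a commonly reported cause of odd errors such as "sed: write error" inside containers on Debian-based runner hosts is a host libseccomp or Docker build that predates newer syscalls (e.g. faccessat2) used by recent container images, so the seccomp filter rejects them. This is a hedged diagnostic sketch, not the specific workaround Fabian linked; package names assume a Debian host:

```shell
# Check the host's libseccomp and Docker versions; faccessat2 support
# generally requires libseccomp >= 2.4.4.
dpkg -s libseccomp2 | grep '^Version'
docker version --format '{{.Server.Version}}'

# If either is outdated, upgrading and restarting Docker is a commonly
# reported fix, e.g.:
# sudo apt-get update && sudo apt-get install -y libseccomp2
# sudo systemctl restart docker
```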



Hi @Fabian

Yes, the runner host is indeed Debian 11. I'll try your workaround and check whether it works for me, probably next Monday.

Thank you for suggesting a workaround, and have a good weekend!

Best regards,


Norbert C Atlassian Team Feb 27, 2022

Hi @Fabian 

Thank you for answering David's question.

@David Bergevoet please let us know whether Fabian's suggestion works for you.

Best Regards,
Atlassian Bitbucket Cloud Support


Hi @David Bergevoet ,

Did the workaround fix your issue?

Best regards,


Hi @Norbert C  & @Fabian

This workaround works for me! Sorry for the late response; some things had higher priority than this issue. Thank you both for responding and helping me out.

Best regards,


Norbert C Atlassian Team Mar 07, 2022

Hi David,

Thank you for your reply, I'm glad to hear your issue is resolved.

Thanks @Fabian for your assistance on this issue.

Have a great day!

Best Regards,
Atlassian Bitbucket Cloud Support
