We're struggling to get a Docker image working in Bitbucket Pipelines; the image needs Hadoop HDFS on it, which our unit tests require.
The simplest start for us was to test that we could build a custom image with Maven 3.6.3 and JDK 8 (matching our dev Maven and prod JDK) and have Pipelines run the build with it - that worked fine.
Then we added HDFS to the image in a single-node configuration (one node is all the unit tests need). The image performs the basic HDFS install; the namenode format and start-dfs commands are run from the pipeline step.
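For reference, the single-node configuration amounts to something like the following (a sketch, not our exact files - the property values are assumptions based on the localhost:9000 address the tests expect, with replication set to 1 since there is only one node):

```xml
<!-- core-site.xml: single-node sketch; values are assumptions -->
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>

<!-- hdfs-site.xml: one node, so no replication -->
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
</configuration>
```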
It all seems to go well, but after the DFS is started we can see (using jps) that the NameNode isn't running as we would expect. The build then starts, and the first test fails because it can't reach the HDFS server at localhost:9000.
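One thing we could try between start-dfs and the build is failing fast with the NameNode's own log, since a NameNode that exits silently usually only explains itself there. A minimal sketch of a readiness check (the port comes from the error above; the log path is an assumption - adjust for your HADOOP_HOME):

```shell
# Sketch: poll until a TCP port answers, so the pipeline step can dump the
# NameNode log and fail fast instead of letting the first unit test discover
# that hdfs://localhost:9000 is unreachable.
wait_for_port() {
  # Poll until $1:$2 accepts TCP connections, up to $3 seconds (default 30).
  local host=$1 port=$2 timeout=${3:-30}
  for _ in $(seq "$timeout"); do
    # bash's /dev/tcp pseudo-device; use `nc -z` instead if the image lacks bash
    (exec 3<>"/dev/tcp/$host/$port") 2>/dev/null && { exec 3>&- 3<&-; return 0; }
    sleep 1
  done
  return 1
}

# Example usage after start-dfs.sh, before `mvn test` (log path is an assumption):
# wait_for_port localhost 9000 60 \
#   || { tail -n 50 "$HADOOP_HOME"/logs/hadoop-*-namenode-*.log; exit 1; }
```

In our case jps already shows the NameNode missing, so the interesting part is whatever the tail of that log prints in the Pipelines environment versus locally.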
There are no memory warnings, network warnings, or anything else we might expect to see. The Docker image runs fine on local machines.
We have experimented with moving the start command from bitbucket-pipelines.yml into the image itself, but get the same result.
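For anyone wanting to reproduce, the pipeline step looks roughly like this (a sketch - the image name is a placeholder, not our real registry):

```yaml
# bitbucket-pipelines.yml -- sketch; image name is a placeholder
image: our-registry/maven-jdk8-hdfs:latest

pipelines:
  default:
    - step:
        script:
          - hdfs namenode -format -force
          - start-dfs.sh
          - jps        # NameNode is missing here when run in Pipelines
          - mvn -B test
```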
We're open to ideas for deploying HDFS in different ways - we started with this approach as it seemed (at the time) the simplest way of achieving what we want.
Any assistance greatly appreciated, especially pointers to relevant documentation!