Apache Spark SparkAppHandle state not updated in Kubernetes
Tags: apache-spark, kubernetes, spark-launcher

When launching a Spark application on Kubernetes via SparkLauncher.startApplication(), the state of the returned SparkAppHandle is never updated:
SparkLauncher sparkLaunch = new SparkLauncher()
        .setSparkHome("/root/test/spark-2.4.0-bin-hadoop2.7")
        .setMaster("k8s://https://172.16.23.30:6443")
        .setVerbose(true)
        .addSparkArg("--verbose")
        .setAppResource("local:///opt/spark/examples/jars/spark-examples_2.11-2.4.0.jar")
        .setConf("spark.app.name", "spark-pi")
        .setMainClass("org.apache.spark.examples.SparkPi")
        .setConf("spark.executor.instances", "5")
        .setConf("spark.kubernetes.container.image", "registry.renovite.com/spark:v2")
        .setConf("spark.kubernetes.driver.pod.name", "spark-pi-driver")
        .setConf("spark.kubernetes.container.image.pullSecrets", "dev-registry-key")
        .setConf("spark.kubernetes.authenticate.driver.serviceAccountName", "spark")
        .setDeployMode("cluster");
SparkAppHandle handle = sparkLaunch.startApplication();
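
I also tried attaching a listener when starting the application. This is roughly what that looks like (a minimal sketch; the logging and the CountDownLatch are just for illustration, not my exact code):

import java.util.concurrent.CountDownLatch;
import org.apache.spark.launcher.SparkAppHandle;

CountDownLatch done = new CountDownLatch(1);
SparkAppHandle handle = sparkLaunch.startApplication(new SparkAppHandle.Listener() {
    @Override
    public void stateChanged(SparkAppHandle h) {
        // Expected to fire on SUBMITTED / RUNNING / FINISHED transitions
        System.out.println("state: " + h.getState() + ", appId: " + h.getAppId());
        if (h.getState().isFinal()) {
            done.countDown();
        }
    }

    @Override
    public void infoChanged(SparkAppHandle h) {
        System.out.println("info changed, appId: " + h.getAppId());
    }
});
done.await();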
Observations:
I tried attaching listeners (as in the sketch above), but handle.getState() always returns UNKNOWN, and when the Spark application completes, the state changes to LOST.
The SparkAppHandle itself is not null.
handle.getAppId() is always null.
My best guess is that the communication channel between the launcher's listener and the Spark driver running in Kubernetes is not working properly.