Apache Spark: SparkAppHandle state not updated in Kubernetes


When a Spark application is launched via SparkLauncher.startApplication(), the state of the returned SparkAppHandle is never updated.

SparkLauncher sparkLaunch = new SparkLauncher()
        .setSparkHome("/root/test/spark-2.4.0-bin-hadoop2.7")
        .setMaster("k8s://https://172.16.23.30:6443")
        .setVerbose(true)
        .addSparkArg("--verbose")
        .setAppResource("local:///opt/spark/examples/jars/spark-examples_2.11-2.4.0.jar")
        .setConf("spark.app.name", "spark-pi")
        .setMainClass("org.apache.spark.examples.SparkPi")
        .setConf("spark.executor.instances", "5")
        .setConf("spark.kubernetes.container.image", "registry.renovite.com/spark:v2")
        .setConf("spark.kubernetes.driver.pod.name", "spark-pi-driver")
        .setConf("spark.kubernetes.container.image.pullSecrets", "dev-registry-key")
        .setConf("spark.kubernetes.authenticate.driver.serviceAccountName", "spark")
        .setDeployMode("cluster");

SparkAppHandle handle = sparkLaunch.startApplication();

Observations:

handle.getState() always returns UNKNOWN, and when the Spark application completes, the state changes to LOST. I tried attaching listeners as well, with the same result.
SparkAppHandle is not null.
handle.getAppId() is always null.

My best guess is that the communication between the launcher and the Spark driver running in Kubernetes is not working properly.
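For reference, a minimal sketch of how a listener can be attached at launch time so state changes are pushed to the launcher rather than polled. The master URL, paths, image, and class names are copied from the snippet above; the surrounding class and variable names are illustrative, and this requires the spark-launcher dependency plus access to the cluster to actually run:

```java
import org.apache.spark.launcher.SparkAppHandle;
import org.apache.spark.launcher.SparkLauncher;

import java.util.concurrent.CountDownLatch;

public class LauncherStateDemo {
    public static void main(String[] args) throws Exception {
        // Released when the handle reports a terminal state (FINISHED, FAILED, KILLED, LOST).
        CountDownLatch done = new CountDownLatch(1);

        SparkAppHandle handle = new SparkLauncher()
                .setSparkHome("/root/test/spark-2.4.0-bin-hadoop2.7")
                .setMaster("k8s://https://172.16.23.30:6443")
                .setDeployMode("cluster")
                .setAppResource("local:///opt/spark/examples/jars/spark-examples_2.11-2.4.0.jar")
                .setMainClass("org.apache.spark.examples.SparkPi")
                .setConf("spark.kubernetes.container.image", "registry.renovite.com/spark:v2")
                .setConf("spark.kubernetes.authenticate.driver.serviceAccountName", "spark")
                // Listener callbacks fire on the launcher side whenever the
                // driver reports a state or info change back to the launcher.
                .startApplication(new SparkAppHandle.Listener() {
                    @Override
                    public void stateChanged(SparkAppHandle h) {
                        System.out.println("state: " + h.getState()
                                + ", appId: " + h.getAppId());
                        if (h.getState().isFinal()) {
                            done.countDown();
                        }
                    }

                    @Override
                    public void infoChanged(SparkAppHandle h) {
                        System.out.println("info changed, appId: " + h.getAppId());
                    }
                });

        done.await(); // block until a terminal state is reported
    }
}
```

If stateChanged only ever fires with UNKNOWN and finally LOST, that is consistent with the driver never connecting back to the launcher process.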