
Apache Spark: SparkPi on Kubernetes - could not find or load main class?


I am trying to launch a standard example on a Kubernetes cluster. spark-submit creates the pod, which then fails with the error "Error: Could not find or load main class org.apache.spark.examples.SparkPi".

The spark-submit command:

spark-submit \
--master k8s://https://k8s-cluster:6443 \
--deploy-mode cluster \
--name spark-pi \
--class org.apache.spark.examples.SparkPi \
--conf spark.kubernetes.namespace=ca-app \
--conf spark.executor.instances=5 \
--conf spark.kubernetes.container.image=gcr.io/cloud-solutions-images/spark:v2.3.0-gcs \
--conf spark.kubernetes.authenticate.driver.serviceAccountName=default \
https://github.com/JWebDev/spark/blob/master/spark-examples_2.11-2.3.1.jar
Kubernetes creates two containers in the pod. The spark-init container log shows the example jar being fetched:

2018-07-22 15:13:35 INFO  SparkPodInitContainer:54 - Downloading remote jars: Some(https://github.com/JWebDev/spark/blob/master/spark-examples_2.11-2.3.1.jar,https://github.com/JWebDev/spark/blob/master/spark-examples_2.11-2.3.1.jar)
2018-07-22 15:13:35 INFO  SparkPodInitContainer:54 - Downloading remote files: None
2018-07-22 15:13:37 INFO  Utils:54 - Fetching https://github.com/JWebDev/spark/blob/master/spark-examples_2.11-2.3.1.jar to /var/spark-data/spark-jars/fetchFileTemp6219129583337519707.tmp
2018-07-22 15:13:37 INFO  Utils:54 - Fetching https://github.com/JWebDev/spark/blob/master/spark-examples_2.11-2.3.1.jar to /var/spark-data/spark-jars/fetchFileTemp8698641635325948552.tmp
2018-07-22 15:13:37 INFO  SparkPodInitContainer:54 - Finished downloading application dependencies.
The Spark Kubernetes driver then throws the error:

+ readarray -t SPARK_JAVA_OPTS
+ '[' -n /var/spark-data/spark-jars/spark-examples_2.11-2.3.1.jar:/var/spark-data/spark-jars/spark-examples_2.11-2.3.1.jar ']'
+ SPARK_CLASSPATH=':/opt/spark/jars/*:/var/spark-data/spark-jars/spark-examples_2.11-2.3.1.jar:/var/spark-data/spark-jars/spark-examples_2.11-2.3.1.jar'
+ '[' -n /var/spark-data/spark-files ']'
+ cp -R /var/spark-data/spark-files/. .
+ case "$SPARK_K8S_CMD" in
+ CMD=(${JAVA_HOME}/bin/java "${SPARK_JAVA_OPTS[@]}" -cp "$SPARK_CLASSPATH" -Xms$SPARK_DRIVER_MEMORY -Xmx$SPARK_DRIVER_MEMORY -Dspark.driver.bindAddress=$SPARK_DRIVER_BIND_ADDRESS $SPARK_DRIVER_CLASS $SPARK_DRIVER_ARGS)
+ exec /sbin/tini -s -- /usr/lib/jvm/java-1.8-openjdk/bin/java -Dspark.app.id=spark-e032bc91fc884e568b777f404bfbdeae -Dspark.kubernetes.container.image=gcr.io/cloud-solutions-images/spark:v2.3.0-gcs -Dspark.kubernetes.namespace=ca-app -Dspark.jars=https://github.com/JWebDev/spark/blob/master/spark-examples_2.11-2.3.1.jar,https://github.com/JWebDev/spark/blob/master/spark-examples_2.11-2.3.1.jar -Dspark.driver.host=spark-pi-11f2cd9133b33fc480a7b2f1d5c2fcc0-driver-svc.ca-app.svc -Dspark.master=k8s://https://k8s-cluster:6443 -Dspark.kubernetes.initContainer.configMapName=spark-pi-11f2cd9133b33fc480a7b2f1d5c2fcc0-init-config -Dspark.kubernetes.authenticate.driver.serviceAccountName=default -Dspark.driver.port=7078 -Dspark.kubernetes.driver.pod.name=spark-pi-11f2cd9133b33fc480a7b2f1d5c2fcc0-driver -Dspark.app.name=spark-pi -Dspark.kubernetes.executor.podNamePrefix=spark-pi-11f2cd9133b33fc480a7b2f1d5c2fcc0 -Dspark.driver.blockManager.port=7079 -Dspark.submit.deployMode=cluster -Dspark.executor.instances=5 -Dspark.kubernetes.initContainer.configMapKey=spark-init.properties -cp ':/opt/spark/jars/*:/var/spark-data/spark-jars/spark-examples_2.11-2.3.1.jar:/var/spark-data/spark-jars/spark-examples_2.11-2.3.1.jar' -Xms1g -Xmx1g -Dspark.driver.bindAddress=10.233.71.5 org.apache.spark.examples.SparkPi
Error: Could not find or load main class org.apache.spark.examples.SparkPi

What am I doing wrong? Thanks for any hints.

I suggest using https://github.com/JWebDev/spark/raw/master/spark-examples_2.11-2.3.1.jar instead, because /blob/ is the HTML view of the asset, while /raw/ issues a 302 redirect to its actual storage URL.
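
For reference, this is the same submission with only the jar URL switched to the /raw/ form; every other flag is taken verbatim from the question, so treat it as a sketch rather than a verified run:

spark-submit \
--master k8s://https://k8s-cluster:6443 \
--deploy-mode cluster \
--name spark-pi \
--class org.apache.spark.examples.SparkPi \
--conf spark.kubernetes.namespace=ca-app \
--conf spark.executor.instances=5 \
--conf spark.kubernetes.container.image=gcr.io/cloud-solutions-images/spark:v2.3.0-gcs \
--conf spark.kubernetes.authenticate.driver.serviceAccountName=default \
https://github.com/JWebDev/spark/raw/master/spark-examples_2.11-2.3.1.jar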

Thank you very much! Yes, that was the mistake. You saved me from losing many more hours hunting for it.
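
As a side note, a quick way to check what a jar URL actually serves before handing it to spark-submit is curl; this is a sketch using the URLs from the question and assumes the repository is still reachable:

# /blob/ returns GitHub's HTML page for the file, not the jar itself
curl -sI https://github.com/JWebDev/spark/blob/master/spark-examples_2.11-2.3.1.jar | grep -i '^content-type'

# /raw/ answers with a 302 redirect to the stored file; -L follows the redirect
curl -sIL https://github.com/JWebDev/spark/raw/master/spark-examples_2.11-2.3.1.jar | grep -iE '^(HTTP|location|content-type)'

# Downloading through /raw/ should yield a real jar whose contents unzip can list
curl -sL -o spark-examples_2.11-2.3.1.jar \
  https://github.com/JWebDev/spark/raw/master/spark-examples_2.11-2.3.1.jar
unzip -l spark-examples_2.11-2.3.1.jar | head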