Apache Spark java.lang.ClassNotFoundException: org.apache.spark.examples.SparkPi

I am trying to run the Spark example from the Linux shell. My Spark version is spark-2.2.3-bin-hadoop2.7, and my Spark cluster is running, so I don't understand what is wrong. Here is my command:

./bin/spark-submit --class org.apache.spark.examples.SparkPi --master spark://node01:7077,node02:7077,node03:7077 --executor-memory 1G --total-executor-cores 2 /export/servers/spark-2.2.3-bin-hadoop2.7/examples/jars/spark-examples_2.11-2.2.3.jar 100

java.lang.ClassNotFoundException: org.apache.spark.examples.SparkPi
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:348)
    at org.apache.spark.util.Utils$.classForName(Utils.scala:233)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:732)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:119)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
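
If the jar path or the archive itself were the problem, the class would simply be missing from it. As a quick sanity check, the class can be listed directly from the jar used in the command above (this assumes the JDK's jar tool is on the PATH; unzip -l would work as well):

jar tf /export/servers/spark-2.2.3-bin-hadoop2.7/examples/jars/spark-examples_2.11-2.2.3.jar | grep SparkPi
# a healthy jar prints entries such as org/apache/spark/examples/SparkPi.class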

I found the problem! The owner and group of my Spark installation directory were not root. After changing them to root, I was able to run it successfully.
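
For reference, a minimal sketch of the fix described above, assuming the installation path used in the spark-submit command; restarting the standalone daemons afterwards is an extra precaution, not something the original post mentions:

# give the Spark installation to root, recursively
sudo chown -R root:root /export/servers/spark-2.2.3-bin-hadoop2.7
# restart the standalone cluster so the master and workers pick up the change
/export/servers/spark-2.2.3-bin-hadoop2.7/sbin/stop-all.sh
/export/servers/spark-2.2.3-bin-hadoop2.7/sbin/start-all.sh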