How do I compile Java for Spark?
I want to compile a Java file into a jar that Spark can run. I tried compiling it normally, but it fails with the following error:
java.lang.NoClassDefFoundError: JavaWordCount (wrong name: org/apache/spark/examples/JavaWordCount)
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:361)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at org.apache.spark.util.Utils$.classForName(Utils.scala:229)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:700)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
This is how I compile the Java file:
javac -classpath spark-sql_2.11-2.1.1.jar:spark-core_2.11-2.1.1.jar:scala-compiler-2.11.8.jar:scala-library-2.11.8.jar JavaWordCount.java
This is how I build the jar file:
jar cvf JavaWordCount.jar JavaWordCount*.class
However, when I run spark-submit like this, the error above appears:
spark-submit --class JavaWordCount JavaWordCount.jar README.md
I also tried changing the class to org.apache.spark.examples.JavaWordCount, but it still gives me the same error.
Where did I go wrong? Any suggestions?
PS: I was using the example JavaWordCount from the Spark folder. I solved the problem by running javac with all the Spark jar files on the classpath (only one is actually needed), and then running jar cvf with the Spark jar files again; I was not doing that when I posted the question.