ERROR yarn.ApplicationMaster: User class threw exception: java.lang.NoClassDefFoundError: scala/Function0$class

I am trying to submit a Spark job to a Hadoop YARN cluster through Apache Livy. The cluster was set up following the specified steps.

The Java code is run from IntelliJ on a local Windows machine; the Spark and Hadoop cluster runs on Linux servers. Other applications (without Livy) work fine against HDFS and with Spark computations.

This is the error log I see in the application's stderr on the cluster:

INFO yarn.ApplicationMaster: Waiting for spark context initialization...
INFO driver.RSCDriver: Connecting to: master:10000
INFO driver.RSCDriver: Starting RPC server...
INFO rpc.RpcServer: Connected to the port 10001
WARN rsc.RSCConf: Your hostname, master, resolves to a loopback address, but we couldn't find any external IP address!
WARN rsc.RSCConf: Set livy.rsc.rpc.server.address if you need to bind to another address.
INFO driver.RSCDriver: Received job request 37e4684d-9de2-4a4b-9506-0b10a3e78a51
INFO driver.RSCDriver: SparkContext not yet up, queueing job request.
ERROR yarn.ApplicationMaster: User class threw exception: java.lang.NoClassDefFoundError: scala/Function0$class
java.lang.NoClassDefFoundError: scala/Function0$class
    at org.apache.livy.shaded.json4s.ThreadLocal.<init>(Formats.scala:311)
    at org.apache.livy.shaded.json4s.DefaultFormats$class.$init$(Formats.scala:318)
    at org.apache.livy.shaded.json4s.DefaultFormats$.<init>(Formats.scala:296)
    at org.apache.livy.shaded.json4s.DefaultFormats$.<clinit>(Formats.scala)
    at org.apache.livy.repl.Session.<init>(Session.scala:66)
    at org.apache.livy.repl.ReplDriver.initializeSparkEntries(ReplDriver.scala:41)
    at org.apache.livy.rsc.driver.RSCDriver.run(RSCDriver.java:333)
    at org.apache.livy.rsc.driver.RSCDriverBootstrapper.main(RSCDriverBootstrapper.java:93)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:684)
Caused by: java.lang.ClassNotFoundException: scala.Function0$class
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    ... 13 more
INFO yarn.ApplicationMaster: Final app status: FAILED, exitCode: 13, (reason: User class threw exception: java.lang.NoClassDefFoundError: scala/Function0$class
    at org.apache.livy.shaded.json4s.ThreadLocal.<init>(Formats.scala:311)
    at org.apache.livy.shaded.json4s.DefaultFormats$class.$init$(Formats.scala:318)
    at org.apache.livy.shaded.json4s.DefaultFormats$.<init>(Formats.scala:296)
    at org.apache.livy.shaded.json4s.DefaultFormats$.<clinit>(Formats.scala)
    at org.apache.livy.repl.Session.<init>(Session.scala:66)
    at org.apache.livy.repl.ReplDriver.initializeSparkEntries(ReplDriver.scala:41)
    at org.apache.livy.rsc.driver.RSCDriver.run(RSCDriver.java:333)
    at org.apache.livy.rsc.driver.RSCDriverBootstrapper.main(RSCDriverBootstrapper.java:93)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:684)
Caused by: java.lang.ClassNotFoundException: scala.Function0$class
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    ... 13 more
)
The livy.conf file has:

# What spark master Livy sessions should use.
livy.spark.master = yarn
# What spark deploy mode Livy sessions should use.
livy.spark.deployMode = cluster

Could you point out what I might be missing?
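For context, submission from Java through Livy's programmatic client roughly follows the sketch below. The Livy URL, jar path, and job body are placeholders for illustration, not taken from the question; the class and method names are from Livy's Java client API:

```java
import java.io.File;
import java.net.URI;
import org.apache.livy.Job;
import org.apache.livy.JobContext;
import org.apache.livy.LivyClient;
import org.apache.livy.LivyClientBuilder;

public class LivySubmitSketch {
    public static void main(String[] args) throws Exception {
        // Placeholder Livy server URL.
        LivyClient client = new LivyClientBuilder()
                .setURI(new URI("http://master:8998"))
                .build();
        try {
            // Ship the application jar into the Livy session first.
            client.uploadJar(new File("/path/to/app.jar")).get();

            // Submit a trivial job; it is serialized and runs on the cluster.
            long n = client.submit(new Job<Long>() {
                @Override
                public Long call(JobContext jc) {
                    return jc.sc()
                             .parallelize(java.util.Arrays.asList(1, 2, 3))
                             .count();
                }
            }).get();
            System.out.println("count = " + n);
        } finally {
            client.stop(true);
        }
    }
}
```

The failure in the log above happens on the cluster side while Livy spins up its driver, before any submitted job runs, so the client code itself never gets a result back.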

Livy seems to support only Spark versions built against Scala 2.11.x.

Change your

client.addJar(…

line to include the Scala 2.11 artifacts and a Spark distribution built against 2.11.
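The missing class name itself points at the root cause. Scala 2.11 compiles trait method bodies into a synthetic "TraitName$class" helper class (e.g. scala.Function0$class); Scala 2.12 switched to Java 8 default methods and dropped those helpers. Livy's shaded json4s was compiled against 2.11 and references the helper, hence the NoClassDefFoundError on a 2.12 Spark build. A standalone probe (plain Java, compiles without Scala on the classpath) reports which encoding the current classpath carries:

```java
public class ScalaTraitEncodingProbe {
    // In Scala 2.11, trait implementations live in "TraitName$class"
    // helpers; Scala 2.12+ uses default methods and no longer ships them.
    static String traitEncoding() {
        try {
            Class.forName("scala.Function0$class");
            return "2.11-style";
        } catch (ClassNotFoundException e) {
            return "2.12+-style (or no Scala on the classpath)";
        }
    }

    public static void main(String[] args) {
        System.out.println("Trait encoding: " + traitEncoding());
    }
}
```

Running this inside the failing driver's classpath would report the 2.12+-style encoding, matching the error.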

You are right; I was trying Spark 2.4.2 built with Scala 2.12. I should have checked whether this was a support issue. So I guess we will have to stay on a Scala 2.11 build with Livy until they support Scala 2.12.
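When moving back to a 2.11 build, a small helper (hypothetical, shown only as a sketch) can sanity-check that every artifact handed to client.addJar(...) carries the _2.11 binary suffix; conventionally named jars embed it as artifact_2.11-version.jar:

```java
public class ScalaSuffixCheck {
    // Return the Scala binary version ("2.11", "2.12", ...) encoded in a
    // conventionally named artifact like "spark-core_2.11-2.4.2.jar",
    // or null when the name carries no suffix (e.g. scala-library itself).
    static String scalaBinaryVersion(String jarName) {
        String base = jarName.endsWith(".jar")
                ? jarName.substring(0, jarName.length() - 4)
                : jarName;
        int i = base.lastIndexOf("_2.");
        if (i < 0) {
            return null;
        }
        int end = base.indexOf('-', i);
        if (end < 0) {
            end = base.length();
        }
        return base.substring(i + 1, end);
    }

    public static void main(String[] args) {
        System.out.println(scalaBinaryVersion("spark-core_2.11-2.4.2.jar")); // 2.11
        System.out.println(scalaBinaryVersion("spark-core_2.12-2.4.2.jar")); // 2.12
        System.out.println(scalaBinaryVersion("scala-library-2.11.12.jar")); // null
    }
}
```

Rejecting anything that does not report "2.11" before submission catches a mixed classpath early, instead of at driver startup on the cluster.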