
Scala: How do I handle a runtime dependency conflict with SBT?

Tags: scala, apache-spark, sbt, runtime-error, sbt-assembly

Let me put this in context.

I was running a single line of code inside a larger program (Spark/Scala).

That line caused the following exception:

15/10/19 19:33:22 INFO impl.AMRMClientImpl: Waiting for application to be successfully unregistered.
Exception in thread "Driver" java.lang.reflect.InvocationTargetException
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:162)
Caused by: java.lang.NoClassDefFoundError: org/apache/spark/rdd/RDD$
        at Job$.main(Job.scala:55)
        at Job.main(Job.scala)
        ... 5 more
Caused by: java.lang.ClassNotFoundException: org.apache.spark.rdd.RDD$
        at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
        ... 7 more
15/10/19 19:33:22 INFO yarn.ApplicationMaster$$anon$1: Invoking sc stop from shutdown hook
In the build.sbt file I have the dependencies that cover org/apache/spark/rdd/RDD, at the same version as the jars in jar_home:

libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "1.1.0-cdh5.2.1" intransitive()

libraryDependencies += "org.apache.spark" % "spark-assembly_2.10" % "1.3.0-cdh5.4.0" intransitive()
As I understand it, intransitive() adds the dependency for compilation only, so that execution can use the jars in jar_home. I have also tried "compile" and "provided" in place of intransitive(), but the same exception is still thrown. Some research suggested shading (assuming, of course, that this really is a dependency conflict), but the sbt 0.13.x I am using does not support shading strategies. Another suggestion was to modify my application so that it depends on the same versions of third-party libraries as Spark; I am fairly sure I did that as well.


So I am out of options now. Can you help me solve this?
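One thing that stands out in the two lines above: they pin different Spark versions (1.1.0-cdh5.2.1 for spark-core and 1.3.0-cdh5.4.0 for spark-assembly), and mixing Spark versions on one classpath can by itself produce exactly this kind of NoClassDefFoundError. A minimal build.sbt sketch that aligns the versions and uses "provided" instead of intransitive(); whether the cluster actually runs 1.1.0-cdh5.2.1 is an assumption here:

```scala
// build.sbt sketch. The version string is taken from the spark-core line in
// the question; it must match whatever Spark the cluster actually runs.
scalaVersion := "2.10.4"

// "provided": on the compile classpath, but excluded from the packaged
// assembly, so the cluster's own Spark jars are the ones loaded at runtime.
libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "1.1.0-cdh5.2.1" % "provided"
```

With "provided" there is no need to also depend on spark-assembly; that artifact is the bundle the cluster itself ships, not something an application normally declares.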

I don't know intransitive(), but I believe "provided" is what is required here. So you need:

libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "1.1.0-cdh5.2.1" % "provided"

I have also never seen spark-assembly used as a dependency.

Comment: I used provided and compile. Maybe it is spark-assembly and spark-core that are conflicting. I will take the assembly out and try again. Thank you.

Comment: It did not work for me, @mehmetminanc:
libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "1.1.0-cdh5.2.1" intransitive()

libraryDependencies += "org.apache.spark" % "spark-assembly_2.10" % "1.3.0-cdh5.4.0" intransitive()
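Since the question is tagged sbt-assembly: if the remaining problem is duplicate files from overlapping jars when building the fat jar, a merge strategy can be declared. A sketch in the sbt 0.13-era sbt-assembly syntax; the specific rules below are illustrative defaults, not taken from the original post:

```scala
// build.sbt fragment for sbt-assembly (0.13-era key syntax).
// Drop META-INF signature/manifest files that commonly collide, and
// fall back to taking the first copy of any other duplicated path.
assemblyMergeStrategy in assembly := {
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard
  case _                             => MergeStrategy.first
}
```

MergeStrategy.first silently picks one copy of a conflicting class, which hides version mismatches rather than fixing them; aligning the Spark versions in libraryDependencies is still the real fix.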