Scala UnsatisfiedLinkError: no snappyjava in java.library.path when running Spark MLLib unit tests in IntelliJ

When running Spark unit tests that require snappy compression, the following exception is thrown:

java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.xerial.snappy.SnappyLoader.loadNativeLibrary(SnappyLoader.java:317)
    at org.xerial.snappy.SnappyLoader.load(SnappyLoader.java:219)
    at org.xerial.snappy.Snappy.<clinit>(Snappy.java:44)
    at org.apache.spark.io.SnappyCompressionCodec.<init>(CompressionCodec.scala:150)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
    at org.apache.spark.io.CompressionCodec$.createCodec(CompressionCodec.scala:68)
    at org.apache.spark.io.CompressionCodec$.createCodec(CompressionCodec.scala:60)
    at org.apache.spark.broadcast.TorrentBroadcast.org$apache$spark$broadcast$TorrentBroadcast$$setConf(TorrentBroadcast.scala:73)
    at org.apache.spark.broadcast.TorrentBroadcast.<init>(TorrentBroadcast.scala:79)
    at org.apache.spark.broadcast.TorrentBroadcastFactory.newBroadcast(TorrentBroadcastFactory.scala:34)
    at org.apache.spark.broadcast.BroadcastManager.newBroadcast(BroadcastManager.scala:62)
    at org.apache.spark.SparkContext.broadcast(SparkContext.scala:1077)
    at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$submitMissingTasks(DAGScheduler.scala:849)
    at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$submitStage(DAGScheduler.scala:790)
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$org$apache$spark$scheduler$DAGScheduler$$submitStage$4.apply(DAGScheduler.scala:793)
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$org$apache$spark$scheduler$DAGScheduler$$submitStage$4.apply(DAGScheduler.scala:792)
    at scala.collection.immutable.List.foreach(List.scala:318)
    at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$submitStage(DAGScheduler.scala:792)
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$org$apache$spark$scheduler$DAGScheduler$$submitStage$4.apply(DAGScheduler.scala:793)
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$org$apache$spark$scheduler$DAGScheduler$$submitStage$4.apply(DAGScheduler.scala:792)
    at scala.collection.immutable.List.foreach(List.scala:318)
    at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$submitStage(DAGScheduler.scala:792)
    at org.apache.spark.scheduler.DAGScheduler.handleJobSubmitted(DAGScheduler.scala:774)
    at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1393)
    at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1385)
    at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
Caused by: java.lang.UnsatisfiedLinkError: no snappyjava in java.library.path
    at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1878)
    at java.lang.Runtime.loadLibrary0(Runtime.java:849)
    at java.lang.System.loadLibrary(System.java:1087)
    at org.xerial.snappy.SnappyNativeLoader.loadLibrary(SnappyNativeLoader.java:52)
    ... 33 more

What settings or changes are needed to resolve this?

One way to handle this is to update the IntelliJ run configuration. Add the following to the JVM arguments:

-Dorg.xerial.snappy.lib.name=libsnappyjava.jnilib -Dorg.xerial.snappy.tempdir=/tmp 
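If you'd rather not edit every run configuration by hand, the same system properties can also be set programmatically before any Spark or snappy class is loaded, e.g. in a test suite's setup. This is a minimal sketch (the class name is illustrative); the property values assume the macOS `.jnilib` naming used above:

```java
public class SnappyTestSetup {
    // Set the snappy-java native-library overrides before any Spark/snappy
    // class is loaded; once SnappyLoader has run, changing them has no effect.
    public static void configureSnappy() {
        System.setProperty("org.xerial.snappy.lib.name", "libsnappyjava.jnilib");
        System.setProperty("org.xerial.snappy.tempdir", "/tmp");
    }

    public static void main(String[] args) {
        configureSnappy();
        System.out.println(System.getProperty("org.xerial.snappy.lib.name"));
    }
}
```

Call `configureSnappy()` from a `@BeforeClass` (or ScalaTest `beforeAll`) hook so it runs before the first `SparkContext` is constructed.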

Another solution is to upgrade the snappy-java version. The issue is present in 1.0.4.1 but was fixed in 1.0.5. Add an exclusion to the Spark dependency, like:

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>${spark.version}</version>
    <exclusions>
        <exclusion>
           <groupId>org.xerial.snappy</groupId>
           <artifactId>snappy-java</artifactId>
        </exclusion>
    </exclusions>
</dependency>

and then add:

<dependency>
    <groupId>org.xerial.snappy</groupId>
    <artifactId>snappy-java</artifactId>
    <version>1.0.5</version>
</dependency>


That did it for me.

I ran into the same error. My spark-core version was 1.3.0-cdh5.4.3.

Once I changed it to 1.3.0, that fixed it.

Note that the dependency is scoped as "provided", so this doesn't matter in production; it only affects development machines.

Edit: I found a more reasonable solution. The problem is caused by a snappy compression bug in Java on OSX. To work around it, you can add the following to your pom file:

<dependency>
    <groupId>org.xerial.snappy</groupId>
    <artifactId>snappy-java</artifactId>
    <version>1.1.2</version>
    <type>jar</type>
    <scope>provided</scope>
</dependency>


I hit this problem with a clean standalone install of Spark 1.6.1. To resolve it, I had to:

1) manually add libsnappyjava.jnilib (which is inside the jar) to java.library.path (which includes several locations; ~/Library/Java/Extensions/ works)

2) add snappy-java-1.1.2.4.jar to Spark's classpath (in spark-env.sh, add
"export SPARK_CLASSPATH=.../snappy-java-1.1.2.4.jar")
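The spark-env.sh change in step 2 might look like the following. The jar path here is illustrative and assumes a default Maven local repository; adjust it to wherever the artifact actually lives on your machine:

```shell
# conf/spark-env.sh -- prepend the fixed snappy-java jar to Spark's classpath.
# The repository path below is an example; adapt it to your setup.
export SPARK_CLASSPATH="$HOME/.m2/repository/org/xerial/snappy/snappy-java/1.1.2.4/snappy-java-1.1.2.4.jar:$SPARK_CLASSPATH"
```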

Is it 1.0.5 or 1.5.0 — your answer seems inconsistent. Otherwise a good answer.

@javadba good catch, I made a typo. It should be 1.0.5, the next release after 1.0.4.1.

I got the same error running a mahout spark-itemsimilarity job on a Mac. The following configuration worked for me: I copied the snappy native library to /Library/Java/Extensions, then set the environment variable as export MAHOUT_OPTS="-Dorg.xerial.snappy.lib.path=/Library/Java/Extensions -Dorg.xerial.snappy.lib.name=libsnappyjava.jnilib -Dorg.xerial.snappy.tempdir=~/tmp"

@Sujee thanks for the mahout feedback. I only needed to add one line to mvn test: -Dorg.xerial.snappy.lib.name=libsnappyjava.jnilib @avgolubev nice.