
java.lang.NoClassDefFoundError:scala/collection/GenTraversableOnce


I am trying to run Spark Streaming with Kafka. I am using Scala 2.11.8 and Spark 2.1.0, which is built against Scala 2.11.8. I know the problem looks like a Scala version mismatch, but all dependencies are added with the correct version (see attached pic), and I still get this error:

```
Exception in thread "main" java.lang.NoClassDefFoundError: scala/collection/GenTraversableOnce$class
    at kafka.utils.Pool.<init>(Unknown Source)
    at kafka.consumer.FetchRequestAndResponseStatsRegistry$.<init>(Unknown Source)
    at kafka.consumer.FetchRequestAndResponseStatsRegistry$.<clinit>(Unknown Source)
    at kafka.consumer.SimpleConsumer.<init>(Unknown Source)
    at org.apache.spark.streaming.kafka.KafkaCluster.connect(KafkaCluster.scala:59)
    at org.apache.spark.streaming.kafka.KafkaCluster$$anonfun$org$apache$spark$streaming$kafka$KafkaCluster$$withBrokers$1.apply(KafkaCluster.scala:364)
    at org.apache.spark.streaming.kafka.KafkaCluster$$anonfun$org$apache$spark$streaming$kafka$KafkaCluster$$withBrokers$1.apply(KafkaCluster.scala:361)
    at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
    at scala.collection.mutable.WrappedArray.foreach(WrappedArray.scala:35)
    at org.apache.spark.streaming.kafka.KafkaCluster.org$apache$spark$streaming$kafka$KafkaCluster$$withBrokers(KafkaCluster.scala:361)
    at org.apache.spark.streaming.kafka.KafkaCluster.getPartitionMetadata(KafkaCluster.scala:132)
    at org.apache.spark.streaming.kafka.KafkaCluster.getPartitions(KafkaCluster.scala:119)
    at org.apache.spark.streaming.kafka.KafkaUtils$.getFromOffsets(KafkaUtils.scala:211)
    at org.apache.spark.streaming.kafka.KafkaUtils$.createDirectStream(KafkaUtils.scala:484)
    at org.apache.spark.streaming.kafka.KafkaUtils$.createDirectStream(KafkaUtils.scala:607)
    at com.forrester.streaming.kafka.App$.main(App.scala:19)
    at com.forrester.streaming.kafka.App.main(App.scala)
```
The error message is ClassNotFoundException: scala.collection.GenTraversableOnce$class


Case 1 works, but case 5 fails even though it should not raise any error.

What build tool are you using, mvn or sbt? Check the version of your `org.scala-lang` dependency. — I am using Maven version 3.5.0. — Then what does your POM file look like? Check this link, adjust your POM.xml accordingly and try again, or post your POM.xml contents here. The first line of the error suggests this is caused by a Scala version incompatibility, so you should probably double-check the `org.scala-lang` version in your POM file.
```xml
<dependencies>
  <dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-library</artifactId>
    <version>2.11.8</version>
    <scope>provided</scope>
  </dependency>

  <dependency>
    <groupId>com.koverse</groupId>
    <artifactId>koverse-shaded-deps</artifactId>
    <version>${koverse.version}</version>
    <scope>provided</scope>
  </dependency>

  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-mllib_2.11</artifactId>
    <version>2.1.0</version>
    <exclusions>
      <exclusion>
        <groupId>*</groupId>
        <artifactId>*</artifactId>
      </exclusion>
    </exclusions>
  </dependency>

  <dependency>
    <groupId>org.scalanlp</groupId>
    <artifactId>breeze_2.11</artifactId>
    <version>0.11.2</version>
  </dependency>

  <dependency>
    <groupId>org.xerial.snappy</groupId>
    <artifactId>snappy-java</artifactId>
    <version>1.0.5</version>
  </dependency>

  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-hive_2.11</artifactId>
    <version>2.1.0</version>
    <scope>test</scope>
  </dependency>

  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming_2.11</artifactId>
    <version>2.1.0</version>
  </dependency>

  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming-kafka-0-8-assembly_2.11</artifactId>
    <version>2.1.0</version>
  </dependency>

  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.11</artifactId>
    <version>2.1.0</version>
  </dependency>
</dependencies>
```
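The comments above boil down to one mechanical check: every Scala artifact on the classpath must carry the same `_2.x` binary-version suffix in its artifactId. A minimal sketch of that check (the `ScalaBinaryVersion` class and its helpers are made up for illustration; a real project would run `mvn dependency:tree` and inspect the suffixes by hand):

```java
import java.util.List;
import java.util.Optional;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class ScalaBinaryVersion {
    // Scala-built artifacts encode the Scala binary version as a
    // trailing "_2.x" suffix, e.g. spark-streaming_2.11.
    static final Pattern SUFFIX = Pattern.compile("_(\\d+\\.\\d+)$");

    // Extracts the Scala binary version from an artifactId, if any.
    static Optional<String> binaryVersionOf(String artifactId) {
        Matcher m = SUFFIX.matcher(artifactId);
        return m.find() ? Optional.of(m.group(1)) : Optional.empty();
    }

    // True when all Scala-suffixed artifacts agree on one binary version.
    static boolean allConsistent(List<String> artifactIds) {
        return artifactIds.stream()
                .map(ScalaBinaryVersion::binaryVersionOf)
                .filter(Optional::isPresent)
                .map(Optional::get)
                .distinct()
                .count() <= 1;
    }

    public static void main(String[] args) {
        // A 2.10 assembly jar mixed into a 2.11 build: inconsistent.
        List<String> deps = List.of(
                "spark-streaming_2.11",
                "spark-streaming-kafka-0-8-assembly_2.10");
        System.out.println(allConsistent(deps));
    }
}
```

Mixing a single `_2.10` jar into an otherwise `_2.11` classpath is exactly what produces `NoClassDefFoundError` for names like `GenTraversableOnce$class`, since those trait implementation classes differ between Scala binary versions.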
| Case | Spark build on Scala | Kafka jar                                         | Result |
| ---- | -------------------- | ------------------------------------------------- | ------ |
| 1    | 2.1.1 on 2.11.8      | spark-streaming-kafka-0-8-assembly_2.11-2.1.1.jar | **Working** |
| 2    | 2.1.1 on 2.11.8      | spark-streaming-kafka-0-8-assembly_2.10-2.1.1.jar | Error, as expected |
| 3    | 2.1.1 on 2.11.8      | spark-streaming-kafka-0-8-assembly_2.10-2.1.0.jar | Error, as expected |
| 4    | 2.1.0 on 2.11.8      | spark-streaming-kafka-0-8-assembly_2.10-2.1.0.jar | Error, as expected |
| 5    | 2.1.0 on 2.11.8      | spark-streaming-kafka-0-8-assembly_2.11-2.1.0.jar | **Error: ideally should pass** |
| 6    | 2.1.0 on 2.11.8      | spark-streaming-kafka-0-8-assembly_2.11-2.1.1.jar | Error, as expected |
| 7    | 2.1.0 on 2.11.8      | spark-streaming-kafka-0-8-assembly_2.10-2.1.0.jar | Error, as expected |
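One diagnostic that may explain a case like 5: an assembly (fat) jar can bundle its own copy of the Scala library classes, and if those were compiled for a different binary version they shadow whatever `scala-library` the runtime provides, regardless of what the POM says. A hedged sketch of that check (`bundlesScalaLibrary` is a made-up helper; the demo builds a throwaway jar instead of opening a real Spark assembly):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.jar.JarEntry;
import java.util.jar.JarFile;
import java.util.jar.JarOutputStream;

public class JarScalaCheck {
    // True if the jar ships its own scala.collection classes, meaning
    // it can shadow the scala-library version on the classpath.
    static boolean bundlesScalaLibrary(Path jar) throws IOException {
        try (JarFile jf = new JarFile(jar.toFile())) {
            return jf.stream()
                     .anyMatch(e -> e.getName().startsWith("scala/collection/"));
        }
    }

    public static void main(String[] args) throws IOException {
        // Demo only: create a throwaway jar with a fake Scala entry.
        Path tmp = Files.createTempFile("demo", ".jar");
        try (JarOutputStream out =
                 new JarOutputStream(Files.newOutputStream(tmp))) {
            out.putNextEntry(new JarEntry("scala/collection/GenTraversableOnce.class"));
            out.closeEntry();
        }
        System.out.println(bundlesScalaLibrary(tmp));
        Files.delete(tmp);
    }
}
```

Pointing `bundlesScalaLibrary` at the actual `spark-streaming-kafka-0-8-assembly` jar from the failing case would show whether it carries shaded Scala classes that conflict with the 2.11.8 runtime.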