Scala: ExceptionInInitializerError with Spark Streaming Kafka

Tags: scala, apache-spark, apache-kafka, spark-streaming-kafka

I am trying to connect Spark Streaming to Kafka in a simple application, which I built from the example in the Spark documentation. When I try to run it, I get the following exception:

Exception in thread "main" java.lang.ExceptionInInitializerError
    at org.apache.spark.streaming.dstream.InputDStream.<init>(InputDStream.scala:80)
    at org.apache.spark.streaming.kafka010.DirectKafkaInputDStream.<init>(DirectKafkaInputDStream.scala:59)
    at org.apache.spark.streaming.kafka010.KafkaUtils$.createDirectStream(KafkaUtils.scala:147)
    at org.apache.spark.streaming.kafka010.KafkaUtils$.createDirectStream(KafkaUtils.scala:124)
    at producer.KafkaProducer$.main(KafkaProducer.scala:36)
    at producer.KafkaProducer.main(KafkaProducer.scala)
Caused by: com.fasterxml.jackson.databind.JsonMappingException: Incompatible Jackson version: 2.9.4
    at com.fasterxml.jackson.module.scala.JacksonModule$class.setupModule(JacksonModule.scala:64)
    at com.fasterxml.jackson.module.scala.DefaultScalaModule.setupModule(DefaultScalaModule.scala:19)
    at com.fasterxml.jackson.databind.ObjectMapper.registerModule(ObjectMapper.java:751)
    at org.apache.spark.rdd.RDDOperationScope$.<init>(RDDOperationScope.scala:82)
I am not sure whether the problem is in the configuration or in the code itself. My build.sbt file looks like this:

scalaVersion := "2.11.4"

resolvers += "Spark Packages Repo" at "http://dl.bintray.com/spark-packages/maven"

libraryDependencies ++= Seq(
  "org.apache.kafka" %% "kafka" % "1.1.0",
  "org.apache.spark" %% "spark-core" % "2.3.0",
  "org.apache.spark" %% "spark-sql" % "2.3.0",
  "org.apache.spark" %% "spark-streaming" % "2.3.0",
  "org.apache.spark" %% "spark-streaming-kafka-0-10" % "2.3.0"
)

I would appreciate any help, as I have no idea what is going wrong.
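For context, a minimal sketch of what such an application typically looks like, following the Spark documentation example the question refers to (the object name matches the stack trace, but the topic, broker address, and group id here are assumptions):

```scala
import org.apache.kafka.common.serialization.StringDeserializer
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka010.KafkaUtils
import org.apache.spark.streaming.kafka010.LocationStrategies.PreferConsistent
import org.apache.spark.streaming.kafka010.ConsumerStrategies.Subscribe

object KafkaProducer {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setMaster("local[2]").setAppName("KafkaStreaming")
    val ssc  = new StreamingContext(conf, Seconds(5))

    val kafkaParams = Map[String, Object](
      "bootstrap.servers" -> "localhost:9092",   // assumed broker address
      "key.deserializer"   -> classOf[StringDeserializer],
      "value.deserializer" -> classOf[StringDeserializer],
      "group.id"           -> "example-group",   // assumed group id
      "auto.offset.reset"  -> "latest"
    )

    // This is the call that blows up with ExceptionInInitializerError,
    // because constructing the DStream touches RDDOperationScope, which
    // registers Jackson's DefaultScalaModule at class-initialization time.
    val stream = KafkaUtils.createDirectStream[String, String](
      ssc,
      PreferConsistent,
      Subscribe[String, String](Seq("test-topic"), kafkaParams)  // assumed topic
    )

    stream.map(record => record.value).print()
    ssc.start()
    ssc.awaitTermination()
  }
}
```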

Following the stack trace of the exception you are getting, we can see that the root problem is:

Caused by: com.fasterxml.jackson.databind.JsonMappingException: Incompatible Jackson version: 2.9.4

In fact, Spark already includes com.fasterxml.jackson.core as a transitive dependency (the Spark 2.3.x line is built against Jackson 2.6.x), so we do not need to bring Jackson in through any other library. In this build the conflict comes from the "org.apache.kafka" %% "kafka" % "1.1.0" dependency, which pulls in jackson-databind 2.9.4; Spark's jackson-module-scala then refuses to initialize against that newer version.

A similar problem and its solution are described in more detail in the linked answer.
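To confirm which Jackson actually ends up on the classpath, a quick check (not part of the original answer) is to print the versions Jackson reports at runtime:

```scala
// Prints the jackson-core and jackson-databind versions actually loaded.
// With the broken build this shows the 2.9.4 databind pulled in by Kafka.
object JacksonVersionCheck {
  def main(args: Array[String]): Unit = {
    println("jackson-core:     " + com.fasterxml.jackson.core.json.PackageVersion.VERSION)
    println("jackson-databind: " + com.fasterxml.jackson.databind.cfg.PackageVersion.VERSION)
  }
}
```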

scalaVersion := "2.11.4"

resolvers += "Spark Packages Repo" at "http://dl.bintray.com/spark-packages/maven"

libraryDependencies ++= Seq(
  // Exclude Kafka's own jackson-databind so Spark's (older) Jackson
  // is the only one on the classpath.
  "org.apache.kafka" %% "kafka" % "1.1.0"
    exclude("com.fasterxml.jackson.core", "jackson-databind"),
  "org.apache.spark" %% "spark-core" % "2.3.0",
  "org.apache.spark" %% "spark-sql" % "2.3.0",
  "org.apache.spark" %% "spark-streaming" % "2.3.0",
  "org.apache.spark" %% "spark-streaming-kafka-0-10" % "2.3.0"
)
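Alternatively, instead of excluding the artifact, sbt can be told to force one Jackson version everywhere via `dependencyOverrides` (sbt 1.x syntax; a sketch, assuming 2.6.7/2.6.7.1 are the versions your Spark 2.3.x distribution was built against — verify in Spark's published POM):

```scala
// Force every module to resolve Jackson to the versions Spark expects.
dependencyOverrides ++= Seq(
  "com.fasterxml.jackson.core"    %  "jackson-core"         % "2.6.7",
  "com.fasterxml.jackson.core"    %  "jackson-databind"     % "2.6.7.1",
  "com.fasterxml.jackson.module"  %% "jackson-module-scala" % "2.6.7.1"
)
```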