
Scala: Caused by: com.fasterxml.jackson.databind.JsonMappingException: Incompatible Jackson version: 2.8.9


When I execute df.show() to print the contents of the DataFrame rows, I get the following error:

Caused by: com.fasterxml.jackson.databind.JsonMappingException: Incompatible Jackson version: 2.8.9
    at com.fasterxml.jackson.module.scala.JacksonModule$class.setupModule(JacksonModule.scala:64)
    at com.fasterxml.jackson.module.scala.DefaultScalaModule.setupModule(DefaultScalaModule.scala:19)
    at com.fasterxml.jackson.databind.ObjectMapper.registerModule(ObjectMapper.java:747)
    at org.apache.spark.rdd.RDDOperationScope$.<init>(RDDOperationScope.scala:82)
    at org.apache.spark.rdd.RDDOperationScope$.<clinit>(RDDOperationScope.scala)
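For context, a minimal sketch of the kind of code that triggers this; the session settings, data source and index name are placeholders, since the question does not show the reading code:

import org.apache.spark.sql.SparkSession

object JacksonRepro {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("jackson-repro")
      .master("local[*]")              // placeholder master
      .config("es.nodes", "localhost") // assumed Elasticsearch endpoint
      .getOrCreate()

    // elasticsearch-hadoop exposes the "org.elasticsearch.spark.sql" source;
    // "test-index" is a placeholder index name.
    val df = spark.read
      .format("org.elasticsearch.spark.sql")
      .load("test-index")

    // The failure surfaces here: the stack trace shows it originates in the
    // static initializer of RDDOperationScope, where Spark's ObjectMapper
    // registers DefaultScalaModule and the Jackson version check fails.
    df.show()
  }
}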
I do not use the Jackson library in my build.sbt.
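Even with no direct Jackson entry in build.sbt, Jackson arrives transitively: Spark itself depends on it, and so do several of the libraries listed in the update below. A quick way to spot competing versions is sbt's built-in evicted task:

// From the sbt shell: `evicted` is a built-in task that reports which
// dependency versions sbt replaced during conflict resolution, so competing
// Jackson versions show up in its output.
> evicted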

Update:

import sbtassembly.AssemblyPlugin.autoImport.assemblyOption
name := "test"
lazy val spark = "org.apache.spark"
lazy val typesafe = "com.typesafe.akka"
val sparkVersion = "2.2.0"
val elasticSparkVersion = "6.2.4"
val scalaLoggingVersion = "3.7.2"
val slf4jVersion = "1.7.5"
val kafkaVersion = "0.8.0.0"
val akkaVersion = "2.5.9"
val playVersion = "2.6.8"
val sprayVersion = "1.3.2"
val opRabbitVersion = "2.1.0"
val orientdbVersion = "2.2.34"
val livyVersion = "0.5.0-incubating"
val scalaHttpVersion = "2.3.0"
val scoptVersion = "3.3.0"
resolvers ++= Seq(
  // repo for op-rabbit client
  "SpinGo OSS" at "http://spingo-oss.s3.amazonaws.com/repositories/releases",
  "SparkPackagesRepo" at "http://dl.bintray.com/spark-packages/maven",
  "cloudera.repo" at "https://repository.cloudera.com/artifactory/cloudera-repos"
)
lazy val commonSettings = Seq(
  organization := "org.test",
  version := "0.1",
  scalaVersion := "2.11.8",
  assemblyOption in assembly := (assemblyOption in assembly).value.copy(includeScala = true),
  assemblyMergeStrategy in assembly := {
    case PathList("META-INF", xs @ _*) => MergeStrategy.discard
    case PathList("reference.conf") => MergeStrategy.concat
    case x => MergeStrategy.first
  }
)
val sparkSQL = spark %% "spark-sql" % sparkVersion
val sparkGraphx = spark %% "spark-graphx" % sparkVersion
val sparkMLLib = spark %% "spark-mllib" % sparkVersion
val elasticSpark = "org.elasticsearch" % "elasticsearch-hadoop" % elasticSparkVersion
val livyAPI = "org.apache.livy" % "livy-api" % livyVersion
val livyScalaAPI = "org.apache.livy" %% "livy-scala-api" % livyVersion
val livyClientHttp = "org.apache.livy" % "livy-client-http" % livyVersion
val spingoCore = "com.spingo" %% "op-rabbit-core" % opRabbitVersion
val spingoPlayJson = "com.spingo" %% "op-rabbit-play-json" % opRabbitVersion
val spingoJson4s = "com.spingo" %% "op-rabbit-json4s" % opRabbitVersion
val spingoAirbrake = "com.spingo" %% "op-rabbit-airbrake" % opRabbitVersion
val spingoAkkaStream = "com.spingo" %% "op-rabbit-akka-stream" % opRabbitVersion
val orientDB = "com.orientechnologies" % "orientdb-graphdb" % orientdbVersion excludeAll(
  ExclusionRule("commons-beanutils", "commons-beanutils-core"),
  ExclusionRule("commons-collections", "commons-collections"),
  ExclusionRule("commons-logging", "commons-logging"),
  ExclusionRule("stax", "stax-api")
)
val scopt = "com.github.scopt" %% "scopt" % scoptVersion
val spray = "io.spray" %% "spray-json" % sprayVersion
val scalaHttp = "org.scalaj" %% "scalaj-http" % scalaHttpVersion

lazy val graph = (project in file("./app"))
  .settings(
    commonSettings,
    libraryDependencies ++= Seq(sparkSQL, sparkGraphx, sparkMLLib, orientDB,
                                livyAPI, livyScalaAPI, livyClientHttp, scopt,
                                spingoCore, spingoPlayJson, spingoJson4s,
                                spingoAirbrake, spingoAkkaStream,
                                scalaHttp, spray, elasticSpark)
  )
dependencyOverrides += "com.typesafe.akka" %% "akka-stream" % akkaVersion
I tried adding the Jackson libraries used by Spark, but it did not solve the problem:

val jacksonCore = "com.fasterxml.jackson.core" % "jackson-core" % "2.6.5"
val jacksonDatabind = "com.fasterxml.jackson.core" % "jackson-databind" % "2.6.5"
val jacksonAnnotations = "com.fasterxml.jackson.core" %% "jackson-annotations" % "2.6.5"
val jacksonScala = "com.fasterxml.jackson.module" %% "jackson-module-scala" % "2.6.5"
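As an aside on the snippet above (the next paragraph notes that the last two lines would not resolve): jackson-annotations ships only as a plain Java artifact, so the %% operator, which appends the Scala binary suffix and therefore asks for jackson-annotations_2.11, cannot resolve it. %% is only appropriate for Scala-cross-built artifacts such as jackson-module-scala. A corrected sketch of the same declarations:

val jacksonCore        = "com.fasterxml.jackson.core"   % "jackson-core"          % "2.6.5"
val jacksonDatabind    = "com.fasterxml.jackson.core"   % "jackson-databind"      % "2.6.5"
val jacksonAnnotations = "com.fasterxml.jackson.core"   % "jackson-annotations"   % "2.6.5" // single %, plain Java artifact
val jacksonScala       = "com.fasterxml.jackson.module" %% "jackson-module-scala" % "2.6.5" // %%, Scala-cross-built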
In the end I did this (for some reason, the last two dependencies could not be resolved):

But now I get this error:

Exception in thread "main" java.lang.NoClassDefFoundError: com/fasterxml/jackson/module/scala/DefaultScalaModule$
Caused by: java.lang.ClassNotFoundException: com.fasterxml.jackson.module.scala.DefaultScalaModule$

Spark 2.2.0's Jackson version is 2.6.5, and it looks like another of your dependencies is using Jackson 2.8.9. These two versions are incompatible, so you need to align them all on the same Jackson version.

This build.sbt looks quite problematic, because you are mixing many libraries that may disagree about Jackson and other dependencies.

For example, op-rabbit-json4s expects json4s 3.5.3, while orientdb-graphdb, I believe, expects yet a third Jackson version (2.2.3).

In short, you need to align as many dependencies as possible to make sure there are no conflicts.

Here you can find a useful plugin to check the dependencies.
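The answer does not name the plugin here; one tool of this kind (an assumption about which one was meant) is sbt-dependency-graph, which can print the resolved tree and show exactly which library drags in a given Jackson artifact:

// project/plugins.sbt: sbt-dependency-graph adds dependency-inspection tasks
addSbtPlugin("net.virtual-void" % "sbt-dependency-graph" % "0.9.2")

// Then, from the sbt shell:
> dependencyTree
> whatDependsOn com.fasterxml.jackson.core jackson-databind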

What version of Spark are you using? Are you on Scala 2.11? Can you share your build.sbt file?

@Mikel: please see my update. I am using Spark 2.2.0, Scala 2.11.8 and Elasticsearch for Hadoop 6.2.4.

How do you build the Spark application? I notice you are using the sbt-assembly plugin. How do you run the application? Could dependencyOverrides help, as described in the link?

@JacekLaskowski: please see my update. I also tried:
dependencyOverrides += "com.typesafe.akka" %% "akka-stream" % akkaVersion
dependencyOverrides += "com.fasterxml.jackson.core" % "jackson-core" % "2.8.9"
dependencyOverrides += "com.fasterxml.jackson.core" % "jackson-databind" % "2.8.9"
dependencyOverrides += "com.fasterxml.jackson.core" % "jackson-annotations" % "2.8.9"
dependencyOverrides += "com.fasterxml.jackson.module" %% "jackson-module-scala" % "2.8.9"
dependencyOverrides += "com.fasterxml.jackson.module" % "jackson-module-paranamer" % "2.8.9"
Exception in thread "main" java.lang.NoClassDefFoundError: com/fasterxml/jackson/module/scala/DefaultScalaModule$
Caused by: java.lang.ClassNotFoundException: com.fasterxml.jackson.module.scala.DefaultScalaModule$
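Pulling this together: following the answer's advice to align everything on one Jackson version, and given that Spark 2.2.0 ships 2.6.5, a hedged sketch (not a verified fix for this exact build) is to pin every Jackson artifact to 2.6.5 rather than 2.8.9 and to declare jackson-module-scala explicitly, so that DefaultScalaModule is guaranteed to be packaged; the NoClassDefFoundError above is consistent with that class simply being absent at runtime.

// Pin all Jackson artifacts to the version Spark 2.2.0 ships with (2.6.5),
// mirroring the overrides style used above but with one consistent version.
dependencyOverrides += "com.fasterxml.jackson.core" % "jackson-core" % "2.6.5"
dependencyOverrides += "com.fasterxml.jackson.core" % "jackson-databind" % "2.6.5"
dependencyOverrides += "com.fasterxml.jackson.core" % "jackson-annotations" % "2.6.5"
dependencyOverrides += "com.fasterxml.jackson.module" %% "jackson-module-scala" % "2.6.5"

// Declare the Scala module directly so sbt-assembly actually packages it;
// overrides alone only change versions of modules already on the classpath.
libraryDependencies += "com.fasterxml.jackson.module" %% "jackson-module-scala" % "2.6.5"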