Scala Spark 2.X throws java.lang.NoSuchMethodError: io.netty.buffer.PooledByteBufAllocator.metric()Lio/netty/buffer/PooledByteBufAllocatorMetric;


I keep seeing this error when trying to run Spark. I have tried it with Spark 2.3.2, 2.3.3, and 2.4.3:

java.lang.NoSuchMethodError: io.netty.buffer.PooledByteBufAllocator.metric()Lio/netty/buffer/PooledByteBufAllocatorMetric;
    at org.apache.spark.network.util.NettyMemoryMetrics.registerMetrics(NettyMemoryMetrics.java:80)
    at org.apache.spark.network.util.NettyMemoryMetrics.<init>(NettyMemoryMetrics.java:76)
    at org.apache.spark.network.client.TransportClientFactory.<init>(TransportClientFactory.java:109)
    at org.apache.spark.network.TransportContext.createClientFactory(TransportContext.java:99)

It is thrown by the call on the last line of this block:

import org.apache.spark.sql.SparkSession

lazy val spark: SparkSession = {
    SparkSession
      .builder()
      .appName("SparkProfiler")
      .master("local[*]").config("spark.driver.host", "localhost")
      .getOrCreate()
  }
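
A NoSuchMethodError like this almost always means an older netty jar is shadowing the 4.x classes Spark was compiled against, so one quick runtime check is to print which jar PooledByteBufAllocator is actually loaded from. A minimal diagnostic sketch (not part of the original setup):

import io.netty.buffer.PooledByteBufAllocator

// Prints the jar the JVM actually loaded PooledByteBufAllocator from,
// e.g. ~/.ivy2/cache/io.netty/netty-all/jars/netty-all-4.1.8.Final.jar
val src = classOf[PooledByteBufAllocator].getProtectionDomain.getCodeSource
println(if (src != null) src.getLocation else "bootstrap classpath")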
I have tried the suggestions hinted at in similar threads, but with no effect.

For example, in my build.sbt I have these dependency overrides:

dependencyOverrides += "io.netty" % "netty" % "3.9.9.Final" // have tried not including this
dependencyOverrides += "io.netty" % "netty-all" % "4.1.8.Final"
dependencyOverrides += "io.netty" % "netty-buffer" % "3.9.9.Final" // have tried keeping this version to 4.1.8Final
dependencyOverrides += "io.netty" % "netty-codec" % "4.1.8.Final"
dependencyOverrides += "io.netty" % "netty-codec-http" % "4.1.8.Final"
dependencyOverrides += "io.netty" % "netty-common" % "4.1.8.Final"
dependencyOverrides += "io.netty" % "netty-handler" % "4.1.8.Final"
dependencyOverrides += "io.netty" % "netty-resolver" % "4.1.8.Final"
dependencyOverrides += "io.netty" % "netty-transport" % "4.1.8.Final"
When I look at the external libraries, I do see:

sbt: io.netty:netty:3.9.9.Final.jar
sbt: io.netty:netty-all:4.1.8.Final.jar
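
Beyond eyeballing IntelliJ's external-libraries list, the resolved graph can be inspected from sbt itself: the built-in evicted task lists version conflicts, and the sbt-dependency-graph plugin adds dependencyTree and whatDependsOn (sbt 1.4+ ships dependencyTree out of the box). A sketch for project/plugins.sbt on older sbt versions (the plugin version here is an assumption; check the plugin's README for the latest):

// project/plugins.sbt — adds the dependencyTree / whatDependsOn tasks
// (plugin version is an assumption; sbt 1.4+ has dependencyTree built in)
addSbtPlugin("net.virtual-void" % "sbt-dependency-graph" % "0.10.0-RC1")

Then, from the sbt shell, whatDependsOn io.netty netty-all shows exactly which module drags in each netty version.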
But I have also tried including this in build.sbt:

excludeDependencies += ExclusionRule("io.netty", "netty")

so that
sbt: io.netty:netty:3.9.9.Final.jar
is excluded from my external libraries.
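
Incidentally, the same exclusion can also be scoped to a single dependency instead of applied globally; a hedged sketch, with the Spark coordinates assumed from the versions tried above:

// drop the old netty 3.x artifact only from spark-core's transitive graph
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.4.3" exclude("io.netty", "netty")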


When I explore the error through IntelliJ and step into the imported NettyMemoryMetrics class, I do see in its imports:

import io.netty.buffer.PooledByteBufAllocatorMetric;

I thought this could be solved by keeping netty among the dependencies, but I can't seem to find the right combination that lets Spark find this class after the build. Any suggestions?

Finally found the answer in this comment:

Added to build.sbt: all of my netty versions needed to be 4.1.17.Final instead of 4.1.8.Final, except for plain "netty":

dependencyOverrides += "io.netty" % "netty" % "3.9.9.Final"
dependencyOverrides += "io.netty" % "netty-all" % "4.1.17.Final"
dependencyOverrides += "io.netty" % "netty-buffer" % "4.1.17.Final"
dependencyOverrides += "io.netty" % "netty-codec" % "4.1.17.Final"
dependencyOverrides += "io.netty" % "netty-codec-http" % "4.1.17.Final"
dependencyOverrides += "io.netty" % "netty-common" % "4.1.17.Final"
dependencyOverrides += "io.netty" % "netty-handler" % "4.1.17.Final"
dependencyOverrides += "io.netty" % "netty-resolver" % "4.1.17.Final"
dependencyOverrides += "io.netty" % "netty-transport" % "4.1.17.Final"