Scala Spark code creating a GCP Publisher throws: java.lang.NoSuchMethodError: com.google.common.base.Preconditions.checkArgument


I am trying to publish messages to a topic in GCP Pub/Sub using Spark Scala and IntelliJ. The code (GcpPublish.scala) is as follows:

import java.io.FileInputStream

import com.google.api.gax.core.FixedCredentialsProvider
import com.google.auth.oauth2.ServiceAccountCredentials
import com.google.cloud.pubsub.v1.Publisher
import com.google.protobuf.ByteString
import com.google.pubsub.v1.PubsubMessage

val publisher = Publisher.newBuilder(s"projects/projectid/topics/test")
                .setCredentialsProvider(FixedCredentialsProvider
                .create(ServiceAccountCredentials
                .fromStream(new FileInputStream("gs://credsfiles/projectid.json"))))
                .build()

publisher.publish(PubsubMessage
         .newBuilder
         .setData(ByteString.copyFromUtf8(JSONData.toString()))
         .build())
And this is the build.sbt:

name := "TryingSomething"

version := "1.0"

scalaVersion := "2.11.12"

val sparkVersion = "2.3.2"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.3.2" % "provided",
  "org.apache.spark" %% "spark-sql" % "2.3.2" ,
  "com.google.cloud" % "google-cloud-bigquery" % "1.106.0",
  "org.apache.beam" % "beam-sdks-java-core" % "2.19.0" ,
  "org.apache.beam" % "beam-runners-google-cloud-dataflow-java" % "2.19.0",
  "com.typesafe.scala-logging" %% "scala-logging" % "3.1.0" ,
  "org.apache.beam" % "beam-sdks-java-extensions-google-cloud-platform-core" % "2.19.0" ,
  "org.apache.beam" % "beam-sdks-java-io-google-cloud-platform" % "2.19.0" ,
  "com.google.apis" % "google-api-services-bigquery" % "v2-rev456-1.25.0" ,
  "com.google.cloud" % "google-cloud-pubsub" % "1.102.1",
  "com.google.guava" % "guava" % "28.2-jre",
  "org.apache.httpcomponents" % "httpclient" % "4.5.11"
)

assemblyMergeStrategy in assembly := {
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard
  case _ => MergeStrategy.first
}
However, when I build the fat jar and run it on a Dataproc cluster, I get the following error:

Exception in thread "main" java.lang.NoSuchMethodError: com.google.common.base.Preconditions.checkArgument(ZLjava/lang/String;I)V
    at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$Builder.setPoolSize(InstantiatingGrpcChannelProvider.java:527)
    at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$Builder.setChannelsPerCpu(InstantiatingGrpcChannelProvider.java:546)
    at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$Builder.setChannelsPerCpu(InstantiatingGrpcChannelProvider.java:535)
    at com.google.cloud.pubsub.v1.Publisher$Builder.<init>(Publisher.java:633)
    at com.google.cloud.pubsub.v1.Publisher$Builder.<init>(Publisher.java:588)
    at com.google.cloud.pubsub.v1.Publisher.newBuilder(Publisher.java:584)
But this also produces the same error.


Any suggestions as to what could be causing this error?
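One way to narrow this kind of NoSuchMethodError down is to print which jar the JVM actually loaded Guava's Preconditions class from at runtime. This is a diagnostic sketch, not from the original post; the object and method names are hypothetical:

```scala
// Diagnostic sketch: print the jar a class was loaded from, to see
// whether the cluster's Guava or the application's Guava wins on the
// classpath. Run it inside the same Spark job that fails.
object WhichJar {
  def locate(className: String): String =
    Class.forName(className)
      .getProtectionDomain.getCodeSource.getLocation.toString

  def main(args: Array[String]): Unit =
    // Expect something like .../guava-XX.jar; on Dataproc this often
    // points at a Hadoop/Spark-provided Guava, not the one in build.sbt.
    println(locate("com.google.common.base.Preconditions"))
}
```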

The problem is that Spark and Hadoop both inject their own version of Guava, which clashes with the one required by the Google Pub/Sub package. I solved it by adding shade rules to the build.sbt file:

assemblyShadeRules in assembly := Seq(
  ShadeRule.rename("com.google.common.**" -> "repackaged.com.google.common.@1").inAll,
  ShadeRule.rename("com.google.protobuf.**" -> "repackaged.com.google.protobuf.@1").inAll,
  ShadeRule.rename("io.netty.**" -> "repackaged.io.netty.@1").inAll
)

The shade rules for com.google.common and com.google.protobuf are the ones that resolve the Guava dependency conflict. I added the other entries to resolve subsequent dependency conflicts I ran into along the way.
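After running `sbt assembly`, you can verify that the shade rules actually relocated the Guava classes by scanning the fat jar's entries. A minimal sketch, assuming the jar path that sbt-assembly derives from the name/version in this build.sbt:

```scala
import java.io.File
import java.util.jar.JarFile
import scala.collection.JavaConverters._

// Check whether a fat jar contains Guava classes under the
// "repackaged" prefix that the ShadeRule above rewrites them to.
object CheckShading {
  def guavaRelocated(jarPath: String): Boolean = {
    val jar = new JarFile(new File(jarPath))
    try jar.entries.asScala
      .exists(_.getName.startsWith("repackaged/com/google/common/"))
    finally jar.close()
  }

  def main(args: Array[String]): Unit =
    // Assumed path, from name := "TryingSomething" / version := "1.0":
    println(guavaRelocated("target/scala-2.11/TryingSomething-assembly-1.0.jar"))
}
```

If this prints `false`, the shading did not apply and the original classpath conflict is still present.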

Have you tried adding this dependency?
libraryDependencies += "org.apache.httpcomponents" % "httpcore" % "4.4.13"
@RicardoSanchez Yes, I just tried it, but it did not solve the problem.