
Apache Spark on GCP Dataproc: Guava dependency error


Our project builds a Spark application with Gradle and Scala, but after I added the GCP KMS library, running on Dataproc now fails with a missing Guava method:

java.lang.NoSuchMethodError: com.google.common.base.Preconditions.checkArgument
I shaded the Google libraries as suggested in the following guide:

My shadowJar definition in the Gradle build:

shadowJar {
  zip64 true
  relocate 'com.google', 'shadow.com.google'
  relocate 'com.google.protobuf', 'shadow.com.google.protobuf'
  relocate 'google.cloud', 'shadow.google.cloud'
  exclude 'META-INF/**'
  exclude "LICENSE*"
  mergeServiceFiles()
  archiveFileName = "myjar"
}
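Relocation only renames packages; if the Guava that gets bundled into the fat jar is itself an old version, the relocated copy still lacks the newer overloads. A hedged sketch of one common companion fix, pinning a recent Guava at resolution time (the version number is illustrative, not taken from the question):

```groovy
// Hedged sketch: force a recent Guava across all configurations so the
// shaded jar bundles a copy that actually has the newer
// Preconditions.checkArgument overloads. The version is illustrative.
configurations.all {
    resolutionStrategy {
        force 'com.google.guava:guava:28.2-jre'
    }
}
```

Running `./gradlew dependencyInsight --dependency com.google.guava:guava --configuration runtimeClasspath` shows which Guava version currently wins resolution and which dependency pulls it in.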
Running jar tf on the compiled fat jar shows the Guava classes relocated under shadow, including checkArgument.
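That verification can be sketched as follows; the jar path is hypothetical and should point at your shadowJar output:

```shell
# Sketch of double-checking the relocation inside the fat jar.
# The jar path below is hypothetical -- adjust to your shadowJar output.
JAR=${1:-build/libs/myjar}
if [ -f "$JAR" ]; then
  # Confirm the relocated class is present in the archive...
  jar tf "$JAR" | grep 'shadow/com/google/common/base/Preconditions'
  # ...and list which checkArgument overloads that copy actually has.
  javap -classpath "$JAR" shadow.com.google.common.base.Preconditions | grep checkArgument
else
  echo "jar not found: $JAR"
fi
```

The javap step matters: listing the class in the jar only proves relocation happened, while the printed overloads reveal whether the bundled Guava is new enough.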

But it still fails when submitting via Dataproc spark-submit, and the old version from Hadoop still seems to be picked up at runtime. Here is the stack trace, starting from the KmsSymmetric class that performs the GCP KMS decrypt:

Exception in thread "main" java.lang.NoSuchMethodError: shadow.com.google.common.base.Preconditions.checkArgument(ZLjava/lang/String;CLjava/lang/Object;)V
    at io.grpc.Metadata$Key.validateName(Metadata.java:629)
    at io.grpc.Metadata$Key.<init>(Metadata.java:637)
    at io.grpc.Metadata$Key.<init>(Metadata.java:567)
    at io.grpc.Metadata$AsciiKey.<init>(Metadata.java:742)
    at io.grpc.Metadata$AsciiKey.<init>(Metadata.java:737)
    at io.grpc.Metadata$Key.of(Metadata.java:593)
    at io.grpc.Metadata$Key.of(Metadata.java:589)
    at shadow.com.google.api.gax.grpc.GrpcHeaderInterceptor.<init>(GrpcHeaderInterceptor.java:60)
    at shadow.com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:221)
    at shadow.com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:194)
    at shadow.com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:186)
    at shadow.com.google.api.gax.rpc.ClientContext.create(ClientContext.java:155)
    at shadow.com.google.cloud.kms.v1.stub.GrpcKeyManagementServiceStub.create(GrpcKeyManagementServiceStub.java:370)
    at shadow.com.google.cloud.kms.v1.stub.KeyManagementServiceStubSettings.createStub(KeyManagementServiceStubSettings.java:333)
    at shadow.com.google.cloud.kms.v1.KeyManagementServiceClient.<init>(KeyManagementServiceClient.java:155)
    at shadow.com.google.cloud.kms.v1.KeyManagementServiceClient.create(KeyManagementServiceClient.java:136)
    at shadow.com.google.cloud.kms.v1.KeyManagementServiceClient.create(KeyManagementServiceClient.java:127)
    at mycompany.my.class.path.KmsSymmetric.decrypt(KmsSymmetric.scala:31)
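A note on the trace: the descriptor (ZLjava/lang/String;CLjava/lang/Object;)V decodes to void checkArgument(boolean, String, char, Object). These non-varargs overloads appeared around Guava 20.0, and since the failing class is the relocated shadow.com.google.common.base.Preconditions, the Guava copy inside the shaded jar itself appears to predate them. A minimal, self-contained illustration of the same lookup failure, using a hypothetical stand-in class rather than Guava itself:

```java
// Illustration (not Guava itself): pre-20 Preconditions only had the
// Object-varargs template form, so resolving the newer
// (boolean, String, char, Object) overload fails just like in the trace.
// OldPreconditions is a hypothetical stand-in for that old class.
public class OverloadCheck {
    static class OldPreconditions {
        // The only template overload the old class provided:
        public static void checkArgument(boolean expr, String template, Object... args) {
            if (!expr) throw new IllegalArgumentException(String.format(template, args));
        }
    }

    public static void main(String[] args) {
        try {
            // The same signature the grpc code links against at runtime:
            OldPreconditions.class.getMethod("checkArgument",
                    boolean.class, String.class, char.class, Object.class);
            System.out.println("overload present");
        } catch (NoSuchMethodException e) {
            System.out.println("overload missing");
        }
    }
}
```

This prints "overload missing": the varargs method has the erased signature (boolean, String, Object[]), which does not match the exact parameter list the newer callers expect.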
I am using Dataproc image version 1.4.


What am I missing?

Could you please share the library versions of your local Spark client?
gcloud dataproc jobs submit spark \
--cluster=${CLUSTER_NAME} \
--project ${PROJECT_ID} \
--region=${REGION} \
--jars=gs://${APP_BUCKET}/${JAR} \
--class=${CLASS} \
-- --arg1 val1 etc