java.lang.VerifyError when submitting a Spark job with OpenRTB protobuf


I have compressed files on S3; each file contains Base64-encoded strings representing the serialized byte arrays of protobuf messages. The proto schema looks like this:

syntax = "proto2";
package com.myproject.proto;
option java_outer_classname = "MyProtos";
import "openrtb.proto";

message Request {
    optional int64 timestamp = 1;
    optional com.google.openrtb.BidRequest bidRequest = 2;
    optional string otherData = 3;
}
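For context, the MyProtos classes used below would be generated from this schema with protoc; a sketch, assuming the schema is saved as request.proto and that openrtb.proto is resolvable on the proto path:

```shell
# Generate MyProtos.java (java_outer_classname above) into the source tree.
# The filename request.proto and the output path are assumptions.
protoc --proto_path=. --java_out=src/main/java request.proto
```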
When I run the following flatMap helper locally from my Spark code, I get the java.lang.VerifyError shown at the bottom of this post:

// Requires: org.apache.commons.codec.binary.Base64, java.util.Collections, java.util.Iterator
// Decodes one Base64 line into a Request; on any failure it returns an empty
// iterator, so the bad record is simply dropped from the flatMap output.
public static Iterator<MyProtos.Request> parseRequest(String source) {
    try {
        byte[] bytes = Base64.decodeBase64(source);
        MyProtos.Request request = MyProtos.Request.parseFrom(bytes);
        return Collections.singletonList(request).iterator();
    } catch (Exception e) {
        return Collections.emptyIterator();
    }
}
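The Base64 step itself can be sanity-checked in isolation before blaming Spark; a minimal sketch using the JDK's own java.util.Base64 (the snippet above uses commons-codec, but both decode the same standard alphabet; the class name Base64RoundTrip is illustrative):

```java
import java.nio.charset.StandardCharsets;
import java.util.Base64;

// Round-trips a byte payload through Base64 encode/decode, mirroring the
// decode step that parseRequest performs before parseFrom.
public class Base64RoundTrip {
    public static void main(String[] args) {
        byte[] payload = "proto-bytes".getBytes(StandardCharsets.UTF_8);
        String line = Base64.getEncoder().encodeToString(payload);
        byte[] decoded = Base64.getDecoder().decode(line);
        System.out.println(new String(decoded, StandardCharsets.UTF_8)); // prints "proto-bytes"
    }
}
```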

The problem lies in the runtime configuration property
spark.executor.userClassPathFirst, which is false by default.

When Spark runs locally, or remotely in client mode, this kind of dependency-conflict problem does not occur.

In my case, the application was built against protobuf 3.5.0, but Spark ships protobuf 2.5.0 in its jars directory. The simple workaround was to put the new 3.5.0 jar into Spark's jars directory in place of the old one.
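An alternative to swapping the jar is to flip the class-path ordering flags at submit time, so the executor and driver load the application's protobuf before Spark's; a sketch, where the class and jar names are placeholders:

```shell
# Prefer the application's protobuf 3.5.0 over the 2.5.0 in Spark's jars/.
# com.myproject.Main and my-app-assembly.jar are placeholders.
spark-submit \
  --conf spark.executor.userClassPathFirst=true \
  --conf spark.driver.userClassPathFirst=true \
  --class com.myproject.Main \
  my-app-assembly.jar
```

Another common way out of this conflict is to shade and relocate protobuf inside the application jar (for example with the maven-shade-plugin's relocation feature), so it can never clash with Spark's copy.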

For reference, here is the VerifyError produced by the protobuf 2.5.0/3.5.0 mismatch:

java.lang.VerifyError: Bad type on operand stack
Exception Details:
  Location:
    com/google/protobuf/GeneratedMessageV3$ExtendableMessage.hasExtension(Lcom/google/protobuf/GeneratedMessage$GeneratedExtension;)Z @2: invokevirtual
  Reason:
    Type 'com/google/protobuf/GeneratedMessage$GeneratedExtension' (current frame, stack[1]) is not assignable to 'com/google/protobuf/ExtensionLite'
  Current Frame:
    bci: @2
    flags: { }
    locals: { 'com/google/protobuf/GeneratedMessageV3$ExtendableMessage', 'com/google/protobuf/GeneratedMessage$GeneratedExtension' }
    stack: { 'com/google/protobuf/GeneratedMessageV3$ExtendableMessage', 'com/google/protobuf/GeneratedMessage$GeneratedExtension' }
  Bytecode:
    0x0000000: 2a2b b600 21ac