Java Avro deserialization error - ArrayIndexOutOfBoundsException

My workflow is: I create an AVSC file, generate C++ classes with the avrogencpp tool, and produce Avro binary-encoded data in my C++ application.


I am trying to figure out why Scenario 2 does not work.

Scenario 1

test.avsc

Encoder - C++


Does it work if you define DeviceInfo inline, instead of as an array element in the avsc?

I believe what you have done is define a union in the second schema.

You might want to take a look at this - honestly, I did not find it very helpful.


The problem was that I was creating the Avro-encoded data using the wrong class. But the bigger problem is the top-level union -
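Some background on why the top-level union matters: Avro binary encodes a union value as its zero-based branch index, written as a zigzag varint, followed by the branch's own encoding, while a bare record carries no prefix at all. So a reader that expects the two-record union from the avsc, but receives bytes written for a bare record, consumes the record's first data byte as a branch index. A plain-Java sketch of the varint encoding (no Avro dependency; `zigzagVarint` is my own helper, not an Avro API):

```java
public class UnionIndexDemo {
    // Zigzag-encode a long and emit it as an Avro-style varint.
    static byte[] zigzagVarint(long n) {
        long z = (n << 1) ^ (n >> 63);            // zigzag: 0,-1,1,-2,2 -> 0,1,2,3,4
        java.io.ByteArrayOutputStream out = new java.io.ByteArrayOutputStream();
        while ((z & ~0x7FL) != 0) {
            out.write((int) ((z & 0x7F) | 0x80)); // low 7 bits plus continuation flag
            z >>>= 7;
        }
        out.write((int) z);
        return out.toByteArray();
    }

    public static void main(String[] args) {
        // Branch 0 ("null") goes on the wire as 0x00, branch 1 as 0x02.
        System.out.println(zigzagVarint(0)[0]); // 0
        System.out.println(zigzagVarint(1)[0]); // 2
    }
}
```

Reading the Scenario 2 bytes this way is consistent with the stack trace below: the string length 7 of "device1" (0x0E on the wire) zigzag-decodes to 7 when misread as a union branch index, which is out of range for any two-branch union.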

I hit it too, possibly because

As a workaround, I did:

    public static SpecificRecord deserializeAvroUsingFile(byte [] bytes, String napId, Schema schema) throws IOException {
        if (shouldCompress) {
            bytes = Snappy.uncompress(bytes);
        }
        File file = File.createTempFile("avro", null);
        FileOutputStream fos = new FileOutputStream(file);
        fos.write(bytes);
        fos.close();
        SpecificDatumReader<SpecificRecord> datumReader = new SpecificDatumReader<>(schema); // new GenericDatumReader();
        DataFileReader<SpecificRecord> reader = new DataFileReader<>(file, datumReader);
        while (reader.hasNext()) {
            return reader.next(null);
        }
        throw new IllegalStateException();
    }
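For reference, a temp file is not strictly required: once the record schema is separated from the top-level union, the raw bytes decode directly with a binary decoder. A sketch assuming the Apache Avro Java library (`deserializeDirect` and the branch-selection loop are mine, not from the original post; the Snappy step is omitted):

```java
import java.io.IOException;
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericDatumReader;
import org.apache.avro.generic.GenericRecord;
import org.apache.avro.io.DatumReader;
import org.apache.avro.io.DecoderFactory;

public class DirectAvroDecode {
    // Decode raw binary-encoded bytes written against a single record schema.
    // If the parsed avsc is a union (a JSON array of records), pick the record
    // branch by name first, so the reader does not expect a union index byte.
    public static GenericRecord deserializeDirect(Schema schema, String recordName, byte[] data)
            throws IOException {
        Schema recordSchema = schema;
        if (schema.getType() == Schema.Type.UNION) {
            for (Schema branch : schema.getTypes()) {
                if (branch.getName().equals(recordName)) {
                    recordSchema = branch;
                    break;
                }
            }
        }
        DatumReader<GenericRecord> reader = new GenericDatumReader<>(recordSchema);
        return reader.read(null, DecoderFactory.get().binaryDecoder(data, null));
    }
}
```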

Hi @JohnM, yes, it works when I define DeviceInfo inline! However, the problem is that I cannot afford to inline all my schema definitions. Any clue as to what is causing this?

Hi Karthik, did you solve it? Any input might help - I am facing the same issue now.

Yes - please see my accepted answer, in particular
Component_DeviceInfo deviceInfo;
    deviceInfo.deviceId.set_string("device1");
    deviceInfo.zoneId.set_string("zone1");
    std::vector <char>tele_bytes_;
    std::auto_ptr<avro::OutputStream> out = avro::memoryOutputStream(1);
    avro::EncoderPtr enc = avro::binaryEncoder();
    enc->init(*out);
    avro::encode(*enc, deviceInfo);
    out->flush();

    size_t byte_count = out->byteCount();
    DBG("BYTE COUNT " << byte_count);

    std::auto_ptr<avro::InputStream> in = avro::memoryInputStream(*out);
    avro::StreamReader reader(*in);
    std::vector<uint8_t> row_data(byte_count);
    reader.readBytes(&row_data[0], byte_count);
@Override
    public Object deserializeByteArr(Schema schema, final byte[] data){
        DatumReader<GenericRecord> genericDatumReader = new SpecificDatumReader<>(schema);
        Decoder decoder = DecoderFactory.get().binaryDecoder(data, null);
        try {
            GenericRecord userData = genericDatumReader.read(null, decoder);
            System.out.println(userData);
        } catch (IOException e) {
            e.printStackTrace();
        }
        return null;
    }
[
    {
        "namespace": "com.company.project",
        "name": "Component_DeviceInfo",
        "type": "record",
        "doc": "Identifies a client device",
        "fields": [
            {
                "name": "deviceId",
                "type": [
                    "null",
                    "string"
                ],
                "default": null,
                "doc": "Unique MAC address"
            },
            {
                "name": "zoneId",
                "type": [
                    "null",
                    "string"
                ],
                "default": null,
                "doc": "Zone id where Client device belongs to"
            }
        ]
    },
    {
        "namespace": "com.company.project",
        "name": "Component_EventList",
        "type": "record",
        "doc": "Component Event list",
        "fields": [
            {
                "name": "deviceInfo",
                "type": [
                    "null",
                    "com.company.project.Component_DeviceInfo"
                ],
                "default": null,
                "doc": "Device information such as device id and zone id"
            }
        ]
    }
]
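Worth noting: when the Avro Java library parses an avsc whose top level is a JSON array like the one above, the resulting Schema is a union of the listed records, not a record. A sketch (Apache Avro Java library assumed; the schema names here are illustrative):

```java
import org.apache.avro.Schema;

public class UnionAvscDemo {
    public static void main(String[] args) {
        // A top-level JSON array parses as a UNION schema in Avro.
        String avsc = "[" +
            "{\"type\":\"record\",\"name\":\"A\",\"fields\":[{\"name\":\"x\",\"type\":\"string\"}]}," +
            "{\"type\":\"record\",\"name\":\"B\",\"fields\":" +
            "[{\"name\":\"a\",\"type\":[\"null\",\"A\"],\"default\":null}]}" +
            "]";
        Schema schema = new Schema.Parser().parse(avsc);
        System.out.println(schema.getType());                   // UNION
        System.out.println(schema.getTypes().get(1).getName()); // B
    }
}
```

Passing such a union schema to a DatumReader makes the reader expect a leading branch-index byte, which would explain the failure when the C++ side wrote a bare Component_EventList record.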
Component_DeviceInfo deviceInfo;
    deviceInfo.deviceId.set_string("device1");
    deviceInfo.zoneId.set_string("zone1");

    std::vector <char>tele_bytes_;

    Component_EventList ComponentEventList;
    ComponentEventList.deviceInfo.set_Component_DeviceInfo(deviceInfo);

    std::auto_ptr<avro::OutputStream> out = avro::memoryOutputStream(1);
    avro::EncoderPtr enc = avro::binaryEncoder();
    enc->init(*out);
    avro::encode(*enc, ComponentEventList);
    out->flush();

    size_t byte_count = out->byteCount();
    DBG("BYTE COUNT " << byte_count);

    std::auto_ptr<avro::InputStream> in = avro::memoryInputStream(*out);
    avro::StreamReader reader(*in);
    std::vector<uint8_t> row_data(byte_count);
    reader.readBytes(&row_data[0], byte_count);
org.springframework.kafka.listener.ListenerExecutionFailedException: Listener method 'public void com.company.telemetry.services.consumer.TelemetryConsumerService.consume(org.apache.kafka.clients.consumer.ConsumerRecord<java.lang.String, byte[]>)' threw exception; nested exception is java.lang.ArrayIndexOutOfBoundsException: 7
    at org.springframework.kafka.listener.adapter.MessagingMessageListenerAdapter.invokeHandler(MessagingMessageListenerAdapter.java:188) ~[spring-kafka-1.1.6.RELEASE.jar:na]
    at org.springframework.kafka.listener.adapter.RecordMessagingMessageListenerAdapter.onMessage(RecordMessagingMessageListenerAdapter.java:72) ~[spring-kafka-1.1.6.RELEASE.jar:na]
    at org.springframework.kafka.listener.adapter.RecordMessagingMessageListenerAdapter.onMessage(RecordMessagingMessageListenerAdapter.java:47) ~[spring-kafka-1.1.6.RELEASE.jar:na]
    at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.invokeRecordListener(KafkaMessageListenerContainer.java:794) [spring-kafka-1.1.6.RELEASE.jar:na]
    at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.invokeListener(KafkaMessageListenerContainer.java:738) [spring-kafka-1.1.6.RELEASE.jar:na]
    at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.run(KafkaMessageListenerContainer.java:570) [spring-kafka-1.1.6.RELEASE.jar:na]
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [na:1.8.0_91]
    at java.util.concurrent.FutureTask.run(FutureTask.java:266) [na:1.8.0_91]
    at java.lang.Thread.run(Thread.java:745) [na:1.8.0_91]
Caused by: java.lang.ArrayIndexOutOfBoundsException: 7
    at org.apache.avro.io.parsing.Symbol$Alternative.getSymbol(Symbol.java:402) ~[avro-1.7.7.jar:1.7.7]
    at org.apache.avro.io.ResolvingDecoder.doAction(ResolvingDecoder.java:290) ~[avro-1.7.7.jar:1.7.7]
    at org.apache.avro.io.parsing.Parser.advance(Parser.java:88) ~[avro-1.7.7.jar:1.7.7]
    at org.apache.avro.io.ResolvingDecoder.readIndex(ResolvingDecoder.java:267) ~[avro-1.7.7.jar:1.7.7]
    at org.apache.avro.generic.GenericDatumReader.read(GenericDatumReader.java:155) ~[avro-1.7.7.jar:1.7.7]
    at org.apache.avro.generic.GenericDatumReader.readField(GenericDatumReader.java:193) ~[avro-1.7.7.jar:1.7.7]
    at org.apache.avro.generic.GenericDatumReader.readRecord(GenericDatumReader.java:183) ~[avro-1.7.7.jar:1.7.7]
    at org.apache.avro.generic.GenericDatumReader.read(GenericDatumReader.java:151) ~[avro-1.7.7.jar:1.7.7]
    at org.apache.avro.generic.GenericDatumReader.read(GenericDatumReader.java:155) ~[avro-1.7.7.jar:1.7.7]
    at org.apache.avro.generic.GenericDatumReader.readField(GenericDatumReader.java:193) ~[avro-1.7.7.jar:1.7.7]
    at org.apache.avro.generic.GenericDatumReader.readRecord(GenericDatumReader.java:183) ~[avro-1.7.7.jar:1.7.7]
    at org.apache.avro.generic.GenericDatumReader.read(GenericDatumReader.java:151) ~[avro-1.7.7.jar:1.7.7]
    at org.apache.avro.generic.GenericDatumReader.read(GenericDatumReader.java:155) ~[avro-1.7.7.jar:1.7.7]
    at org.apache.avro.generic.GenericDatumReader.read(GenericDatumReader.java:142) ~[avro-1.7.7.jar:1.7.7]
    at com.company.telemetry.services.serde.AvroByteArrDeserializer.deserializeByteArr(AvroByteArrDeserializer.java:32) ~[classes/:na]
    at com.company.telemetry.services.TelemetryService.handleByteArr(TelemetryService.java:59) ~[classes/:na]
    at com.company.telemetry.services.consumer.TelemetryConsumerService.consume(TelemetryConsumerService.java:39) ~[classes/:na]
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.8.0_91]
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[na:1.8.0_91]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_91]
    at java.lang.reflect.Method.invoke(Method.java:498) ~[na:1.8.0_91]
    at org.springframework.messaging.handler.invocation.InvocableHandlerMethod.doInvoke(InvocableHandlerMethod.java:180) ~[spring-messaging-4.3.11.RELEASE.jar:4.3.11.RELEASE]
    at org.springframework.messaging.handler.invocation.InvocableHandlerMethod.invoke(InvocableHandlerMethod.java:112) ~[spring-messaging-4.3.11.RELEASE.jar:4.3.11.RELEASE]
    at org.springframework.kafka.listener.adapter.HandlerAdapter.invoke(HandlerAdapter.java:48) ~[spring-kafka-1.1.6.RELEASE.jar:na]
    at org.springframework.kafka.listener.adapter.MessagingMessageListenerAdapter.invokeHandler(MessagingMessageListenerAdapter.java:174) ~[spring-kafka-1.1.6.RELEASE.jar:na]
    ... 8 common frames omitted
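The frames above pinpoint the failure: ResolvingDecoder.readIndex reads what it believes is a union branch index, and Symbol$Alternative.getSymbol indexes past the end of the branch table. A minimal sketch of the same record-written, union-read mismatch (Apache Avro Java library assumed; the single-field schema is a stand-in for the real one, and the exact exception type may vary by Avro version):

```java
import java.io.ByteArrayOutputStream;
import java.util.Arrays;
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericDatumReader;
import org.apache.avro.generic.GenericDatumWriter;
import org.apache.avro.generic.GenericRecord;
import org.apache.avro.io.BinaryEncoder;
import org.apache.avro.io.Decoder;
import org.apache.avro.io.DecoderFactory;
import org.apache.avro.io.EncoderFactory;

public class UnionMismatchRepro {
    public static void main(String[] args) throws Exception {
        // Writer side: a bare record with no union prefix, as a C++ encoder
        // produces when it encodes the record type directly.
        Schema record = new Schema.Parser().parse(
            "{\"type\":\"record\",\"name\":\"R\",\"fields\":" +
            "[{\"name\":\"s\",\"type\":\"string\"}]}");
        GenericRecord r = new GenericData.Record(record);
        r.put("s", "device1"); // length 7 -> varint 0x0E on the wire
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        BinaryEncoder enc = EncoderFactory.get().binaryEncoder(bos, null);
        new GenericDatumWriter<GenericRecord>(record).write(r, enc);
        enc.flush();

        // Reader side: wrongly treat the schema as a union, as happens when the
        // whole avsc array is handed to the DatumReader.
        Schema union = Schema.createUnion(Arrays.asList(
            Schema.create(Schema.Type.NULL), record));
        Decoder dec = DecoderFactory.get().binaryDecoder(bos.toByteArray(), null);
        try {
            new GenericDatumReader<GenericRecord>(union).read(null, dec);
            System.out.println("decoded (unexpected)");
        } catch (RuntimeException e) {
            // The first byte (0x0E) is consumed as a branch index and
            // zigzag-decodes to 7 - out of range for a two-branch union.
            System.out.println("decode failed: " + e.getClass().getSimpleName());
        }
    }
}
```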