Java SerializationException when connecting to the AVRO schema registry


I have 4 consumers. Three of them are on version 0.10.0.0 of the Kafka client, but one has been moved to version 2.0.0.

When I call RestService.getId to fetch the version of my AVRO schema, it succeeds on the three consumers on the earlier version, but fails on the 2.0.0 consumer with this stack trace:

org.apache.kafka.common.errors.SerializationException: Error deserializing Avro message for id 18
Caused by: java.net.SocketException: Connection reset
    at java.net.SocketInputStream.read(SocketInputStream.java:210)
    at java.net.SocketInputStream.read(SocketInputStream.java:141)
    at sun.security.ssl.InputRecord.readFully(InputRecord.java:465)
    at sun.security.ssl.InputRecord.read(InputRecord.java:503)
    at sun.security.ssl.SSLSocketImpl.readRecord(SSLSocketImpl.java:983)
    at sun.security.ssl.SSLSocketImpl.performInitialHandshake(SSLSocketImpl.java:1385)
    at sun.security.ssl.SSLSocketImpl.startHandshake(SSLSocketImpl.java:1413)
    at sun.security.ssl.SSLSocketImpl.startHandshake(SSLSocketImpl.java:1397)
    at sun.net.www.protocol.https.HttpsClient.afterConnect(HttpsClient.java:559)
    at sun.net.www.protocol.https.AbstractDelegateHttpsURLConnection.connect(AbstractDelegateHttpsURLConnection.java:185)
    at sun.net.www.protocol.http.HttpURLConnection.getInputStream0(HttpURLConnection.java:1564)
    at sun.net.www.protocol.http.HttpURLConnection.getInputStream(HttpURLConnection.java:1492)
    at java.net.HttpURLConnection.getResponseCode(HttpURLConnection.java:480)
    at sun.net.www.protocol.https.HttpsURLConnectionImpl.getResponseCode(HttpsURLConnectionImpl.java:347)
    at io.confluent.kafka.schemaregistry.client.rest.RestService.sendHttpRequest(RestService.java:185)
    at io.confluent.kafka.schemaregistry.client.rest.RestService.httpRequest(RestService.java:229)
    at io.confluent.kafka.schemaregistry.client.rest.RestService.getId(RestService.java:409)
    at io.confluent.kafka.schemaregistry.client.rest.RestService.getId(RestService.java:402)
    at io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.getSchemaByIdFromRegistry(CachedSchemaRegistryClient.java:119)
    at io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.getBySubjectAndId(CachedSchemaRegistryClient.java:192)
    at io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.getById(CachedSchemaRegistryClient.java:168)
    at io.confluent.kafka.serializers.AbstractKafkaAvroDeserializer.deserialize(AbstractKafkaAvroDeserializer.java:121)
    at io.confluent.kafka.serializers.AbstractKafkaAvroDeserializer.deserialize(AbstractKafkaAvroDeserializer.java:104)
    at io.confluent.kafka.serializers.KafkaAvroDeserializer.deserialize(KafkaAvroDeserializer.java:62)
    at com.ciscospark.retention.kafkapurgelibrary.avro.deserialize.CompatibleAvroDeserializer.deserialize(CompatibleAvroDeserializer.java:48)
    at com.ciscospark.retention.kafkapurgelibrary.avro.deserialize.CompatibleAvroDeserializer.deserialize(CompatibleAvroDeserializer.java:19)
    at com.cisco.wx2.kafka.serialization.SparkKafkaDeserializer.deserialize(SparkKafkaDeserializer.java:34)
    at com.ciscospark.retention.kafkapurgelibrary.PurgeEventConsumerFactory.lambda$new$1(PurgeEventConsumerFactory.java:80)
    at com.cisco.wx2.kafka.serialization.SimpleKafkaDeserializer.deserialize(SimpleKafkaDeserializer.java:22)
    at org.apache.kafka.common.serialization.ExtendedDeserializer$Wrapper.deserialize(ExtendedDeserializer.java:65)
    at org.apache.kafka.common.serialization.ExtendedDeserializer$Wrapper.deserialize(ExtendedDeserializer.java:55)
    at org.apache.kafka.clients.consumer.internals.Fetcher.parseRecord(Fetcher.java:1009)
    at org.apache.kafka.clients.consumer.internals.Fetcher.access$3400(Fetcher.java:96)
    at org.apache.kafka.clients.consumer.internals.Fetcher$PartitionRecords.fetchRecords(Fetcher.java:1186)
    at org.apache.kafka.clients.consumer.internals.Fetcher$PartitionRecords.access$1500(Fetcher.java:1035)
    at org.apache.kafka.clients.consumer.internals.Fetcher.fetchRecords(Fetcher.java:544)
    at org.apache.kafka.clients.consumer.internals.Fetcher.fetchedRecords(Fetcher.java:505)
    at org.apache.kafka.clients.consumer.KafkaConsumer.pollForFetches(KafkaConsumer.java:1259)
    at org.apache.kafka.clients.consumer.KafkaConsumer.poll(KafkaConsumer.java:1187)
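Since the failure happens during the TLS handshake (the `Connection reset` occurs inside `SSLSocketImpl.performInitialHandshake`), a first diagnostic step is to check which protocol versions the consumer's JVM offers by default. A minimal, Kafka-independent sketch (the class name `ProtocolCheck` is hypothetical):

```java
import javax.net.ssl.SSLContext;
import java.util.Arrays;

public class ProtocolCheck {
    public static void main(String[] args) throws Exception {
        // The default SSLContext reflects the JVM's TLS configuration.
        SSLContext ctx = SSLContext.getDefault();
        // Protocol versions the JVM will offer in its ClientHello by default.
        String[] enabled = ctx.getDefaultSSLParameters().getProtocols();
        System.out.println(Arrays.toString(enabled));
    }
}
```

If the list printed on the 2.0.0 consumer differs from the registry's expectations (for example, only TLSv1 when the server requires TLSv1.2), the handshake will be rejected.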
Below is my code that creates the connection to the schema registry. It is used with URLs that start with https.

    private RestService getSchemaServiceRestService() {
        String avroSchemaRegistryUrl = this.getAvroSchemaRegistryUrl();
        RestService restService = new RestService(avroSchemaRegistryUrl);
        log.info("Avro schema registry URL {}", avroSchemaRegistryUrl);

        if (avroSchemaRegistryUrl.startsWith("https")) {
            SSLContext sslContext = null;
            try {
                sslContext = SSLContextBuilder.create()
                        .loadKeyMaterial(AvroSerializer.class.getClassLoader().getResource("avroSchemaRegistryClient.jks"),
                                avroSchemaRegistryKeyPass.toCharArray(),
                                avroSchemaRegistryKeyPass.toCharArray())
                        .loadTrustMaterial(AvroSerializer.class.getClassLoader().getResource("avroSchemaRegistryServer.jks"),
                                avroSchemaRegistryKeyPass.toCharArray())
                        .build();
            } catch (Exception e) {
                log.error("Exception when creating sslContext for schema registry client");
                throw new RuntimeException("Exception when creating sslContext for schema registry client.", e);
            }

            SSLSocketFactory factory = sslContext.getSocketFactory();
            restService.setSslSocketFactory(factory);
            log.info("Configured SSL for schema registry client");
        }

        return restService;
    }
This function succeeds, but the first time I call RestService.getId I get the exception shown above.


Does anyone know how to get my 2.0.0 consumer working?

If it helps, I can see this in the avro schema registry log:

    2019/11/05 16:37:54 [info] 17529#0: *17966136 SSL_do_handshake() failed (SSL: error:140760FC:SSL routines:SSL23_GET_CLIENT_HELLO:unknown protocol) while SSL handshaking, client: 52.14.5.177, server: 0.0.0.0:8082

I don't quite understand why you need to write your own deserializer. You probably need to upgrade the client, since I believe Confluent 5.0 is meant for Kafka 2.0. As I mentioned, I would make sure you are using a matching client version based on the Confluent release notes, and that the client version matches the server version.

One version of our service had TLSv1 hard-coded into its startup configuration, which was incompatible with our schema registry version, which expected TLSv1.2. We removed the hard-coded TLSv1, redeployed, and it now works. Thanks, cricket.