Apache Kafka: StringDeserializer is not an instance of Deserializer


In my simple application I am trying to instantiate a KafkaConsumer; my code is nearly a copy of the "Automatic Offset Committing" example:

@Slf4j
public class MyKafkaConsumer {
    public MyKafkaConsumer() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("group.id", "test");
        props.put("enable.auto.commit", "true");
        props.put("auto.commit.interval.ms", "1000");
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props);
        consumer.subscribe(Arrays.asList("mytopic"));
        while (true) {
            ConsumerRecords<String, String> records = consumer.poll(100);
            for (ConsumerRecord<String, String> record : records)
                log.info(record.offset() + record.key() + record.value());
                //System.out.printf("offset = %d, key = %s, value = %s%n", record.offset(), record.key(), record.value());
        }
    }
}
If I try to instantiate it, I get:

org.apache.kafka.common.KafkaException: Failed to construct kafka consumer
        at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:781)
        at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:635)
        at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:617)
        at ...MyKafkaConsumer.<init>(SikomKafkaConsumer.java:23)
    ...
    Caused by: org.apache.kafka.common.KafkaException: org.apache.kafka.common.serialization.StringDeserializer is not an instance of org.apache.kafka.common.serialization.Deserializer
        at org.apache.kafka.common.config.AbstractConfig.getConfiguredInstance(AbstractConfig.java:248)
        at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:680)
        ... 48 more

How can I fix this?

Not sure whether this is ultimately the cause of the error, but note that when using spring-kafka-test (version 2.1.x, starting with version 2.1.5) with the 1.1.x kafka-clients jar, you need to override certain transitive dependencies as follows:

<dependency>
    <groupId>org.springframework.kafka</groupId>
    <artifactId>spring-kafka</artifactId>
    <version>${spring.kafka.version}</version>
</dependency>

<dependency>
    <groupId>org.springframework.kafka</groupId>
    <artifactId>spring-kafka-test</artifactId>
    <version>${spring.kafka.version}</version>
    <scope>test</scope>
</dependency>

<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka-clients</artifactId>
    <version>1.1.1</version>
</dependency>

<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka-clients</artifactId>
    <version>1.1.1</version>
    <classifier>test</classifier>
</dependency>

<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka_2.11</artifactId>
    <version>1.1.1</version>
    <scope>test</scope>
</dependency>

<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka_2.11</artifactId>
    <version>1.1.1</version>
    <classifier>test</classifier>
    <scope>test</scope>
</dependency>


So there is definitely something wrong with your transitive dependencies.
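One way to confirm that kind of classpath conflict, using only the JDK, is to count how many jars or directories on the classpath provide a given class file; a count above one means two copies of the class can be loaded, which is exactly what makes "StringDeserializer is not an instance of Deserializer" possible. This is a minimal sketch, not from the original post; the class and method names are illustrative:

```java
import java.net.URL;
import java.util.Collections;
import java.util.List;

public class DuplicateClassCheck {
    // Counts how many classpath entries contain the .class file for the given
    // binary name. More than one hit means two jars ship the same class.
    public static int copiesOnClasspath(String binaryName) throws Exception {
        String resource = binaryName.replace('.', '/') + ".class";
        List<URL> hits = Collections.list(
                DuplicateClassCheck.class.getClassLoader().getResources(resource));
        return hits.size();
    }

    public static void main(String[] args) throws Exception {
        // This class itself resolves exactly once from the compile output
        System.out.println(copiesOnClasspath("DuplicateClassCheck"));
    }
}
```

On a real project the name to probe would be org.apache.kafka.common.serialization.Deserializer; `mvn dependency:tree` gives the same answer from the build side.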

This may be a Kafka classloading problem. Setting the context classloader to null may help:

...
Thread currentThread = Thread.currentThread();    
ClassLoader savedClassLoader = currentThread.getContextClassLoader();

currentThread.setContextClassLoader(null);
KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props);

currentThread.setContextClassLoader(savedClassLoader);
...
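Why would a null context classloader help? Kafka resolves the configured deserializer class through the thread's context classloader when one is set, falling back to the loader that loaded kafka-clients itself when it is null. If those two loaders each define the class, the JVM holds two distinct Class objects for the same name and instanceof fails between them. The stand-alone sketch below reproduces that effect with two sibling URLClassLoaders (ClassLoaderDemo and Payload are illustrative names, not Kafka API):

```java
import java.net.URL;
import java.net.URLClassLoader;

public class ClassLoaderDemo {
    // A stand-in for StringDeserializer: any class visible on the classpath.
    public static class Payload {
        public Payload() {}
    }

    // Loads Payload through two unrelated classloaders and checks whether an
    // instance from one loader is an instance of the other loader's type.
    public static boolean sameClassAcrossLoaders() throws Exception {
        URL classes = ClassLoaderDemo.class
                .getProtectionDomain().getCodeSource().getLocation();
        // parent = null bypasses the application classloader, so each loader
        // defines its own copy of Payload instead of delegating upward
        try (URLClassLoader a = new URLClassLoader(new URL[]{classes}, null);
             URLClassLoader b = new URLClassLoader(new URL[]{classes}, null)) {
            Object fromA = a.loadClass("ClassLoaderDemo$Payload")
                            .getDeclaredConstructor().newInstance();
            Class<?> typeFromB = b.loadClass("ClassLoaderDemo$Payload");
            return typeFromB.isInstance(fromA);
        }
    }

    public static void main(String[] args) throws Exception {
        // Same fully qualified name, different defining loaders:
        // the instanceof check fails, just like in the Kafka error.
        System.out.println(sameClassAcrossLoaders());
    }
}
```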
The full explanation:

Your custom class needs to implement org.apache.kafka.common.serialization.Deserializer


How did you solve it? I am not sure; the code running now is the same as in my question, so it was probably a dependency problem: I had 'org.apache.kafka:kafka-clients:1.0.0'. Thanks. For me it was also a transitive-dependency problem. You can add this as an answer and I will accept it.
import org.apache.kafka.common.header.Headers;
import org.apache.kafka.common.serialization.Deserializer;
import org.apache.kafka.common.serialization.Serializer;
import org.codehaus.jackson.map.ObjectMapper;

import java.io.Serializable;

//Developed by Arun Singh
public class Employee implements Serializable, Serializer<Employee>, Deserializer<Employee> {

    private final ObjectMapper mapper = new ObjectMapper();

    @Override
    public byte[] serialize(String topic, Employee employee) {
        try {
            return mapper.writeValueAsBytes(employee);
        } catch (Exception e) {
            e.printStackTrace();
            return null;
        }
    }

    @Override
    public Employee deserialize(String topic, byte[] bytes) {
        try {
            // Pass the raw byte[] to Jackson; bytes.toString() would hand it
            // the array's identity string, not the JSON payload.
            return mapper.readValue(bytes, Employee.class);
        } catch (Exception e) {
            e.printStackTrace();
            return null;
        }
    }

    @Override
    public Employee deserialize(String topic, Headers headers, byte[] data) {
        return deserialize(topic, data);
    }

    @Override
    public void close() {

    }
}
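A custom deserializer like the Employee class above is then wired into the consumer through the same config keys the question already uses, by fully qualified class name. A minimal sketch; the com.example package is hypothetical, adjust it to wherever Employee actually lives:

```java
import java.util.Properties;

public class ConsumerConfigSketch {
    // Builds consumer properties that point value deserialization at the
    // custom Employee class by its fully qualified name.
    public static Properties employeeConsumerProps() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("group.id", "test");
        props.put("key.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");
        // hypothetical package: must match the Employee class on the classpath
        props.put("value.deserializer", "com.example.Employee");
        return props;
    }

    public static void main(String[] args) {
        System.out.println(
                employeeConsumerProps().getProperty("value.deserializer"));
    }
}
```

Kafka instantiates the named class reflectively at consumer construction time, which is why a class that does not implement Deserializer (or a copy loaded by the wrong classloader) fails right inside `new KafkaConsumer<>(props)`.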