Apache Kafka: How to overcome kafka.consumer.ConsumerTimeoutException? (apache-kafka, kafka-consumer-api, kafka-producer-api)


I am using Kafka version 2.11 to write a consumer. I keep getting a timeout exception, and I am not sure I am using the right API here.

Can someone help me?

Executor:

import java.io.File;
import java.io.FileNotFoundException;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.Properties;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

import kafka.consumer.Consumer;
import kafka.consumer.ConsumerConfig;
import kafka.consumer.KafkaStream;
import kafka.javaapi.consumer.ConsumerConnector;

public class MessageListener {
    private Properties properties;
    private ConsumerConnector consumerConnector;
    private String topic;
    private ExecutorService executor;

    public MessageListener(String topic) {
        this.topic = topic;

        KafkaConfigurationLoader confLoader = new KafkaConfigurationLoader();
        try {
            properties = confLoader.loadConsumerConfig();
            ConsumerConfig consumerConfig = new ConsumerConfig(properties);
            consumerConnector = Consumer.createJavaConsumerConnector(consumerConfig);
        } catch (FileNotFoundException e) {
            e.printStackTrace();
        }
    }

    public void start(File file) {
        Map<String, Integer> topicCountMap = new HashMap<>();
        topicCountMap.put(topic, Integer.valueOf(CoreConstants.THREAD_SIZE));

        Map<String, List<KafkaStream<byte[], byte[]>>> consumerMap = consumerConnector
                .createMessageStreams(topicCountMap);
        List<KafkaStream<byte[], byte[]>> streams = consumerMap.get(topic);
        executor = Executors.newFixedThreadPool(CoreConstants.THREAD_SIZE);

        for (KafkaStream<byte[], byte[]> stream : streams) {
            executor.submit(new ListenerThread(stream));
        }
    }
}
Thread:

import kafka.consumer.ConsumerIterator;
import kafka.consumer.ConsumerTimeoutException;
import kafka.consumer.KafkaStream;
import kafka.message.MessageAndMetadata;

public class ListenerThread implements Runnable {
    private KafkaStream<byte[], byte[]> stream;

    public ListenerThread(KafkaStream<byte[], byte[]> msgStream) {
        this.stream = msgStream;
    }

    @Override
    public void run() {
        try {
            ConsumerIterator<byte[], byte[]> it = stream.iterator();

            while (it.hasNext()) {
                // use it.next() here, not it.makeNext(): makeNext() is an
                // internal method of the iterator template and does not
                // block/advance the way next() does
                MessageAndMetadata<byte[], byte[]> messageAndMetadata = it.next();
                String topic = messageAndMetadata.topic();
                byte[] message = messageAndMetadata.message();
                System.out.println("111111111111111111111111111");
                FileProcessor processor = new FileProcessor();
                processor.processFile(topic, message);
            }
        } catch (ConsumerTimeoutException cte) {
            System.out.println("Consumer timed out");
        } catch (Exception ex) {
            ex.printStackTrace();
        }
    }
}

If you do not want this exception to be raised, you can set
consumer.timeout.ms=-1
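
The old high-level consumer reads this setting from the same properties loaded by loadConsumerConfig(). A minimal sketch of such a consumer properties file (the broker/ZooKeeper address and group id are placeholder assumptions):

```properties
# minimal config for the old high-level consumer (kafka.consumer API)
zookeeper.connect=localhost:2181
group.id=my-consumer-group
# -1 (default): block indefinitely when no message is available;
# a positive value (in ms): throw ConsumerTimeoutException after that wait
consumer.timeout.ms=-1
```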

Comments:

2.11 is actually the Scala version the Kafka build is compatible with. Perhaps you are using Kafka 0.9.x? – nautilus

@nautilus Oh.. I am using 2.11 with Java.. is that wrong? What is the latest version for Java?

You can check Kafka's versions here. If you are using 2.11, I would say that is not a Kafka version number.

This does not answer the question. To comment or ask the author for clarification, leave a comment below their post. – The question is "how to overcome kafka.consumer.ConsumerTimeoutException?" Setting that property to -1 is one way to overcome it.

@nautilus What does it mean if we do not provide the "consumer.timeout.ms" property?

@Ratha By default the value of consumer.timeout.ms is -1, and the consumer blocks indefinitely if no new message is available for consumption. By setting the value to a positive integer, a timeout exception is thrown to the consumer if no message is available for consumption after the specified timeout value. You can find the configuration in this documentation. Note: choose the appropriate Kafka version for the documentation. – Jaya Ananthram

@JayaAnanthram So, if we set that property to 3 seconds, and after getting the exception a new message becomes available, will the consumer be able to poll those new messages?
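
The timeout semantics can be illustrated without Kafka at all: a positive consumer.timeout.ms behaves like a blocking poll with a deadline, and the consumer remains usable afterward, so catching the exception and iterating again picks up messages that arrive later. A plain-Java analogy sketch (class and method names are illustrative; a BlockingQueue returns null on timeout where the Kafka iterator would throw ConsumerTimeoutException):

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.TimeUnit;

public class TimeoutAnalogy {
    // Analogy for consumer.timeout.ms: wait at most timeoutMs for a message.
    // Returns the message, or null when the deadline elapses (where the
    // Kafka high-level consumer would throw ConsumerTimeoutException).
    static String pollWithTimeout(BlockingQueue<String> queue, long timeoutMs)
            throws InterruptedException {
        return queue.poll(timeoutMs, TimeUnit.MILLISECONDS);
    }

    public static void main(String[] args) throws InterruptedException {
        BlockingQueue<String> queue = new ArrayBlockingQueue<>(10);
        queue.put("msg-1");

        // A message is available: it is returned immediately.
        System.out.println(pollWithTimeout(queue, 100)); // msg-1

        // No message available: the 100 ms "timeout" fires (null here).
        System.out.println(pollWithTimeout(queue, 100)); // null

        // A message arriving after the timeout can still be consumed:
        // the queue, like the consumer after catching the exception,
        // stays usable for subsequent polls.
        queue.put("msg-2");
        System.out.println(pollWithTimeout(queue, 100)); // msg-2
    }
}
```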