Java Kafka producer cannot send data to the server
Here is my code. I am able to create the topic, but for some reason I cannot send data into it. After a long wait I get these errors. I am using Kafka version 2.11-0.8.2.1.
org.apache.kafka.clients.producer.KafkaProducer$FutureFailure@5474c6c
org.apache.kafka.clients.producer.KafkaProducer$FutureFailure@4b6995df
Here is Kafka's server.log file:
[2016-12-27 21:05:54,873] ERROR Closing socket for /127.0.0.1 because of error (kafka.network.Processor)
java.io.IOException: An established connection was aborted by the software in your host machine
at sun.nio.ch.SocketDispatcher.read0(Native Method)
at sun.nio.ch.SocketDispatcher.read(Unknown Source)
at sun.nio.ch.IOUtil.readIntoNativeBuffer(Unknown Source)
at sun.nio.ch.IOUtil.read(Unknown Source)
at sun.nio.ch.SocketChannelImpl.read(Unknown Source)
at kafka.utils.Utils$.read(Utils.scala:380)
at kafka.network.BoundedByteBufferReceive.readFrom(BoundedByteBufferReceive.scala:54)
at kafka.network.Processor.read(SocketServer.scala:444)
at kafka.network.Processor.run(SocketServer.scala:340)
at java.lang.Thread.run(Unknown Source)
[2016-12-27 21:07:54,727] INFO Closing socket connection to /127.0.0.1. (kafka.network.Processor)
[2016-12-27 21:16:08,559] INFO Closing socket connection to /127.0.0.1. (kafka.network.Processor)
Here is my Java code that sends integers to the Kafka system:
Properties props = new Properties();
props.put("bootstrap.servers", "localhost:9092");
props.put("acks", "all");
props.put("retries", 0);
props.put("batch.size", 16384);
props.put("linger.ms", 1);
props.put("buffer.memory", 33554432);
props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
props.put("timeout.ms", "50");
Producer<String, String> producer = new KafkaProducer<>(props);
for (int i = 0; i < 2; i++) {
    System.out.println(producer.send(new ProducerRecord<String, String>("testtopic",
            Integer.toString(i), Integer.toString(i))).toString());
}
producer.close();
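The `FutureFailure@...` lines in the output are just the default `toString()` of the `Future` that `send()` returns; the actual cause of the failure only surfaces when `get()` is called on it. A minimal, Kafka-free sketch of the same pitfall (the `FailedFuture` class here is illustrative, standing in for the producer's internal failed future, not part of the Kafka API):

```java
import java.util.concurrent.ExecutionException;
import java.util.concurrent.Future;
import java.util.concurrent.TimeUnit;

public class FuturePitfall {
    // Stand-in for a failed send(): a Future that is already complete and
    // whose get() rethrows the stored failure.
    static final class FailedFuture implements Future<String> {
        private final Exception cause;
        FailedFuture(Exception cause) { this.cause = cause; }
        public boolean cancel(boolean mayInterrupt) { return false; }
        public boolean isCancelled() { return false; }
        public boolean isDone() { return true; }
        public String get() throws ExecutionException {
            throw new ExecutionException(cause);
        }
        public String get(long timeout, TimeUnit unit) throws ExecutionException {
            throw new ExecutionException(cause);
        }
    }

    public static void main(String[] args) {
        Future<String> result = new FailedFuture(new RuntimeException("broker unreachable"));

        // toString() only shows class name and identity hash -- this is what
        // output like FutureFailure@5474c6c in the question corresponds to.
        System.out.println(result);

        // get() is what actually surfaces the error behind the send.
        try {
            result.get();
        } catch (ExecutionException e) {
            System.out.println("send failed: " + e.getCause().getMessage());
        }
    }
}
```

So instead of printing the result of `send()`, blocking on it with `get()` (or passing a callback) would have shown the underlying exception directly.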
Here is the pom.xml:
<dependencies>
<dependency>
<groupId>org.apache.kafka</groupId>
<artifactId>kafka-clients</artifactId>
<version>0.10.1.0</version>
</dependency>
<dependency>
<groupId>org.apache.kafka</groupId>
<artifactId>kafka_2.11</artifactId>
<version>0.8.2.1</version>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-simple</artifactId>
<version>1.6.4</version>
</dependency>
<dependency>
<groupId>log4j</groupId>
<artifactId>log4j</artifactId>
<version>1.2.16</version>
<exclusions>
<exclusion>
<groupId>javax.jms</groupId>
<artifactId>jms</artifactId>
</exclusion>
</exclusions>
</dependency>
</dependencies>
Everything looks fine except
props.put("timeout.ms", "50");
The request timeout should be greater than the default poll interval, which in Kafka is 5 minutes. So I guess that if you leave it at the default (just above 5 minutes), it should work.
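With that line removed, the producer configuration from the question would look like this (a sketch of the configuration only; the request timeout is simply left at its default rather than set explicitly):

```java
import java.util.Properties;

public class ProducerConfigSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("acks", "all");
        props.put("retries", 0);
        props.put("batch.size", 16384);
        props.put("linger.ms", 1);
        props.put("buffer.memory", 33554432);
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        // No "timeout.ms" entry: the request timeout stays at its default,
        // which is well above the 50 ms the question had set.
        System.out.println(props.containsKey("timeout.ms"));
    }
}
```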
I downgraded the Kafka version to kafka_2.10-0.9.0.0, and the following properties worked with it:
Properties props = new Properties();
props.put("metadata.broker.list", "localhost:9092");
props.put("acks", "all");
props.put("retries", 0);
props.put("batch.size", 16384);
props.put("linger.ms", 1);
props.put("buffer.memory", 33554432);
props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
props.put("serializer.class", "kafka.serializer.StringEncoder");
ProducerConfig producerConfig = new ProducerConfig(props);
kafka.javaapi.producer.Producer<String, String> producer
= new kafka.javaapi.producer.Producer<String, String>(producerConfig);
My pom.xml file looks like this:
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>TwitterKafkaPostgre</groupId>
<artifactId>TwitterKafkaPostgre</artifactId>
<version>0.0.1-SNAPSHOT</version>
<dependencies>
<dependency>
<groupId>com.twitter</groupId>
<artifactId>hbc-core</artifactId> <!-- or hbc-twitter4j -->
<version>2.2.0</version> <!-- or whatever the latest version is -->
</dependency>
<dependency>
<groupId>org.apache.kafka</groupId>
<artifactId>kafka-clients</artifactId>
<version>0.9.0.0</version>
</dependency>
<dependency>
<groupId>org.apache.kafka</groupId>
<artifactId>kafka_2.11</artifactId>
<version>0.9.0.0</version>
</dependency>
<dependency>
<groupId>log4j</groupId>
<artifactId>log4j</artifactId>
<version>1.2.16</version>
<exclusions>
<exclusion>
<groupId>javax.jms</groupId>
<artifactId>jms</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-simple</artifactId>
<version>1.6.4</version>
</dependency>
<dependency>
<groupId>com.google.guava</groupId>
<artifactId>guava</artifactId>
<version>18.0</version>
</dependency>
</dependencies>
</project>
I tried commenting that line out, but I still have the same problem. There may be an error in the pom.xml; the dependencies I am using are listed above, and I am on the kafka_2.11-0.8.2.1 version.
Try sending the data without the System.out.println(); and let me know what happens.