Maven: reading data from the Kafka console producer into a Flink program
I created a topic named test and wrote some strings to it from the console using the console producer:

./bin/kafka-console-producer.sh --topic test --broker-list localhost:9092

Fortunately, I was able to read the produced data back in the console using the console consumer. Now I would like to consume the output produced by the console producer from a Flink program, using the code below:
import java.util.Properties;

import org.apache.flink.api.common.functions.MapFunction;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer08;
import org.apache.flink.streaming.util.serialization.SimpleStringSchema;

public class ReadFromKafka {

    public static void main(String[] args) throws Exception {
        // create execution environment
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        Properties properties = new Properties();
        properties.setProperty("bootstrap.servers", "localhost:9092");
        properties.setProperty("zookeeper.connect", "localhost:2181");
        properties.setProperty("group.id", "test");

        DataStream<String> message = env.addSource(
                new FlinkKafkaConsumer08<String>("test", new SimpleStringSchema(), properties));

        message.map(new MapFunction<String, String>() {
            private static final long serialVersionUID = -6867736771747690202L;

            @Override
            public String map(String value) throws Exception {
                return " Value: " + value;
            }
        }).print();

        env.execute();
    } // main
} // ReadFromKafka
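As an aside, the consumer configuration can be assembled and sanity-checked without a running cluster. A minimal sketch (the addresses are the ones from the program above; the class and method names are my own, for illustration only):

```java
import java.util.Properties;

public class ConsumerConfigCheck {

    // Build the same consumer configuration the job above passes to the Kafka source.
    static Properties consumerProps() {
        Properties p = new Properties();
        p.setProperty("bootstrap.servers", "localhost:9092"); // Kafka broker
        p.setProperty("zookeeper.connect", "localhost:2181"); // used by the 0.8 consumer only
        p.setProperty("group.id", "test");                    // consumer group
        return p;
    }

    public static void main(String[] args) {
        System.out.println(consumerProps().getProperty("bootstrap.servers")); // localhost:9092
    }
}
```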
Regarding the Kafka version, the output of the find command below

find ./libs/ -name \*kafka_\* | head -1 | grep -o '\kafka[^\n]*'

is kafka_2.11-0.9.0.0-javadoc.jar

Do I need to use a 0.8.x version of Kafka to run my example?
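The jar name reported by find encodes two versions in the form kafka_&lt;scala-version&gt;-&lt;kafka-version&gt;, so kafka_2.11-0.9.0.0-javadoc.jar means Kafka 0.9.0.0 built for Scala 2.11. A small, hypothetical helper that pulls the Kafka version out of such a file name (the class and regex are my own, not part of Kafka's tooling):

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class KafkaJarVersion {

    // Parses kafka_<scalaVersion>-<kafkaVersion>[-classifier].jar,
    // e.g. kafka_2.11-0.9.0.0-javadoc.jar -> "0.9.0.0"
    static String kafkaVersion(String jarName) {
        Matcher m = Pattern.compile("kafka_[0-9.]+-([0-9.]+[0-9])").matcher(jarName);
        return m.find() ? m.group(1) : null;
    }

    public static void main(String[] args) {
        System.out.println(kafkaVersion("kafka_2.11-0.9.0.0-javadoc.jar")); // 0.9.0.0
    }
}
```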
Comments and suggestions are highly appreciated. Thanks in advance.
Have a nice day!
My program started working after I made the following changes. I updated the Kafka connector version to 0.9.x:
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-connector-kafka-0.9_2.10</artifactId>
<version>${flink.version}</version>
</dependency>
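With the 0.9 connector on the classpath, the source in the program above would be constructed with FlinkKafkaConsumer09 instead of FlinkKafkaConsumer08. A fragment showing the change (not compilable on its own; it assumes the dependency above and the surrounding program):

```java
// With flink-connector-kafka-0.9 the consumer class changes accordingly;
// the 0.9 consumer reads brokers from bootstrap.servers and no longer
// requires the zookeeper.connect property.
DataStream<String> message = env.addSource(
        new FlinkKafkaConsumer09<String>("test", new SimpleStringSchema(), properties));
```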
and I upgraded the Flink version from 1.0.1 to 1.1.2, as shown below:
<properties>
<!-- <flink.version>1.0.1</flink.version>-->
<flink.version>1.1.2</flink.version>
<slf4j.version>1.7.7</slf4j.version>
<log4j.version>1.2.17</log4j.version>
</properties>
<dependencies>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-log4j12</artifactId>
<version>${slf4j.version}</version>
</dependency>
<dependency>
<groupId>log4j</groupId>
<artifactId>log4j</artifactId>
<version>${log4j.version}</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-streaming-java_2.10</artifactId>
<version>${flink.version}</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-java</artifactId>
<version>${flink.version}</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-clients_2.10</artifactId>
<version>${flink.version}</version>
</dependency>
If this solved your problem, could you mark it as the accepted answer?