Apache Flink: Kafka-Flink connection error shows NoSuchMethodError


When I changed from FlinkKafkaConsumer09 to FlinkKafkaConsumer, a new error came up. Flink code:

import org.apache.flink.api.common.functions.MapFunction;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer;
import org.apache.flink.streaming.util.serialization.SimpleStringSchema;
import java.util.Properties;

@SuppressWarnings("deprecation")
public class ReadFromKafka {


  public static void main(String[] args) throws Exception {
    // create execution environment
    StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

    Properties properties = new Properties();
    properties.setProperty("bootstrap.servers", "localhost:9092");
    properties.setProperty("group.id", "test-consumer-group");


    DataStream<String> stream = env
            .addSource(new FlinkKafkaConsumer<String>("test4", new SimpleStringSchema(), properties));

    stream.map(new MapFunction<String, String>() {
      private static final long serialVersionUID = -6867736771747690202L;

      @Override
      public String map(String value) throws Exception {
        return "Stream Value: " + value;
      }
    }).print();

    env.execute();
  }


}
Error:

log4j:WARN No appenders could be found for logger (org.apache.flink.api.java.ClosureCleaner).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Exception in thread "main" org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
    at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:146)
    at org.apache.flink.runtime.minicluster.MiniCluster.executeJobBlocking(MiniCluster.java:626)
    at org.apache.flink.streaming.api.environment.LocalStreamEnvironment.execute(LocalStreamEnvironment.java:117)
    at org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.execute(StreamExecutionEnvironment.java:1507)
    at org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.execute(StreamExecutionEnvironment.java:1489)
    at ReadFromKafka.main(ReadFromKafka.java:33)
Caused by: org.apache.kafka.common.errors.TimeoutException: Timeout expired while fetching topic metadata

pom.xml

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.dataartisans</groupId>
  <artifactId>kafka-example</artifactId>
  <version>0.0.1-SNAPSHOT</version>
  <name>kafkaex</name>
  <description>this is flink kafka example</description>
  <dependencies>
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-java</artifactId>
        <version>1.9.1</version>
    </dependency>
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-streaming-java_2.12</artifactId>
        <version>1.9.1</version>
    </dependency>
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-clients_2.12</artifactId>
        <version>1.9.1</version>
    </dependency>
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-connector-kafka_2.12</artifactId>
        <version>1.9.1</version>
    </dependency>
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-core</artifactId>
        <version>1.9.1</version>
    </dependency>

    <dependency>
        <groupId>com.googlecode.json-simple</groupId>
        <artifactId>json-simple</artifactId>
        <version>1.1</version>
    </dependency>  
</dependencies>
</project>

flink-connector-kafka_2.12 and FlinkKafkaConsumer09 are incompatible.

flink-connector-kafka_2.12 is the "universal" Kafka connector, compiled for use with Scala 2.12. This universal connector can be used with any version of Kafka from 0.11.0 onwards.

FlinkKafkaConsumer09 is for use with Kafka 0.9.x. If your Kafka broker is running Kafka 0.9.x, then you need flink-connector-kafka-0.9_2.11 or flink-connector-kafka-0.9_2.12, depending on which Scala version you want.

If, on the other hand, your Kafka broker is running a recent version of Kafka (0.11.0 or newer), then stick with flink-connector-kafka_2.12 and use FlinkKafkaConsumer rather than FlinkKafkaConsumer09.

See the Flink documentation for more info.
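To make the distinction concrete, here is a minimal sketch of the same job written against the universal connector's FlinkKafkaConsumer on Flink 1.9.1. The topic, broker address, and group id are the ones from the question; the class name and the setStartFromEarliest() call are illustrative choices, not something from the thread. Note it also uses the non-deprecated SimpleStringSchema from org.apache.flink.api.common.serialization, so the @SuppressWarnings("deprecation") is no longer needed.

import java.util.Properties;
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer;

public class UniversalConsumerExample {

  public static void main(String[] args) throws Exception {
    StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

    Properties properties = new Properties();
    properties.setProperty("bootstrap.servers", "localhost:9092");
    properties.setProperty("group.id", "test-consumer-group");

    // FlinkKafkaConsumer (no version suffix) is the class shipped with the
    // universal connector; FlinkKafkaConsumer09 lives in the separate 0.9
    // connector jar, which is why mixing the two fails at runtime.
    FlinkKafkaConsumer<String> consumer =
        new FlinkKafkaConsumer<>("test4", new SimpleStringSchema(), properties);
    consumer.setStartFromEarliest(); // optional: read the topic from the beginning

    env.addSource(consumer).print();
    env.execute("universal-connector-example");
  }
}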

Comments:

After changing to FlinkKafkaConsumer, another error came up.

I'm not sure what's wrong, but Flink doesn't like jobs that don't produce any output. Try adding a sink, e.g. stream.print().

That's not the problem; I actually trimmed the code here, and I did add the print() call in Eclipse. I don't know what's going wrong.

Please share more info: imports, dependencies, and the complete error report (some details are cropped off on the right).

Added the pom and the error above. (Kafka 2.12, Flink 1.9.1.)
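Since the final stack trace ends in org.apache.kafka.common.errors.TimeoutException while fetching topic metadata, one way to rule Flink out is to try the same metadata fetch with the plain Kafka consumer client. This is a rough sketch: it assumes the kafka-clients jar (pulled in transitively by flink-connector-kafka) is on the classpath, reuses the broker address from the question, and the class name KafkaConnectivityCheck is made up for illustration.

import java.util.Properties;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class KafkaConnectivityCheck {

  public static void main(String[] args) {
    Properties props = new Properties();
    props.setProperty("bootstrap.servers", "localhost:9092");
    props.setProperty("key.deserializer",
        "org.apache.kafka.common.serialization.StringDeserializer");
    props.setProperty("value.deserializer",
        "org.apache.kafka.common.serialization.StringDeserializer");

    // listTopics() triggers the same kind of metadata fetch that timed out
    // in the Flink job, but without Flink in the picture.
    try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
      consumer.listTopics().keySet().forEach(System.out::println);
    }
  }
}

If listTopics() also times out here, the problem is broker reachability (wrong bootstrap.servers, broker not running, or misconfigured advertised listeners), not the Flink connector version.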