Java Flink + Kafka: getHostnamePort


I want to read a Kafka topic from Flink:

    package Toletum.pruebas;

    import org.apache.flink.api.common.functions.MapFunction;
    import org.apache.flink.api.java.utils.ParameterTool;
    import org.apache.flink.streaming.api.datastream.DataStream;
    import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
    import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer082;
    import org.apache.flink.streaming.util.serialization.SimpleStringSchema;

    public class LeeKafka {

        public static void main(String[] args) throws Exception {
            final ParameterTool parameterTool = ParameterTool.fromArgs(args);

            // create execution environment
            StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

            FlinkKafkaConsumer082<String> kafkaSrc =
                    new FlinkKafkaConsumer082<>("test02",
                            new SimpleStringSchema(),
                            parameterTool.getProperties());

            DataStream<String> messageStream = env.addSource(kafkaSrc);

            messageStream.rebalance().map(new MapFunction<String, String>() {
                private static final long serialVersionUID = -6867736771747690202L;

                public String map(String value) throws Exception {
                    return "Kafka and Flink says: " + value;
                }
            }).print();

            env.execute("LeeKafka");
        }
    }

This code runs successfully:

    java -cp Package.jar Toletum.pruebas.LeeKafka --topic test02 --bootstrap.servers kafka:9092 --zookeeper.connect zookeeper:2181 --group.id myGroup

However, when I try to run it with flink:

    flink run -c Toletum.pruebas.LeeKafka pruebas-0.0.1-SNAPSHOT-jar-with-dependencies.jar --topic test02 --bootstrap.servers kafka:9092 --zookeeper.connect zookeeper:2181 --group.id myGroup

I get an error:

    java.lang.NoSuchMethodError: org.apache.flink.util.NetUtils.getHostnamePort(Ljava/lang/String;)Ljava/net/URL;
        at org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer.getPartitionsForTopic(FlinkKafkaConsumer.java:592)
        at org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer.<init>(FlinkKafkaConsumer.java:280)
        at org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer082.<init>(FlinkKafkaConsumer082.java:49)
        at Toletum.pruebas.LeeKafka.main(LeeKafka.java:22)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:497)
        at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:395)
        at org.apache.flink.client.program.Client.runBlocking(Client.java:252)
        at org.apache.flink.client.CliFrontend.executeProgramBlocking(CliFrontend.java:676)
        at org.apache.flink.client.CliFrontend.run(CliFrontend.java:326)
        at org.apache.flink.client.CliFrontend.parseParameters(CliFrontend.java:978)
        at org.apache.flink.client.CliFrontend.main(CliFrontend.java:1028)

The cause was an old version of the library.

The correct pom.xml:



            <dependency>
                    <groupId>org.apache.flink</groupId>
                    <artifactId>flink-connector-kafka</artifactId>
                    <version>0.10.1</version>
            </dependency>


This problem is caused by using an old version of the Flink connector library.
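A `NoSuchMethodError` at runtime usually means the job was compiled against one version of a library but a different version is on the classpath. A generic way to confirm which jar a clashing class is actually loaded from is to inspect its `CodeSource`. This is only a diagnostic sketch, not part of the original answer; `WhereFrom` and `locate` are made-up names:

```java
import java.security.CodeSource;

// Diagnostic helper: prints where a class was loaded from, which makes
// version clashes behind NoSuchMethodError easy to pin down.
public class WhereFrom {

    public static String locate(Class<?> c) {
        CodeSource cs = c.getProtectionDomain().getCodeSource();
        // Classes loaded by the bootstrap class loader (e.g. java.lang.*)
        // have no CodeSource.
        return cs == null ? "bootstrap class loader" : cs.getLocation().toString();
    }

    public static void main(String[] args) {
        System.out.println(locate(String.class));
        System.out.println(locate(WhereFrom.class));
        // In the job above one could add, before building the consumer:
        //   System.out.println(locate(org.apache.flink.util.NetUtils.class));
        // and compare the jar it reports under `java -cp` versus `flink run`.
    }
}
```

If the reported location under `flink run` is the fat jar rather than the cluster's `lib/` directory (or vice versa), the two Flink versions are competing on the classpath.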

You can check which connector libraries are available and pull in an up-to-date Maven dependency.

The Kafka version you are running should also be taken into account.

Try the latest Maven dependency for the Kafka connector from the Flink documentation.

The latest Maven dependency is:

            <dependency>
                    <groupId>org.apache.flink</groupId>
                    <artifactId>flink-connector-kafka-0.8_2.10</artifactId>
                    <version>1.3.2</version>
            </dependency>
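When upgrading the connector, the other Flink artifacts in the pom should move to the same release as the cluster, and note that in the 1.3.x connector the consumer class is `FlinkKafkaConsumer08` rather than the old `FlinkKafkaConsumer082`, so the job code needs the corresponding rename. A sketch of the matching dependencies, assuming a Scala 2.10 build (adjust the `_2.10` suffix and versions to your cluster):

```xml
<!-- Keep all Flink artifacts on the same release as the cluster. -->
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-streaming-java_2.10</artifactId>
    <version>1.3.2</version>
    <!-- provided: the cluster's lib/ directory already ships this jar -->
    <scope>provided</scope>
</dependency>
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-connector-kafka-0.8_2.10</artifactId>
    <version>1.3.2</version>
</dependency>
```

Marking the core Flink artifacts as `provided` keeps them out of the fat jar, which avoids exactly the kind of classpath clash seen in the stack trace above.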


Could it be that the version used to compile the job doesn't match the Flink version running on the cluster?

Thanks.... that was it: I was using an old version in my pom.xml.