Apache Flink not writing data to Kafka topic


I wrote Flink code that reads CSV files from a folder and sends the data to a Kafka topic.

Name           Bytes received   Records received    Records sent
Source:        0 B              0                   1       
Split Reader   1.12 KB          1                   10      
Sink: Unnamed  1.79 KB          10                  0   
This is my Flink job:

final StreamExecutionEnvironment env =
        StreamExecutionEnvironment.getExecutionEnvironment();

org.apache.flink.core.fs.Path filePath =
        new org.apache.flink.core.fs.Path(feedFileFolder);

RowCsvInputFormat format = new RowCsvInputFormat(filePath,
        FetchTypeInformation.getTypeInformation());

DataStream<Row> inputStream = env.readFile(format, feedFileFolder,
        FileProcessingMode.PROCESS_CONTINUOUSLY,
        parseInt(folderLookupTime));

DataStream<String> speStream = inputStream
        .filter(new FilterFunction<Row>() {
            @Override
            public boolean filter(Row row) {
                ...............
            }
        })
        .map(new MapFunction<Row, String>() {
            @Override
            public String map(Row row) {
                ...............
                return resultingJsonString;
            }
        });

Properties props = Producer.getProducerConfig(propertiesFilePath);

speStream.addSink(new FlinkKafkaProducer011<>(kafkaTopicName,
        new KeyedSerializationSchemaWrapper<>(new SimpleStringSchema()),
        props,
        FlinkKafkaProducer011.Semantic.EXACTLY_ONCE));
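One detail worth double-checking with `Semantic.EXACTLY_ONCE`: `FlinkKafkaProducer011` writes records inside Kafka transactions that are committed only when a checkpoint completes, so if checkpointing is not enabled the transactions are never committed and consumers reading with `isolation.level=read_committed` will see no data. A minimal sketch of enabling checkpointing (the class name and the interval value are illustrative, not from the question):

```java
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class CheckpointedJob {
    public static void main(String[] args) throws Exception {
        final StreamExecutionEnvironment env =
                StreamExecutionEnvironment.getExecutionEnvironment();

        // With Semantic.EXACTLY_ONCE, the Kafka 0.11 producer commits its
        // transactions on checkpoint completion. Without checkpointing,
        // those transactions stay open, so read_committed consumers never
        // see the records on the topic.
        env.enableCheckpointing(10_000L); // checkpoint interval in ms (illustrative)

        // ... build the source -> filter -> map -> Kafka sink pipeline here,
        // then submit the job:
        // env.execute("csv-to-kafka");
    }
}
```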
However, when I run this (as a jar) on the Flink server, the job fails to send data to the Kafka topic. The Flink UI shows the following:

Name           Bytes received   Records received    Records sent
Source:        0 B              0                   1       
Split Reader   616 B            1                   0       
Sink: Unnamed  450 B            0                   0

1. The above happens if I start the Flink job while the folder is empty and then add a new file to it. If a file is already in the folder when I start the Flink job, it works fine and writes the data to Kafka.

Could you check in the debug logs of the TaskManager whether the SplitReader actually started reading the split? If so, you should see entries like:

Reading split: