Java processor receives multiple messages with the same payload

Tags: java, apache-kafka, spring-cloud-stream, spring-cloud-dataflow

I'm starting a new project with Spring Cloud Data Flow, developing a bunch of jars to fit my needs.

One of them is a processor that unpacks files coming from a file source; the application uses a customized version of integration-zip, extended to handle tar and gunzip compression.

So my problem is: when my source sends a single message with a file reference, the processor receives that message multiple times, with the same payload but different IDs.

As you can see, the message for the file is generated only once:

2017-10-02 12:38:28.013  INFO 17615 --- [ask-scheduler-3] o.s.i.file.FileReadingMessageSource      : Created message: [GenericMessage [payload=/tmp/patent/CNINO_im_201733_batch108.tgz, headers={id=0b99b840-e3b3-f742-44ec-707aeea638c8, timestamp=1506940708013}]]
while on the consumer side there are 3 incoming messages:

2017-10-02 12:38:28.077  INFO 17591 --- [           -L-1] o.s.i.codec.kryo.CompositeKryoRegistrar  : registering [40, java.io.File] with serializer org.springframework.integration.codec.kryo.FileSerializer
2017-10-02 12:38:28.080  INFO 17591 --- [           -L-1] .w.c.s.a.c.p.AbstractCompressTransformer : Message 'GenericMessage [payload=/tmp/patent/CNINO_im_201733_batch108.tgz, headers={kafka_offset=1, id=1a4d4b9c-86fe-d3a8-d800-8013e8ae7027, kafka_receivedPartitionId=0, contentType=application/x-java-object;type=java.io.File, kafka_receivedTopic=untar.file, timestamp=1506940708079}]' unpacking started...
2017-10-02 12:38:28.080  INFO 17591 --- [           -L-1] .w.c.s.a.c.p.AbstractCompressTransformer : Check message's payload type to decompress
2017-10-02 12:38:29.106  INFO 17591 --- [           -L-1] .w.c.s.a.c.p.AbstractCompressTransformer : Message 'GenericMessage [payload=/tmp/patent/CNINO_im_201733_batch108.tgz, headers={kafka_offset=1, id=cd611ca4-4cd9-0624-0871-dcf93a9a0051, kafka_receivedPartitionId=0, contentType=application/x-java-object;type=java.io.File, kafka_receivedTopic=untar.file, timestamp=1506940709106}]' unpacking started...
2017-10-02 12:38:29.107  INFO 17591 --- [           -L-1] .w.c.s.a.c.p.AbstractCompressTransformer : Check message's payload type to decompress
2017-10-02 12:38:31.108  INFO 17591 --- [           -L-1] .w.c.s.a.c.p.AbstractCompressTransformer : Message 'GenericMessage [payload=/tmp/patent/CNINO_im_201733_batch108.tgz, headers={kafka_offset=1, id=97171a2e-29ac-2111-b838-3da7220f5e3c, kafka_receivedPartitionId=0, contentType=application/x-java-object;type=java.io.File, kafka_receivedTopic=untar.file, timestamp=1506940711108}]' unpacking started...
2017-10-02 12:38:31.108  INFO 17591 --- [           -L-1] .w.c.s.a.c.p.AbstractCompressTransformer : Check message's payload type to decompress
2017-10-02 12:38:31.116 ERROR 17591 --- [           -L-1] o.s.integration.handler.LoggingHandler   : org.springframework.integration.transformer.MessageTransformationException: failed to transform message; nested exception is org.springframework.messaging.MessageHandlingException: Failed to apply Zip transformation.; nested exception is java.io.FileNotFoundException: /tmp/patent/CNINO_im_201733_batch108.tgz (File o directory non esistente), failedMessage=GenericMessage [payload=/tmp/patent/CNINO_im_201733_batch108.tgz, headers={kafka_offset=1, id=97171a2e-29ac-2111-b838-3da7220f5e3c, kafka_receivedPartitionId=0, contentType=application/x-java-object;type=java.io.File, kafka_receivedTopic=untar.file, timestamp=1506940711108}], failedMessage=GenericMessage [payload=/tmp/patent/CNINO_im_201733_batch108.tgz, headers={kafka_offset=1, id=97171a2e-29ac-2111-b838-3da7220f5e3c, kafka_receivedPartitionId=0, contentType=application/x-java-object;type=java.io.File, kafka_receivedTopic=untar.file, timestamp=1506940711108}]
        at org.springframework.integration.transformer.AbstractTransformer.transform(AbstractTransformer.java:44)
I can't find any solution to this issue. Has anyone had the same problem and found a workaround? Or is there some configuration I'm missing?

EDIT:

I'm using a local build of SCDF 1.2.2.RELEASE, so the file IO operations work on the same file system, and I'm using the Ditmars.BUILD-SNAPSHOT version of SCS.

Unfortunately, if I disable the file-delete operation, the application still processes the message multiple times. Here are some code snippets from my project:

This is my processor class:

@EnableBinding(Processor.class)
@EnableConfigurationProperties(UnTarProperties.class)
public class UnTarProcessor {

  @Autowired
  private UnTarProperties properties;

  @Autowired
  private Processor processor;

  @Bean 
  public UncompressedResultSplitter splitter() {
    return new UncompressedResultSplitter();
  }

  @Bean 
  public UnTarGzTransformer transformer() {
    UnTarGzTransformer unTarGzTransformer = new UnTarGzTransformer(properties.isUseGzCompression());
    unTarGzTransformer.setExpectSingleResult(properties.isSingleResult());
    unTarGzTransformer.setWorkDirectory(new File(properties.getWorkDirectory()));
    unTarGzTransformer.setDeleteFiles(properties.isDeleteFile());

    return unTarGzTransformer;
  }

  @Bean
  public IntegrationFlow process() {
    // Consume from the input binding, unpack the archive, then emit one
    // message per extracted file on the output binding.
    return IntegrationFlows.from(processor.input())
        .transform(transformer())
        .split(splitter())
        .channel(processor.output())
        .get();
  }
}
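
For context, here is a hypothetical application.properties sketch for the UnTarProperties bound above. The untar prefix and the exact property names are assumptions inferred from the getters used in the code, not taken from the original project:

# Hypothetical property names, inferred from the UnTarProperties getters above.
untar.work-directory=/tmp/untar-work
untar.use-gz-compression=true
untar.single-result=false
untar.delete-file=false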
This is the core method used to decompress the files:

  @Override
  protected Object doCompressTransform(final Message<?> message) throws Exception {
    logger.info(String.format("Message '%s' unpacking started...", message));

    try (InputStream checkMessage = checkMessage(message);
         InputStream inputStream = (gzCompression ? new BufferedInputStream(new GZIPInputStream(checkMessage)) : new BufferedInputStream(checkMessage))) {

      final Object payload = message.getPayload();
      final Object unzippedData;

      try (TarArchiveInputStream tarIn = new TarArchiveInputStream(inputStream)){        
        TarArchiveEntry entry = null;

        final SortedMap<String, Object> uncompressedData = new TreeMap<String, Object>();

        while ((entry = (TarArchiveEntry) tarIn.getNextEntry()) != null) {

          final String zipEntryName = entry.getName();
          final Date zipEntryTime = entry.getLastModifiedDate();
          final long zipEntryCompressedSize = entry.getSize();

          final String type = entry.isDirectory() ? "directory" : "file";

          final File tempDir = new File(workDirectory, message.getHeaders().getId().toString());
          tempDir.mkdirs(); // NOSONAR false positive

          final File destinationFile = new File(tempDir, zipEntryName);

          if (entry.isDirectory()) {
            destinationFile.mkdirs(); // NOSONAR false positive
          }
          else {
            unpackEntries(tarIn, entry, tempDir);
            uncompressedData.put(zipEntryName, destinationFile);
          }
        }

        if (uncompressedData.isEmpty()) {
          unzippedData = null;
        }
        else {
          if (this.expectSingleResult) {
            if (uncompressedData.size() == 1) {
              unzippedData = uncompressedData.values().iterator().next();
            }
            else {
              throw new MessagingException(message, String.format("The UnZip operation extracted %s " 
                        + "result objects but expectSingleResult was 'true'.", uncompressedData.size()));
            }
          }
          else {
            unzippedData = uncompressedData;
          }

        }

        logger.info("Payload unpacking completed...");
      }
      finally {
        if (payload instanceof File && this.deleteFiles) {
          final File filePayload = (File) payload;
          if (!filePayload.delete() && logger.isWarnEnabled()) {
            logger.warn("failed to delete File '" + filePayload + "'");
          }
        }
      }
      return unzippedData;
    }
    catch (Exception e) {
      throw new MessageHandlingException(message, "Failed to apply Zip transformation.", e);
    }
}
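
As an aside, and only as a hedged sketch related to the discussion in the comments below: if the payload file has already been deleted by an earlier delivery attempt, a guard at the top of doCompressTransform could fail fast with a clearer signal than the FileNotFoundException seen in the log. This is not part of the original code:

    // Hypothetical redelivery guard (not in the original code): detect a
    // payload file that was already consumed and deleted by a previous
    // delivery attempt, instead of failing deep inside the transform.
    final Object rawPayload = message.getPayload();
    if (rawPayload instanceof File && !((File) rawPayload).exists()) {
      logger.warn("Payload file no longer exists (possible redelivery): " + rawPayload);
      return null; // consistent with the existing empty-archive case above
    }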
And the checkMessage method, which raises the exception:

  protected InputStream checkMessage(Message<?> message) throws FileNotFoundException {
    logger.info("Check message's payload type to decompress");

    InputStream inputStream;
    Object payload = message.getPayload();

    if (payload instanceof File) {
      final File filePayload = (File) payload;

      if (filePayload.isDirectory()) {
        throw new UnsupportedOperationException(String.format("Cannot unzip a directory: '%s'",
            filePayload.getAbsolutePath()));
      }

      inputStream = new FileInputStream(filePayload);
    }
    else if (payload instanceof InputStream) {
      inputStream = (InputStream) payload;
    }
    else if (payload instanceof byte[]) {
      inputStream = new ByteArrayInputStream((byte[]) payload);
    }
    else {
      throw new IllegalArgumentException(String.format("Unsupported payload type '%s'. " +
          "The only supported payload types are java.io.File, byte[] and java.io.InputStream",
          payload.getClass().getSimpleName()));
    }

    return inputStream;
  }
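
To illustrate the dispatch logic above, here is a minimal test sketch, assuming JUnit 4 is on the classpath and that the test lives in the same package as the transformer (checkMessage is protected); the class name CheckMessageTest is hypothetical:

import static org.junit.Assert.assertNotNull;

import java.io.InputStream;

import org.junit.Test;
import org.springframework.messaging.Message;
import org.springframework.messaging.support.MessageBuilder;

public class CheckMessageTest {

  // Constructor argument enables gz compression, as in the @Bean above.
  private final UnTarGzTransformer transformer = new UnTarGzTransformer(true);

  @Test
  public void byteArrayPayloadIsWrappedInAStream() throws Exception {
    Message<byte[]> message = MessageBuilder.withPayload(new byte[] { 1, 2, 3 }).build();
    InputStream in = transformer.checkMessage(message);
    assertNotNull(in);
  }

  @Test(expected = IllegalArgumentException.class)
  public void unsupportedPayloadTypeIsRejected() throws Exception {
    transformer.checkMessage(MessageBuilder.withPayload(42).build());
  }
}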
I really appreciate any help. Thanks a lot.

We need more information: which versions of SCDF and the SCS apps you are using, your stream DSL, and at least how you are deploying your apps.


Just checked the logs: do you realize your consumer is failing to consume the message because of a FileNotFoundException? You are not receiving the same message multiple times; SCS is simply trying to redeliver it before giving up. Check your full log and why the file can't be opened at the specified location.

When the exception occurs in the transformer, you receive the message multiple times because of the SCS retry configuration. Since the error is in your own logic it's hard to say more: the log reports a FileNotFoundException, and I don't know which part of your process is supposed to put the file there, but that is probably the cause. It doesn't seem related to SCS itself.
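
For reference, the three deliveries in the log match Spring Cloud Stream's default retry behavior (maxAttempts defaults to 3). A minimal sketch of the consumer setting that controls this, assuming the default Processor input binding name input:

# maxAttempts defaults to 3; setting it to 1 disables retry/redelivery,
# so the listener fails fast instead of reprocessing the message.
spring.cloud.stream.bindings.input.consumer.maxAttempts=1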

A single message with a file reference. Then you should make sure your processor app is on the same file system so that it can access the file; otherwise there is obviously no /tmp/patent/CNINO_im_201733_batch108.tgz on the other machine.

I'm using the local version of SCDF, and the files are stored in a temp directory that both apps can read. The FNF exception is raised when I process the file after the first message delivery. I've edited the post with more information about my configuration and code.
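
For completeness, a hypothetical SCDF shell session sketching how a stream like this is typically registered and deployed locally; untar stands for the custom processor, and the maven coordinates and stream name are illustrative, while file and log are stock starters:

dataflow:> app register --name untar --type processor --uri maven://com.example:untar-processor:0.0.1-SNAPSHOT
dataflow:> stream create --name untar-stream --definition "file --directory=/tmp/patent | untar | log" --deploy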