Spring Batch partitioning not working


I am using Spring Batch partitioning to merge data from groups of related flat files into a single file. The batch fails with the following two issues:

  • The first slave step thread fails because data is written to the file writer before it has been opened. In this thread, the variable inputFileNames (step-context data supplied by the partitioner) has the value [20002, 20003].
  • The second slave step thread fails because the partition data is missing from its step execution context. In this thread, inputFileNames is null.

Please let me know if something is missing from my configuration.

    // log with Error info
    
    2015-12-26 17:59:14,165 DEBUG [SimpleAsyncTaskExecutor-1] c.d.d.b.r.ReaderConfiguration [ReaderBatchConfiguration.java:473] inputFileNames ----[20002, 20003]
    2015-12-26 17:59:14,165 DEBUG [SimpleAsyncTaskExecutor-1] c.d.d.b.r.BatchConfiguration [BatchConfiguration.java:389] consumer ----p2
    2015-12-26 17:59:14,275 ERROR [SimpleAsyncTaskExecutor-1] o.s.b.c.s.AbstractStep [AbstractStep.java:225] Encountered an error executing step testConsumersInputFileMergeStep in job testFileForInputJob
    org.springframework.batch.item.WriterNotOpenException: Writer must be open before it can be written to
        at org.springframework.batch.item.file.FlatFileItemWriter.write(FlatFileItemWriter.java:255) ~[spring-batch-infrastructure-3.0.3.RELEASE.jar:3.0.3.RELEASE]

    2015-12-26 18:00:14,421 DEBUG [SimpleAsyncTaskExecutor-2] c.d.d.b.r.ReaderBatchConfiguration [ReaderConfiguration.java:474] inputFileNames ----null
    
    // Partitioner

    public class ProvisioningInputFilePartitioner implements Partitioner {

        @Override
        public Map<String, ExecutionContext> partition(int gridSize) {
            Map<String, ExecutionContext> filesToProcess = getFilesToProcess(outboundSourceFolder);
            Map<String, ExecutionContext> execCtxs = new HashMap<>();
            for (Entry<String, ExecutionContext> entry : filesToProcess.entrySet()) {
                execCtxs.put(entry.getKey(), entry.getValue());
            }
            return execCtxs;
        }

        private Map<String, ExecutionContext> getFilesToProcess(String outboundSourceFolder2) {
            Map<String, ExecutionContext> contexts = new HashMap<>();
            ExecutionContext execCtx1 = new ExecutionContext();
            List<String> inputFileNames1 = Arrays.asList("20001", "22222");
            execCtx1.put("consumer", "p1");
            execCtx1.put("inputFileNames", inputFileNames1);

            contexts.put("p1", execCtx1);

            ExecutionContext execCtx2 = new ExecutionContext();
            List<String> inputFileNames2 = Arrays.asList("20002", "20003");
            execCtx1.put("consumer", "p2");
            execCtx1.put("inputFileNames", inputFileNames2);

            contexts.put("p2", execCtx2);

            return contexts;
        }
    }
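The null inputFileNames seen by the second slave thread is consistent with an aliasing slip in getFilesToProcess above: in the second block, both put calls target execCtx1 rather than execCtx2, so "p2" is registered with an empty context while execCtx1's values are overwritten with the p2 data. A minimal, self-contained sketch (plain HashMaps standing in for Spring Batch's ExecutionContext) reproduces exactly the values seen in the log:

```java
import java.util.Arrays;
import java.util.HashMap;
import java.util.Map;

public class PartitionAliasingDemo {

    // Mirrors the posted getFilesToProcess(), including the suspected bug.
    static Map<String, Map<String, Object>> buildContexts() {
        Map<String, Map<String, Object>> contexts = new HashMap<>();

        Map<String, Object> execCtx1 = new HashMap<>();
        execCtx1.put("consumer", "p1");
        execCtx1.put("inputFileNames", Arrays.asList("20001", "22222"));
        contexts.put("p1", execCtx1);

        Map<String, Object> execCtx2 = new HashMap<>();
        // Bug as posted: these two puts still target execCtx1, not execCtx2.
        execCtx1.put("consumer", "p2");
        execCtx1.put("inputFileNames", Arrays.asList("20002", "20003"));
        contexts.put("p2", execCtx2);

        return contexts;
    }

    public static void main(String[] args) {
        Map<String, Map<String, Object>> contexts = buildContexts();
        // "p1" ends up holding the p2 data; "p2" is empty, so that thread sees null.
        System.out.println(contexts.get("p1").get("inputFileNames")); // [20002, 20003]
        System.out.println(contexts.get("p2").get("inputFileNames")); // null
    }
}
```

Changing the two puts in the second block to execCtx2.put(...) should give each partition its own consumer and inputFileNames values.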
    
    // Job configuration


    // Comments on the question:
    // For the first issue, see my answer. For the second issue, have you verified the contents of execCtxs?
    // Thanks Michael, your suggestion solved my problem.
    @Bean
    @StepScope
    public ItemWriter<String> testConsumerFileItemWriter(@Value("#{stepExecutionContext[consumer]}") String consumer) {
        logger.debug("consumer ----" + consumer);
        FileSystemResource fileSystemResource = new FileSystemResource(new File(outboundSourceFolder, consumer + ".txt"));
        FlatFileItemWriter<String> fileItemWriter = new FlatFileItemWriter<>();
        fileItemWriter.setResource(fileSystemResource);
        fileItemWriter.setLineAggregator(new PassThroughLineAggregator<String>());
        return fileItemWriter;
    }

    @Bean
    public Partitioner provisioningInputFilePartitioner() {
        return new ProvisioningInputFilePartitioner();
    }

    @Bean
    public TaskExecutor taskExecutor() {
        return new SimpleAsyncTaskExecutor();
    }

    @Bean
    @StepScope
    public ItemReader<String> testInputFilesReader(@Value("#{stepExecutionContext[inputFileNames]}") List<String> inputFileNames) {
        logger.debug("inputFileNames ----" + inputFileNames);
        MultiResourceItemReader<String> multiResourceItemReader = new MultiResourceItemReader<String>();

        ...
        return multiResourceItemReader;
    }

    @Bean
    public Step testConsumersInputFileMergeStep(StepBuilderFactory stepBuilder, ItemReader<String> testInputFilesReader,
            ItemWriter<String> testConsumerFileItemWriter) {
        return stepBuilder.get("testConsumersInputFileMergeStep").<String, String>chunk(1).reader(testInputFilesReader)
                .writer(testConsumerFileItemWriter).build();
    }

    @Bean
    public Step testConsumersFilePartitionerStep(StepBuilderFactory stepBuilder, Step testConsumersInputFileMergeStep,
            Partitioner provisioningInputFilePartitioner, TaskExecutor taskExecutor) {
        return stepBuilder.get("testConsumersFilePartitionerStep").partitioner(testConsumersInputFileMergeStep)
                .partitioner("testConsumersInputFileMergeStep", provisioningInputFilePartitioner)
                .taskExecutor(taskExecutor)
                .build();
    }

    @Bean
    public Job testFileForInputJob(JobBuilderFactory factory, Step testFileForInputStep, Step testConsumersFilePartitionerStep) {
        return factory.get("testFileForInputJob").incrementer(new RunIdIncrementer()).start(testConsumersFilePartitionerStep).build();
    }