Spring Batch exception: java.lang.IllegalStateException: No reply topic header and no default reply topic can be determined


I am trying to implement remote partitioning using Kafka as the middleware and I am getting an exception. To start with, I am only implementing the master side; once the master is working, I will move on to the worker-side code.

Below is the stack trace of the exception:

2020-10-16 00:27:48.640  INFO 13716 --- [           main] o.a.kafka.common.utils.AppInfoParser     : Kafka version: 2.5.0
2020-10-16 00:27:48.641  INFO 13716 --- [           main] o.a.kafka.common.utils.AppInfoParser     : Kafka commitId: 66563e712b0b9f84
2020-10-16 00:27:48.641  INFO 13716 --- [           main] o.a.kafka.common.utils.AppInfoParser     : Kafka startTimeMs: 1602788268633
2020-10-16 00:27:48.652  INFO 13716 --- [           main] o.a.k.clients.consumer.KafkaConsumer     : [Consumer clientId=consumer-testPokRequestsorab-1, groupId=testPokRequestsorab] Subscribed to topic(s): reply
2020-10-16 00:27:48.657  INFO 13716 --- [           main] o.s.s.c.ThreadPoolTaskScheduler          : Initializing ExecutorService
2020-10-16 00:27:48.670  INFO 13716 --- [           main] o.s.i.endpoint.EventDrivenConsumer       : Adding {kafka:outbound-gateway} as a subscriber to the 'requests' channel
2020-10-16 00:27:48.671  INFO 13716 --- [           main] o.s.integration.channel.DirectChannel    : Channel 'application-1.requests' has 1 subscriber(s).
2020-10-16 00:27:48.671  INFO 13716 --- [           main] o.s.i.endpoint.EventDrivenConsumer       : started bean 'outboundGateFlow.org.springframework.integration.config.ConsumerEndpointFactoryBean#0'; defined in: 'io.spring.batch.configuration.MasterActualConf'; from source: 'bean method outboundGateFlow'
2020-10-16 00:27:48.751  INFO 13716 --- [           main] o.s.b.w.embedded.tomcat.TomcatWebServer  : Tomcat started on port(s): 8080 (http) with context path ''
2020-10-16 00:27:49.940  INFO 13716 --- [           main] DeferredRepositoryInitializationListener : Triggering deferred initialization of Spring Data repositories…
2020-10-16 00:27:49.941  INFO 13716 --- [           main] DeferredRepositoryInitializationListener : Spring Data repositories initialized!
2020-10-16 00:27:49.972  INFO 13716 --- [           main] i.s.b.configuration.MasterActualConf     : Started MasterActualConf in 29.713 seconds (JVM running for 32.146)
2020-10-16 00:27:49.976  INFO 13716 --- [           main] o.s.b.a.b.JobLauncherApplicationRunner   : Running default command line with: []
************************Inside  batchConfigurer    ****************
2020-10-16 00:27:50.017  INFO 13716 --- [           main] com.zaxxer.hikari.HikariDataSource       : HikariPool-2 - Starting...
2020-10-16 00:27:50.017  WARN 13716 --- [           main] com.zaxxer.hikari.util.DriverDataSource  : Registered driver with driverClassName=org.hsqldb.jdbcDriver was not found, trying direct instantiation.
2020-10-16 00:27:50.018  INFO 13716 --- [           main] com.zaxxer.hikari.pool.PoolBase          : HikariPool-2 - Driver does not support get/set network timeout for connections. (feature not supported)
2020-10-16 00:27:50.019  INFO 13716 --- [           main] com.zaxxer.hikari.HikariDataSource       : HikariPool-2 - Start completed.
2020-10-16 00:27:50.290  INFO 13716 --- [           main] o.s.b.c.l.support.SimpleJobLauncher      : Job: [SimpleJob: [name=remotePartitioningJobMy]] launched with the following parameters: [{run.id=44}]
2020-10-16 00:27:50.408  WARN 13716 --- [           main] o.s.c.t.b.l.TaskBatchExecutionListener   : This job was executed outside the scope of a task but still used the task listener.
2020-10-16 00:27:50.426  INFO 13716 --- [           main] o.s.batch.core.job.SimpleStepHandler     : Executing step: [masterStep]
2020-10-16 00:27:50.473 DEBUG 13716 --- [           main] o.s.integration.channel.DirectChannel    : preSend on channel 'bean 'requests'; defined in: 'io.spring.batch.configuration.MasterActualConf'; from source: 'org.springframework.core.type.StandardMethodMetadata@4263b080'', message: GenericMessage [payload=StepExecutionRequest: [jobExecutionId=64, stepExecutionId=94, stepName=workerStep], headers={sequenceNumber=0, replyChannel=bean 'org.springframework.integration.dsl.StandardIntegrationFlow#0.channel#0', correlationId=64:workerStep, id=8b7b56e5-a1a5-cc70-9ee0-ce0d1c077f34, sequenceSize=1, timestamp=1602788270472}]
2020-10-16 00:27:50.474 DEBUG 13716 --- [           main] o.s.i.k.o.KafkaProducerMessageHandler    : bean 'outboundGateFlow.kafka:outbound-gateway#0' for component 'outboundGateFlow.org.springframework.integration.config.ConsumerEndpointFactoryBean#0'; defined in: 'io.spring.batch.configuration.MasterActualConf'; from source: 'bean method outboundGateFlow' received message: GenericMessage [payload=StepExecutionRequest: [jobExecutionId=64, stepExecutionId=94, stepName=workerStep], headers={sequenceNumber=0, replyChannel=bean 'org.springframework.integration.dsl.StandardIntegrationFlow#0.channel#0', correlationId=64:workerStep, id=8b7b56e5-a1a5-cc70-9ee0-ce0d1c077f34, sequenceSize=1, timestamp=1602788270472}]
2020-10-16 00:27:50.491 ERROR 13716 --- [           main] o.s.batch.core.step.AbstractStep         : Encountered an error executing step masterStep in job remotePartitioningJobMy

org.springframework.messaging.MessageHandlingException: error occurred in message handler [bean 'outboundGateFlow.kafka:outbound-gateway#0' for component 'outboundGateFlow.org.springframework.integration.config.ConsumerEndpointFactoryBean#0'; defined in: 'io.spring.batch.configuration.MasterActualConf'; from source: 'bean method outboundGateFlow']; nested exception is java.lang.IllegalStateException: No reply topic header and no default reply topic is can be determined
    at org.springframework.integration.support.utils.IntegrationUtils.wrapInHandlingExceptionIfNecessary(IntegrationUtils.java:192)
    at org.springframework.integration.handler.AbstractMessageHandler.handleMessage(AbstractMessageHandler.java:79)
    at org.springframework.integration.dispatcher.AbstractDispatcher.tryOptimizedDispatch(AbstractDispatcher.java:115)
    at org.springframework.integration.dispatcher.UnicastingDispatcher.doDispatch(UnicastingDispatcher.java:133)
    at org.springframework.integration.dispatcher.UnicastingDispatcher.dispatch(UnicastingDispatcher.java:106)
    at org.springframework.integration.channel.AbstractSubscribableChannel.doSend(AbstractSubscribableChannel.java:72)
    at org.springframework.integration.channel.AbstractMessageChannel.send(AbstractMessageChannel.java:570)
    at org.springframework.integration.channel.AbstractMessageChannel.send(AbstractMessageChannel.java:520)
    at org.springframework.messaging.core.GenericMessagingTemplate.doSend(GenericMessagingTemplate.java:187)
    at org.springframework.messaging.core.GenericMessagingTemplate.doSend(GenericMessagingTemplate.java:166)
    at org.springframework.messaging.core.GenericMessagingTemplate.doSend(GenericMessagingTemplate.java:47)
    at org.springframework.messaging.core.AbstractMessageSendingTemplate.send(AbstractMessageSendingTemplate.java:109)
    at org.springframework.messaging.core.AbstractMessageSendingTemplate.send(AbstractMessageSendingTemplate.java:99)
    at org.springframework.batch.integration.partition.MessageChannelPartitionHandler.handle(MessageChannelPartitionHandler.java:228)
    at org.springframework.batch.core.partition.support.PartitionStep.doExecute(PartitionStep.java:106)
    at org.springframework.batch.core.step.AbstractStep.execute(AbstractStep.java:208)
    at org.springframework.batch.core.job.SimpleStepHandler.handleStep(SimpleStepHandler.java:148)
    at org.springframework.batch.core.job.AbstractJob.handleStep(AbstractJob.java:410)
    at org.springframework.batch.core.job.SimpleJob.doExecute(SimpleJob.java:136)
    at org.springframework.batch.core.job.AbstractJob.execute(AbstractJob.java:319)
    at org.springframework.batch.core.launch.support.SimpleJobLauncher$1.run(SimpleJobLauncher.java:147)
    at org.springframework.core.task.SyncTaskExecutor.execute(SyncTaskExecutor.java:50)
    at org.springframework.batch.core.launch.support.SimpleJobLauncher.run(SimpleJobLauncher.java:140)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:344)
    at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:198)
    at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:163)
    at org.springframework.batch.core.configuration.annotation.SimpleBatchConfiguration$PassthruAdvice.invoke(SimpleBatchConfiguration.java:127)
    at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:186)
    at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:212)
    at com.sun.proxy.$Proxy98.run(Unknown Source)
    at org.springframework.boot.autoconfigure.batch.JobLauncherApplicationRunner.execute(JobLauncherApplicationRunner.java:199)
    at org.springframework.boot.autoconfigure.batch.JobLauncherApplicationRunner.executeLocalJobs(JobLauncherApplicationRunner.java:173)
    at org.springframework.boot.autoconfigure.batch.JobLauncherApplicationRunner.launchJobFromProperties(JobLauncherApplicationRunner.java:160)
    at org.springframework.boot.autoconfigure.batch.JobLauncherApplicationRunner.run(JobLauncherApplicationRunner.java:155)
    at org.springframework.boot.autoconfigure.batch.JobLauncherApplicationRunner.run(JobLauncherApplicationRunner.java:150)
    at org.springframework.boot.SpringApplication.callRunner(SpringApplication.java:786)
    at org.springframework.boot.SpringApplication.callRunners(SpringApplication.java:776)
    at org.springframework.boot.SpringApplication.run(SpringApplication.java:322)
    at org.springframework.boot.SpringApplication.run(SpringApplication.java:1237)
    at org.springframework.boot.SpringApplication.run(SpringApplication.java:1226)
    at io.spring.batch.configuration.MasterActualConf.main(MasterActualConf.java:188)
Caused by: java.lang.IllegalStateException: No reply topic header and no default reply topic is can be determined
    at org.springframework.integration.kafka.outbound.KafkaProducerMessageHandler.getReplyTopic(KafkaProducerMessageHandler.java:487)
    at org.springframework.integration.kafka.outbound.KafkaProducerMessageHandler.handleRequestMessage(KafkaProducerMessageHandler.java:398)
    at org.springframework.integration.handler.AbstractReplyProducingMessageHandler.handleMessageInternal(AbstractReplyProducingMessageHandler.java:134)
    at org.springframework.integration.handler.AbstractMessageHandler.handleMessage(AbstractMessageHandler.java:62)
    ... 43 common frames omitted

2020-10-16 00:27:50.513  INFO 13716 --- [           main] o.s.batch.core.step.AbstractStep         : Step: [masterStep] executed in 86ms
2020-10-16 00:27:50.526  INFO 13716 --- [           main] o.s.b.c.l.support.SimpleJobLauncher      : Job: [SimpleJob: [name=remotePartitioningJobMy]] completed with the following parameters: [{run.id=44}] and the following status: [FAILED] in 120ms
2020-10-16 00:27:50.818  INFO 13716 --- [lyContainer-C-1] org.apache.kafka.clients.Metadata        : [Consumer clientId=consumer-testPokRequestsorab-1, groupId=testPokRequestsorab] Cluster ID: ZBfa0qdHQIaIzmOVv1fiFg
2020-10-16 00:27:50.824  INFO 13716 --- [lyContainer-C-1] o.a.k.c.c.internals.AbstractCoordinator  : [Consumer clientId=consumer-testPokRequestsorab-1, groupId=testPokRequestsorab] Discovered group coordinator cdh5161-e2e-test-7.eaas.amdocs.com:9092 (id: 2147483570 rack: null)
2020-10-16 00:27:50.841  INFO 13716 --- [lyContainer-C-1] o.a.k.c.c.internals.AbstractCoordinator  : [Consumer clientId=consumer-testPokRequestsorab-1, groupId=testPokRequestsorab] (Re-)joining group
2020-10-16 00:27:54.824  INFO 13716 --- [lyContainer-C-1] o.a.k.c.c.internals.ConsumerCoordinator  : [Consumer clientId=consumer-testPokRequestsorab-1, groupId=testPokRequestsorab] Finished assignment for group at generation 45: {consumer-testPokRequestsorab-1-55bf445d-709f-423f-9e29-c42662589397=Assignment(partitions=[reply-0])}
2020-10-16 00:27:55.060  INFO 13716 --- [lyContainer-C-1] o.a.k.c.c.internals.AbstractCoordinator  : [Consumer clientId=consumer-testPokRequestsorab-1, groupId=testPokRequestsorab] Successfully joined group with generation 45
2020-10-16 00:27:55.072  INFO 13716 --- [lyContainer-C-1] o.a.k.c.c.internals.ConsumerCoordinator  : [Consumer clientId=consumer-testPokRequestsorab-1, groupId=testPokRequestsorab] Adding newly assigned partitions: reply-0
2020-10-16 00:27:55.422  INFO 13716 --- [lyContainer-C-1] o.s.k.l.KafkaMessageListenerContainer    : testPokRequestsorab: partitions assigned: [reply-0]
2020-10-16 00:27:55.672  INFO 13716 --- [lyContainer-C-1] o.a.k.c.c.internals.ConsumerCoordinator  : [Consumer clientId=consumer-testPokRequestsorab-1, groupId=testPokRequestsorab] Setting offset for partition reply-0 to the committed offset FetchPosition{offset=0, offsetEpoch=Optional.empty, currentLeader=LeaderAndEpoch{leader=Optional[cdh5161-e2e-test-1.eaas.amdocs.com:9092 (id: 63 rack: null)], epoch=absent}}

Process finished with exit code -1

In my case I have to use two databases, one for the job repository and one as the data source. For that I introduced a BatchConfigurer, a job repository, and a data source factory method into the class.

Below is the code I am using.

Configuration and main class

package io.spring.batch.configuration;

import io.spring.batch.domain.ColumnRangePartitioner;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.header.internals.RecordHeader;
import org.springframework.batch.core.Job;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.BatchConfigurer;
import org.springframework.batch.core.configuration.annotation.DefaultBatchConfigurer;
import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
import org.springframework.batch.core.launch.support.RunIdIncrementer;
import org.springframework.batch.core.repository.JobRepository;
import org.springframework.batch.core.repository.support.JobRepositoryFactoryBean;
import org.springframework.batch.integration.config.annotation.EnableBatchIntegration;
import org.springframework.batch.integration.partition.RemotePartitioningMasterStepBuilderFactory;
import org.springframework.batch.support.transaction.ResourcelessTransactionManager;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.boot.jdbc.DataSourceBuilder;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.ImportResource;
import org.springframework.integration.annotation.ServiceActivator;
import org.springframework.integration.channel.DirectChannel;
import org.springframework.integration.dsl.IntegrationFlow;
import org.springframework.integration.dsl.IntegrationFlows;
import org.springframework.integration.kafka.dsl.Kafka;
import org.springframework.integration.kafka.outbound.KafkaProducerMessageHandler;
import org.springframework.kafka.config.ConcurrentKafkaListenerContainerFactory;
import org.springframework.kafka.core.ConsumerFactory;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.core.ProducerFactory;
import org.springframework.kafka.listener.ConcurrentMessageListenerContainer;
import org.springframework.kafka.listener.ContainerProperties;
import org.springframework.kafka.listener.KafkaMessageListenerContainer;
import org.springframework.kafka.requestreply.ReplyingKafkaTemplate;
import org.springframework.kafka.requestreply.RequestReplyFuture;
import org.springframework.kafka.support.KafkaHeaders;
import javax.sql.DataSource;



@SpringBootApplication
@EnableBatchProcessing
@EnableBatchIntegration
@ImportResource("context.xml")
public class MasterActualConf {


    private final JobBuilderFactory jobBuilderFactory;

    private final RemotePartitioningMasterStepBuilderFactory masterStepBuilderFactory;

    private static final int GRID_SIZE = 4;

    @Bean
    public DirectChannel replies() {
        return new DirectChannel();
    }

    @Bean
    public DirectChannel requests() {return new DirectChannel(); }

    @Autowired
    private KafkaTemplate<String, String> kafkaTemplate;

    @Bean
    public Step masterStep() {
        System.out.println("*******************  inside  masterStep **************************");
        return this.masterStepBuilderFactory.get("masterStep")
                .partitioner("workerStep", new ColumnRangePartitioner())
                .gridSize(GRID_SIZE)
                .outputChannel(requests())
                .inputChannel(replies())
                .build();
    }

    @Bean
    public Job remotePartitioningJob() {
        System.out.println("*******************  inside  remotePartitioningJob **************************");
        return this.jobBuilderFactory.get("remotePartitioningJobMy")
                .incrementer(new RunIdIncrementer())
                .start(masterStep())
                .build();
    }

    protected JobRepository createMyJobRepository() throws Exception {
        JobRepositoryFactoryBean factory = new JobRepositoryFactoryBean();
        factory.setTransactionManager(new ResourcelessTransactionManager());
        factory.setDataSource(createDataSourceForRepository());
        factory.setDatabaseType("HSQL");
        return factory.getObject();
    }


    public DataSource createDataSourceForRepository() {
        return DataSourceBuilder.create()
                .url("jdbc:hsqldb:file:src/main/resources/hsqldb/batchcore.db;shutdown=true;")
                .driverClassName("org.hsqldb.jdbcDriver")
                .username("sa")
                .password("")
                .build();
    }


    @Bean
    public BatchConfigurer batchConfigurer() {
        return new DefaultBatchConfigurer(createDataSourceForRepository()) {
            @Override
            public JobRepository getJobRepository() {
                JobRepository jobRepository = null;
                try {
                    jobRepository = createMyJobRepository();

                } catch (Exception e) {
                    e.printStackTrace();
                }
                System.out.println("************************Inside  batchConfigurer    ****************");
                return jobRepository;
            }
        };
    }


    @Bean
    public KafkaMessageListenerContainer<String, String> replyContainer(ConsumerFactory<String, String> cf) {
        ContainerProperties containerProperties = new ContainerProperties("reply");
        System.out.println("************************** replyContainer  *****************************");
        return new KafkaMessageListenerContainer<>(cf, containerProperties);
    }


    @Bean
    public ReplyingKafkaTemplate<String, String, String> replyingTemplate(ProducerFactory<String, String> producerFactory,KafkaMessageListenerContainer<String, String> repliesContainer) {
        System.out.println("**************************replyingTemplate Templet  *****************************");
        return new ReplyingKafkaTemplate<>(producerFactory, repliesContainer);
    }


    @Bean
    public IntegrationFlow outboundGateFlow( ReplyingKafkaTemplate<String , String, String> kafkaTemplate) {
        kafkaTemplate.setDefaultTopic("reply");
        return IntegrationFlows.from(requests())
                .handle(Kafka.outboundGateway(kafkaTemplate).topic("requests").partitionId(0))
                .channel("requests")
                .get();
    }


      /*
    @ServiceActivator(inputChannel = "requests", outputChannel = "reply")
    public KafkaProducerMessageHandler<String, String> outGateway(ReplyingKafkaTemplate<String, String, String> kafkaTemplate) {
        ProducerRecord<String, String> record = new ProducerRecord<>("kRequests", "foo");
        record.headers().add(new RecordHeader(KafkaHeaders.REPLY_TOPIC, "kReplies".getBytes()));
        RequestReplyFuture<String, String, String> replyFuture = kafkaTemplate.sendAndReceive(record);
        System.out.println("****************inside gateway **************************");
        //SendResult<String, String> sendResult = replyFuture.getSendFuture().get();
        //System.out.println("Sent ok: " + sendResult.getRecordMetadata());
        //ConsumerRecord<String, String> consumerRecord = replyFuture.get();
        //System.out.println("Return value: " + consumerRecord.value());
        return new KafkaProducerMessageHandler<>(kafkaTemplate);
    }

 */
    @Bean
    public ConcurrentMessageListenerContainer<String, String> repliesContainer(ConcurrentKafkaListenerContainerFactory<String, String> containerFactory) {
        ConcurrentMessageListenerContainer<String, String> repliesContainer = containerFactory.createContainer("reply");
        repliesContainer.getContainerProperties().setGroupId("repliesGroup");
        System.out.println("**************** reply topic ki khoj " + repliesContainer.getContainerProperties().getTopics()[0] + "*****************************");
        repliesContainer.setAutoStartup(false);
        return repliesContainer;
    }

    @Bean
    public ReplyingKafkaTemplate<String, String, String> kafkaTemplate(
            ProducerFactory<String, String> pf, KafkaMessageListenerContainer<String, String> replyContainer) {
        return new ReplyingKafkaTemplate<>(pf, replyContainer);
    }

    public MasterActualConf(JobBuilderFactory jobBuilderFactory, RemotePartitioningMasterStepBuilderFactory masterStepBuilderFactory) {
        this.jobBuilderFactory = jobBuilderFactory;
        this.masterStepBuilderFactory = masterStepBuilderFactory;
    }

    public static void main(String[] args) {
        SpringApplication.run(MasterActualConf.class, args);
    }

}

Partitioner class

import java.util.Map;

import org.springframework.batch.core.partition.support.SimplePartitioner;
import org.springframework.batch.item.ExecutionContext;

public class BasicPartitioner extends SimplePartitioner {

    private static final String PARTITION_KEY = "partition";

    @Override
    public Map<String, ExecutionContext> partition(int gridSize) {
        Map<String, ExecutionContext> partitions = super.partition(gridSize);
        int i = 0;
        for (ExecutionContext context : partitions.values()) {
            context.put(PARTITION_KEY, PARTITION_KEY + (i++));
        }
        System.out.println("#########################Inside Basic Partitioner  ##################   ");
        return partitions;
    }

}
pom.xml

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <parent>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-parent</artifactId>
        <version>2.3.1.RELEASE</version>
        <relativePath/> <!-- lookup parent from repository -->
    </parent>


    <groupId>RemotePartitioningTest</groupId>
    <artifactId>RemotePartitioningTest</artifactId>
    <version>1.0-SNAPSHOT</version>

    <properties>
        <java.version>1.8</java.version>
        <spring-cloud.version>Hoxton.SR6</spring-cloud.version>
    </properties>

    <dependencies>
        <!-- https://mvnrepository.com/artifact/org.springframework.cloud.stream.app/spring-cloud-starter-stream-sink-task-launcher-local -->
        <dependency>
            <groupId>org.springframework.cloud.stream.app</groupId>
            <artifactId>spring-cloud-starter-stream-sink-task-launcher-local</artifactId>
            <version>1.2.0.RELEASE</version>
        </dependency>

        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-data-jpa</artifactId>
            <version>2.2.6.RELEASE</version>
        </dependency>

        <dependency>
            <groupId>org.springframework.cloud</groupId>
            <artifactId>spring-cloud-task-core</artifactId>
        </dependency>

        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-amqp</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework.cloud</groupId>
            <artifactId>spring-cloud-stream</artifactId>
        </dependency>

        <dependency>
            <groupId>org.springframework.cloud</groupId>
            <artifactId>spring-cloud-stream-binder-kafka</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework.kafka</groupId>
            <artifactId>spring-kafka</artifactId>
        </dependency>

        <dependency>
            <groupId>org.springframework.kafka</groupId>
            <artifactId>spring-kafka-test</artifactId>
            <scope>test</scope>
        </dependency>

        <dependency>
            <groupId>org.springframework.integration</groupId>
            <artifactId>spring-integration-file</artifactId>
        </dependency>

        <!-- https://mvnrepository.com/artifact/org.springframework.integration/spring-integration-stream -->
        <dependency>
            <groupId>org.springframework.integration</groupId>
            <artifactId>spring-integration-stream</artifactId>
        </dependency>

        <dependency>
            <groupId>org.springframework.integration</groupId>
            <artifactId>spring-integration-core</artifactId>
        </dependency>

        <dependency>
            <groupId>org.springframework.integration</groupId>
            <artifactId>spring-integration-zookeeper</artifactId>
            <version>5.3.2.RELEASE</version>
        </dependency>

        <!-- https://mvnrepository.com/artifact/org.springframework.integration/spring-integration-amqp -->
        <dependency>
            <groupId>org.springframework.integration</groupId>
            <artifactId>spring-integration-amqp</artifactId>
            <version>5.3.2.RELEASE</version>
        </dependency>

        <dependency>
            <groupId>org.springframework.cloud</groupId>
            <artifactId>spring-cloud-stream-test-support</artifactId>
            <scope>test</scope>
        </dependency>

        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-batch</artifactId>
        </dependency>

        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-test</artifactId>
            <scope>test</scope>
        </dependency>


        <dependency>
            <groupId>org.springframework.cloud</groupId>
            <artifactId>spring-cloud-starter-task</artifactId>
        </dependency>
        <dependency>
            <groupId>org.hsqldb</groupId>
            <artifactId>hsqldb</artifactId>
        </dependency>


        <dependency>
            <groupId>mysql</groupId>
            <artifactId>mysql-connector-java</artifactId>
            <version>5.0.4</version>
        </dependency>

        <!-- https://mvnrepository.com/artifact/commons-dbcp/commons-dbcp -->
        <dependency>
            <groupId>commons-dbcp</groupId>
            <artifactId>commons-dbcp</artifactId>
            <version>1.2.2</version>
        </dependency>


        <dependency>
            <groupId>org.springframework.batch</groupId>
            <artifactId>spring-batch-integration</artifactId>
        </dependency>


    </dependencies>

    <dependencyManagement>
        <dependencies>
            <dependency>
                <groupId>org.springframework.cloud</groupId>
                <artifactId>spring-cloud-dependencies</artifactId>
                <version>${spring-cloud.version}</version>
                <type>pom</type>
                <scope>import</scope>
            </dependency>
        </dependencies>
    </dependencyManagement>

    <build>
        <plugins>
            <plugin>
                <groupId>org.springframework.boot</groupId>
                <artifactId>spring-boot-maven-plugin</artifactId>
            </plugin>
        </plugins>
    </build>

</project>
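The snippet below appears to be the suggested fix. According to the stack trace, KafkaProducerMessageHandler.getReplyTopic() throws because the request message carries no KafkaHeaders.REPLY_TOPIC header and no default reply topic can be derived; the handler seems to take that default from the partitions assigned to the ReplyingKafkaTemplate's reply container. The replyContainer bean above subscribes to the "reply" topic by name, so its partitions are only assigned after a consumer group rebalance; in the log the master step already fails at 00:27:50.491, while "partitions assigned: [reply-0]" only appears at 00:27:55.422. Building the ContainerProperties from an explicitly assigned TopicPartitionOffset avoids that race, because the reply partition is known as soon as the container starts: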
ContainerProperties containerProperties = new ContainerProperties(
    new TopicPartitionOffset("reply", 0));
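
For completeness, a minimal sketch (my assumption of how that change slots into the configuration above; the bean name, consumer factory, topic "reply" and partition 0 are taken from the post):

import org.springframework.kafka.core.ConsumerFactory;
import org.springframework.kafka.listener.ContainerProperties;
import org.springframework.kafka.listener.KafkaMessageListenerContainer;
import org.springframework.kafka.support.TopicPartitionOffset;

@Bean
public KafkaMessageListenerContainer<String, String> replyContainer(ConsumerFactory<String, String> cf) {
    // Assign partition 0 of the "reply" topic explicitly instead of subscribing by topic name,
    // so the gateway can determine the default reply topic before the first partition request is sent.
    ContainerProperties containerProperties =
            new ContainerProperties(new TopicPartitionOffset("reply", 0));
    return new KafkaMessageListenerContainer<>(cf, containerProperties);
}

Alternatively, the reply topic can be supplied per message by setting the KafkaHeaders.REPLY_TOPIC header before the outbound gateway (the commented-out outGateway method already adds that header to its manual ProducerRecord), but the container-based change matches the snippet shown here.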