
Spring Integration Kafka: dynamically creating consumers


I am using Spring Integration Kafka. Below is a sample that dynamically creates consumers to receive messages and print them to the console. The consumer class:

// Imports assume the Spring Integration 4.x / spring-integration-kafka 1.x
// package layout implied by the XML configuration below.
import org.springframework.context.support.ClassPathXmlApplicationContext;
import org.springframework.integration.handler.ServiceActivatingHandler;
import org.springframework.integration.kafka.serializer.avro.AvroReflectDatumBackedKafkaDecoder;
import org.springframework.integration.kafka.support.*;
import org.springframework.messaging.MessageChannel;
import org.springframework.messaging.SubscribableChannel;

public class Consumer1 {

    private static final String CONFIG = "kafkaInboundMDCAdapterParserTests-context.xml";
    static ClassPathXmlApplicationContext ctx;

    public static void main(final String[] args) {
        ctx = new ClassPathXmlApplicationContext(CONFIG, Consumer1.class);
        ctx.start();
        addConsumer("test19", "default8");

        ctx = new ClassPathXmlApplicationContext(CONFIG, Consumer1.class);
        ctx.start();
        addConsumer("test19", "default10");
    }

    public static void addConsumer(String topicId, String groupId) {
        MessageChannel inputChannel = ctx.getBean("inputFromKafka", MessageChannel.class);

        ServiceActivatingHandler serviceActivator =
                new ServiceActivatingHandler(new MessageReceiver(), "processMessage");
        ((SubscribableChannel) inputChannel).subscribe(serviceActivator);

        KafkaConsumerContext<String, String> kafkaConsumerContext =
                ctx.getBean("consumerContext", KafkaConsumerContext.class);
        try {
            TopicFilterConfiguration topicFilterConfiguration =
                    new TopicFilterConfiguration(topicId, 1, false);

            ConsumerMetadata<String, String> consumerMetadata = new ConsumerMetadata<String, String>();
            consumerMetadata.setGroupId(groupId);
            consumerMetadata.setTopicFilterConfiguration(topicFilterConfiguration);
            consumerMetadata.setConsumerTimeout("1000");
            consumerMetadata.setKeyDecoder(new AvroReflectDatumBackedKafkaDecoder<String>(String.class));
            consumerMetadata.setValueDecoder(new AvroReflectDatumBackedKafkaDecoder<String>(String.class));

            ZookeeperConnect zkConnect = ctx.getBean("zookeeperConnect", ZookeeperConnect.class);

            ConsumerConfigFactoryBean<String, String> consumer =
                    new ConsumerConfigFactoryBean<String, String>(consumerMetadata, zkConnect);

            ConsumerConnectionProvider consumerConnectionProvider =
                    new ConsumerConnectionProvider(consumer.getObject());
            MessageLeftOverTracker<String, String> messageLeftOverTracker =
                    new MessageLeftOverTracker<String, String>();
            ConsumerConfiguration<String, String> consumerConfiguration =
                    new ConsumerConfiguration<String, String>(
                            consumerMetadata, consumerConnectionProvider, messageLeftOverTracker);

            kafkaConsumerContext.getConsumerConfigurations().put(groupId, consumerConfiguration);
        } catch (Exception exp) {
            exp.printStackTrace();
        }
    }
}

The inbound configuration file:

<int:channel id="inputFromKafka"/>

<int-kafka:zookeeper-connect id="zookeeperConnect" zk-connect="localhost:2181"
        zk-connection-timeout="6000"
        zk-session-timeout="6000"
        zk-sync-time="2000"/>

<int-kafka:inbound-channel-adapter id="kafkaInboundChannelAdapter"
        kafka-consumer-context-ref="consumerContext"
        auto-startup="false"
        channel="inputFromKafka">
    <int:poller fixed-delay="1" time-unit="MILLISECONDS"/>
</int-kafka:inbound-channel-adapter>

<bean id="kafkaReflectionDecoder" class="org.springframework.integration.kafka.serializer.avro.AvroReflectDatumBackedKafkaDecoder">
    <constructor-arg type="java.lang.Class" value="java.lang.String"/>
</bean>

<int-kafka:consumer-context id="consumerContext"
        consumer-timeout="1000"
        zookeeper-connect="zookeeperConnect">
    <int-kafka:consumer-configurations>
        <int-kafka:consumer-configuration group-id="default1"
                value-decoder="kafkaReflectionDecoder"
                key-decoder="kafkaReflectionDecoder"
                max-messages="5000">
            <int-kafka:topic id="mdc1" streams="1"/>
        </int-kafka:consumer-configuration>
    </int-kafka:consumer-configurations>
</int-kafka:consumer-context>

When I send a message to topic "test19", the configured service activator method processMessage prints the message twice, once for each of the two configured consumers. The problem is that I have to load the inbound configuration file again for each consumer before adding it to the consumer context; otherwise I receive only one message on the console. Is this the right approach, or do I need to change something here?


Thanks.

It's not entirely clear what you're trying to do, but you do have some problems.

By starting the context before subscribing your consumer, you can run into problems: during the short window between start and subscribe, the dispatcher has no subscribers on inputFromKafka.
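The lost-message race described above can be illustrated with a minimal stand-in. Note this TinyChannel is a toy written for this illustration, not Spring's actual DirectChannel (which throws an exception when no subscribers are present rather than returning false):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

// Toy subscribable channel: messages sent before subscribe() are lost,
// which is the race you hit when the context starts before you subscribe.
class TinyChannel {
    private final List<Consumer<String>> subscribers = new ArrayList<>();

    void subscribe(Consumer<String> handler) {
        subscribers.add(handler);
    }

    boolean send(String message) {
        if (subscribers.isEmpty()) {
            return false; // dropped: nobody is listening yet
        }
        subscribers.forEach(s -> s.accept(message));
        return true;
    }
}

public class RaceDemo {
    public static void main(String[] args) {
        TinyChannel channel = new TinyChannel();
        System.out.println(channel.send("early")); // false: lost before subscribe
        channel.subscribe(m -> System.out.println("got " + m));
        System.out.println(channel.send("late"));  // got late, then true
    }
}
```

The fix implied by the answer is ordering: register all subscribers before the message flow starts delivering.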
Why are you creating the service activator programmatically rather than declaring it in the context?


It is better to configure everything in the context; you can pass properties such as groupId into the context from the environment, using a property placeholder configurer.
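As a sketch of that property-placeholder approach, the consumer-context from the question could take its group id from the environment instead of being registered programmatically. The property name kafka.group.id is an assumption for illustration, not from the original:

```xml
<!-- Hypothetical sketch: resolve ${kafka.group.id} from a system or
     environment property, so each launch supplies its own group id,
     e.g. java -Dkafka.group.id=default8 ... -->
<context:property-placeholder/>

<int-kafka:consumer-context id="consumerContext"
        consumer-timeout="1000"
        zookeeper-connect="zookeeperConnect">
    <int-kafka:consumer-configurations>
        <int-kafka:consumer-configuration group-id="${kafka.group.id}"
                value-decoder="kafkaReflectionDecoder"
                key-decoder="kafkaReflectionDecoder"
                max-messages="5000">
            <int-kafka:topic id="test19" streams="1"/>
        </int-kafka:consumer-configuration>
    </int-kafka:consumer-configurations>
</int-kafka:consumer-context>
```

This requires the Spring context namespace to be declared in the configuration file; each consumer then runs from the same declarative configuration with only the property differing.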

Hello Gary, per my requirement I am trying to reduce the amount of configuration in the context. What I want is to create consumers dynamically and have the service activator distribute messages to all consumers.