Java: How do I correctly create a consumer for an integration test to verify that messages have been pushed to Kafka?


I use Avro to generate a Java class (Heartbeat), and I use a Spring Cloud Stream Processor to push messages to Kafka with this Heartbeat class.

This is my service:

import org.springframework.cloud.stream.messaging.Processor;
import org.springframework.kafka.support.KafkaHeaders;
import org.springframework.messaging.Message;
import org.springframework.messaging.support.MessageBuilder;
import org.springframework.stereotype.Service;

// Sends Heartbeat messages to the binder's output channel.
@Service
public class HeartbeatServiceImpl implements HeartbeatService {

  private Processor processor;

  public HeartbeatServiceImpl(Processor processor) {
    this.processor = processor;
  }

  @Override
  public boolean sendHeartbeat(Heartbeat heartbeat) {
    Message<Heartbeat> message =
        MessageBuilder.withPayload(heartbeat).setHeader(KafkaHeaders.MESSAGE_KEY, "MY_KEY").build();
    return processor.output().send(message);
  }

}
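
For the Processor binding used above to exist, the application has to enable it somewhere; presumably something like the following, a minimal sketch using the annotation-based Spring Cloud Stream model (the class name HeartbeatApplication is an assumption, not from the original post):

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cloud.stream.annotation.EnableBinding;
import org.springframework.cloud.stream.messaging.Processor;

@SpringBootApplication
@EnableBinding(Processor.class) // creates the "input" and "output" channels bound in application.properties
public class HeartbeatApplication { // hypothetical application class

  public static void main(String[] args) {
    SpringApplication.run(HeartbeatApplication.class, args);
  }
}
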
I can see by running ksql that the message does arrive in Kafka, so the message is there as expected. I also receive the message in my HeartbeatKafkaConsumer, but when I make the assertion I get the following error:

Expecting:
 <"ConsumerRecord(topic = HEARTBEAT, partition = 0, leaderEpoch = 0, offset = 4, CreateTime = 1622134829899, serialized key size = 12, serialized value size = 75, headers = RecordHeaders(headers = [], isReadOnly = false), key = [B@765aa560, value = [B@3582e1cd)">
to be equal to:
 <{"ID": "my-Test ID", "TYPE": "null", "MSG_ID": "my-Test 123", "MSG_TIME": 12345, "RECEIVED_TIME": 12345, "INPUT_SOURCE": "my-Test IS", "SBK_FEED_PROVIDER_ID": "null", "SBK_FEED_PROVIDER_NAME": "null"}>

And this is my integration test:

@SpringBootTest // assumed; required so the @Autowired fields below are populated (not shown in the original snippet)
public class HeartbeatServiceImplIntegrationTest {

  @Autowired
  private HeartbeatServiceImpl heartbeatService;

  @Autowired
  private HeartbeatKafkaConsumer heartbeatKafkaConsumer;

  @Test
  public void assertHeartbeatPushedToKafka() throws InterruptedException {
    Heartbeat heartbeat =
        Heartbeat.newBuilder().setID("my-Test ID").setINPUTSOURCE("my-Test IS")
            .setMSGID("my-Test 123").setMSGTIME(12345L).setRECEIVEDTIME(12345L).build();

    boolean isMessageSent = heartbeatService.sendHeartbeat(heartbeat);
    assertThat(isMessageSent).isTrue();

    heartbeatKafkaConsumer.getLatch().await(10000, TimeUnit.MILLISECONDS);

    assertThat(heartbeatKafkaConsumer.getLatch().getCount()).isEqualTo(0L);
    assertThat(heartbeatKafkaConsumer.getPayload()).isEqualTo(heartbeat);

  }

}
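
The HeartbeatKafkaConsumer itself is not shown in the post; for context, here is a minimal sketch of what such a latch-based test consumer typically looks like, inferred from the getLatch()/getPayload() calls above and from the ConsumerRecord with byte[] key/value visible in the error (the listener method name and the exact payload type are assumptions):

import java.util.concurrent.CountDownLatch;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Component;

@Component
public class HeartbeatKafkaConsumer {

  private final CountDownLatch latch = new CountDownLatch(1);
  private ConsumerRecord<byte[], byte[]> payload;

  @KafkaListener(topics = "HEARTBEAT")
  public void receive(ConsumerRecord<byte[], byte[]> consumerRecord) {
    this.payload = consumerRecord;
    latch.countDown(); // releases the test's await() once a record arrives
  }

  public CountDownLatch getLatch() {
    return latch;
  }

  public ConsumerRecord<byte[], byte[]> getPayload() {
    return payload;
  }
}
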
And this is my application.properties:
spring.cloud.stream.default.producer.useNativeEncoding=true
spring.cloud.stream.default.consumer.useNativeEncoding=true
spring.cloud.stream.bindings.input.destination=HEARTBEAT
spring.cloud.stream.bindings.input.content-type=application/*+avro
spring.cloud.stream.bindings.output.destination=HEARTBEAT
spring.cloud.stream.bindings.output.content-type=application/*+avro
spring.cloud.stream.kafka.binder.producer-properties.schema.registry.url=http://localhost:8081
spring.cloud.stream.kafka.binder.producer-properties.key.serializer=io.confluent.kafka.serializers.KafkaAvroSerializer
spring.cloud.stream.kafka.binder.producer-properties.value.serializer=io.confluent.kafka.serializers.KafkaAvroSerializer
spring.cloud.stream.kafka.binder.consumer-properties.schema.registry.url=http://localhost:8081
spring.cloud.stream.kafka.binder.consumer-properties.key.deserializer=io.confluent.kafka.serializers.KafkaAvroDeserializer
spring.cloud.stream.kafka.binder.consumer-properties.value.deserializer=io.confluent.kafka.serializers.KafkaAvroDeserializer
spring.cloud.stream.kafka.binder.consumer-properties.specific.avro.reader=true
spring.kafka.bootstrap-servers=127.0.0.1:9092
spring.kafka.consumer.group-id=myclient
spring.kafka.consumer.auto-offset-reset=earliest

Use consumerRecord.value() instead of its toString().

It will be a byte[], which you can pass into an ObjectMapper to deserialize it as a Heartbeat.

Or, simply configure the consumer to use a JsonDeserializer; consumerRecord.value() will then be a Heartbeat. You need to configure the deserializer to tell it which type to create.

The third (and easiest) option is to add a JsonMessageConverter @Bean (Boot will wire it into the listener container) and change your method to

@KafkaListener(topics = "HEARTBEAT")
public void listen(Heartbeat heartbeat) {
    ...
}

The framework tells the converter what type to create, based on the method signature. Sketches of all three options follow.
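
A minimal sketch of the first option, assuming the payload held by the consumer is the raw ConsumerRecord<byte[], byte[]> (as the error output suggests) and that the bytes are JSON that Jackson can bind to the Avro-generated Heartbeat; the helper class and method names are illustrative:

import java.io.IOException;

import com.fasterxml.jackson.databind.ObjectMapper;
import org.apache.kafka.clients.consumer.ConsumerRecord;

public final class HeartbeatAssertions { // hypothetical helper

  private static final ObjectMapper MAPPER = new ObjectMapper();

  // Deserialize the raw record value instead of comparing toString() output.
  static Heartbeat toHeartbeat(ConsumerRecord<byte[], byte[]> record) throws IOException {
    return MAPPER.readValue(record.value(), Heartbeat.class);
  }
}

The failing assertion then becomes assertThat(toHeartbeat(heartbeatKafkaConsumer.getPayload())).isEqualTo(heartbeat).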
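
A sketch of the second option, building a standalone test consumer whose value deserializer is Spring Kafka's JsonDeserializer; the bootstrap address, group id, and topic are taken from the configuration above, everything else is an assumption:

import java.time.Duration;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.springframework.kafka.support.serializer.JsonDeserializer;

public final class HeartbeatTestConsumer { // hypothetical test helper

  static ConsumerRecords<String, Heartbeat> pollHeartbeats() {
    Map<String, Object> props = new HashMap<>();
    props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "127.0.0.1:9092");
    props.put(ConsumerConfig.GROUP_ID_CONFIG, "myclient");
    props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");

    // Passing Heartbeat.class tells the deserializer which type to create.
    try (KafkaConsumer<String, Heartbeat> consumer = new KafkaConsumer<>(
        props, new StringDeserializer(), new JsonDeserializer<>(Heartbeat.class))) {
      consumer.subscribe(List.of("HEARTBEAT"));
      return consumer.poll(Duration.ofSeconds(10)); // value() is now a Heartbeat, not a byte[]
    }
  }
}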
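
And a sketch of the third option: declare a JsonMessageConverter bean and let Boot wire it into the auto-configured listener container factory, so a @KafkaListener method can take Heartbeat directly (the configuration class name is illustrative):

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.support.converter.JsonMessageConverter;

@Configuration
public class HeartbeatListenerConfig { // hypothetical configuration class

  // Boot picks up a single message-converter bean for @KafkaListener containers.
  @Bean
  public JsonMessageConverter jsonMessageConverter() {
    return new JsonMessageConverter();
  }
}

With the converter in place, the listener shown earlier receives the converted payload, and the framework derives the target type (Heartbeat) from the method signature.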