Creating an abstraction for Kafka consumers with Ktor and Kotlin



I am creating an abstraction for Kafka consumers and producers to avoid repeating the same code everywhere. So I created a library called kafka-commons with Kotlin and Gradle and wrote the following code.

Kafka producer:

import org.apache.kafka.clients.producer.KafkaProducer
import org.apache.kafka.clients.producer.ProducerConfig.*
import org.apache.kafka.common.serialization.StringSerializer

fun producer(
    bootstrapServers: String,
    idempotence: Boolean,
    acks: Acks,
    retries: Int,
    requestPerConnection: Int,
    compression: Compression,
    linger: Int,
    batchSize: BatchSize
): KafkaProducer<String, Any> {
    val prop: HashMap<String, Any> = HashMap()
    prop[BOOTSTRAP_SERVERS_CONFIG] = bootstrapServers
    prop[KEY_SERIALIZER_CLASS_CONFIG] = StringSerializer::class.java.name
    prop[VALUE_SERIALIZER_CLASS_CONFIG] = StringSerializer::class.java.name
    prop[ENABLE_IDEMPOTENCE_CONFIG] = idempotence
    prop[ACKS_CONFIG] = acks.value
    prop[RETRIES_CONFIG] = retries
    prop[MAX_IN_FLIGHT_REQUESTS_PER_CONNECTION] = requestPerConnection
    prop[COMPRESSION_TYPE_CONFIG] = compression.value
    prop[LINGER_MS_CONFIG] = linger
    prop[BATCH_SIZE_CONFIG] = batchSize.value

    return KafkaProducer(prop)
}

import kotlin.coroutines.resume
import kotlin.coroutines.resumeWithException
import kotlin.coroutines.suspendCoroutine
import org.apache.kafka.clients.producer.Callback
import org.apache.kafka.clients.producer.ProducerRecord
import org.apache.kafka.clients.producer.RecordMetadata

suspend inline fun <reified K : Any, reified V : Any> KafkaProducer<K, V>.dispatch(record: ProducerRecord<K, V>) =
    suspendCoroutine<RecordMetadata> { continuation ->
        val callback = Callback { metadata, exception ->
            if (metadata == null) {
                continuation.resumeWithException(exception!!)
            } else {
                continuation.resume(metadata)
            }
        }
        this.send(record, callback)
    }
The command has:

id: a generated unique id
status: the message status, one of open / processing / closed / error
message: the object from the HTTP request, e.g. if there is a POST

For example, if there is an insert-user POST with body:

{id: 1, name: John, lastName: Wick}

then the message will be that object, and so on.
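The envelope described above could be sketched as a Kotlin data class. This is a hypothetical shape, the question's `toCommand`/`processStarted` helpers may use different names and fields:

```kotlin
import java.util.UUID

// Hypothetical sketch of the command envelope described in the question.
enum class CommandStatus { OPEN, PROCESSING, CLOSED, ERROR }

data class Command(
    val id: UUID,              // unique id created per command
    val status: CommandStatus, // open / processing / closed / error
    val message: Any           // the HTTP request body, e.g. the user JSON
)

fun main() {
    val cmd = Command(
        UUID.randomUUID(),
        CommandStatus.OPEN,
        mapOf("id" to 1, "name" to "John", "lastName" to "Wick")
    )
    println(cmd.status) // OPEN
}
```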

To create this command I wrote this function:

import java.util.UUID
import kotlinx.coroutines.coroutineScope
import kotlinx.coroutines.launch
import org.apache.kafka.clients.producer.ProducerRecord

suspend fun creatCommand(
    topicName: String,
    id: UUID,
    commandStatus: CommandStatus,
    request: Any,
    bootstrapServers: String,
    idempotence: Boolean,
    acks: Acks,
    retries: Int,
    requestPerConnection: Int,
    compression: Compression,
    linger: Int,
    batchSize: BatchSize
): Unit {
    val producer = producer(
        bootstrapServers,
        idempotence,
        acks,
        retries,
        requestPerConnection,
        compression,
        linger,
        batchSize)

    val command = toCommand(processStarted(id, commandStatus, request))
    val record = ProducerRecord<String, Any>(topicName, id.toString(), command)
    coroutineScope { launch { producer.dispatch(record) } }
}

Can someone help me?

If I understand correctly, you are having trouble mapping the JSON back to an object.

val mapper = ObjectMapper()
while (true) {
    val records = consumer.poll(Duration.ofMillis(100))
    for (record in records) {
        // command is already a Class<T>, so pass it directly;
        // record.value() is a String because of the StringDeserializer
        val cmd = mapper.readValue(record.value() as String, command)
        // do things with cmd
    }
}
Note: Kafka has its own JSON-to-POJO deserializers, and if you want to get the data into a database, Kafka Connect is generally more fault-tolerant than a plain consume loop.
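The mapping step can be tried in isolation, without a broker. This sketch uses a plain Jackson `ObjectMapper` with a mutable bean class (data classes would additionally need `jackson-module-kotlin`); `InsertUser` and `decode` are illustrative names, not part of the question's library:

```kotlin
import com.fasterxml.jackson.databind.ObjectMapper

// A plain ObjectMapper needs a no-arg constructor and mutable properties;
// use jackson-module-kotlin if you want immutable data classes instead.
class InsertUser {
    var id: Int = 0
    var name: String = ""
    var lastName: String = ""
}

// Generic mapping step, mirroring the Class<T> parameter in the question
fun <T> decode(mapper: ObjectMapper, json: String, target: Class<T>): T =
    mapper.readValue(json, target)

fun main() {
    val user = decode(
        ObjectMapper(),
        """{"id":1,"name":"John","lastName":"Wick"}""",
        InsertUser::class.java
    )
    println(user.name) // John
}
```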

fun Route.user(service: Service) =
    route("/api/access") {
        post("/test") {
            call.respond(service.command(call.receive()))
        }
    }

>>>>>> other class <<<<<<<<

class Service {
    fun command(/* all parameters */) { creatCommand(/* all parameters */) }
}
import org.apache.kafka.clients.consumer.ConsumerConfig.*
import org.apache.kafka.clients.consumer.KafkaConsumer
import org.apache.kafka.common.serialization.StringDeserializer

fun consumer(
    bootstrapServers: String,
    group: String,
    autoCommit: Boolean,
    offsetBehaviour: OffsetBehaviour,
    pollMax: Int
): KafkaConsumer<String, Any> {
    val prop: HashMap<String, Any> = HashMap()
    prop[BOOTSTRAP_SERVERS_CONFIG] = bootstrapServers
    prop[KEY_DESERIALIZER_CLASS_CONFIG] = StringDeserializer::class.java.name
    prop[VALUE_DESERIALIZER_CLASS_CONFIG] = StringDeserializer::class.java.name
    prop[GROUP_ID_CONFIG] = group
    // Kafka expects the string "earliest"/"latest"/"none" here, so
    // OffsetBehaviour must resolve to one of those values
    prop[AUTO_OFFSET_RESET_CONFIG] = offsetBehaviour
    prop[ENABLE_AUTO_COMMIT_CONFIG] = autoCommit
    prop[MAX_POLL_RECORDS_CONFIG] = pollMax

    return KafkaConsumer(prop)
}
fun <T> recordingCommand(
    command: Class<T>,
    topic: String,
    bootstrapServers: String,
    group: String,
    autoCommit: Boolean,
    offsetBehaviour: OffsetBehaviour,
    pollMax: Int
) {
    val consumer = consumer(bootstrapServers, group, autoCommit, offsetBehaviour, pollMax)
    consumer.subscribe(mutableListOf(topic))
    val mapper = ObjectMapper() // create once, outside the poll loop
    while (true) {
        val records = consumer.poll(Duration.ofMillis(100))
        for (record in records) {
            val cmd = mapper.readValue(record.value() as String, command)

            // >>>> I GOT LOST HERE <<<<
        }
    }
}
And in the service I want to call it like: service.insert(recordingCommand(/* all parameters */))
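For the "I got lost here" part, one way to finish the loop is to pass a handler lambda so `recordingCommand` stays generic and the caller decides what happens with each decoded command. This is a sketch assuming the `consumer()` helper and `OffsetBehaviour` type from the question, and that values arrive as JSON strings; the `handler` parameter is an illustrative addition, not part of the original library:

```kotlin
import java.time.Duration
import com.fasterxml.jackson.databind.ObjectMapper

// Sketch: the caller supplies what to do with each command, e.g.
// recordingCommand(Command::class.java, ..., handler = { service.insert(it) })
fun <T> recordingCommand(
    command: Class<T>,
    topic: String,
    bootstrapServers: String,
    group: String,
    autoCommit: Boolean,
    offsetBehaviour: OffsetBehaviour,
    pollMax: Int,
    handler: (T) -> Unit
) {
    val consumer = consumer(bootstrapServers, group, autoCommit, offsetBehaviour, pollMax)
    consumer.subscribe(listOf(topic))
    val mapper = ObjectMapper() // reuse one mapper across iterations
    while (true) {
        for (record in consumer.poll(Duration.ofMillis(100))) {
            // the StringDeserializer delivers the value as a JSON string
            val cmd = mapper.readValue(record.value() as String, command)
            handler(cmd) // e.g. insert into the database
        }
    }
}
```

Passing the handler in keeps the Kafka plumbing in kafka-commons while each service keeps its own business logic, which matches the goal of not repeating code.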