
Apache Kafka: JsonParseException when reading data with the Kafka REST API


Kafka topic (test3)

Consumer (Kafka REST API on localhost:8082)

  • Create a consumer with a
    POST
    request to
    http://localhost:8082/consumers/rested
  • Request body:

     {
       "format": "json",
       "auto.offset.reset": "earliest",
       "auto.commit.enable": "false"
     }
    
    {
        "topics": [
          "test3"
        ]
    }
    
    Response body:

    {
       "instance_id": "rest-consumer-dfa6ee0e-4f24-46dc-b0dc-dda3b80866ff",
    
       "base_uri": "http://rest-proxy:8082/consumers/rested/instances/rest-consumer-dfa6ee0e-4f24-46dc-b0dc-dda3b80866ff"
    
    }
    
  • Subscribe with a
    POST
    to
    http://localhost:8082/consumers/rested/instances/rest-consumer-dfa6ee0e-4f24-46dc-b0dc-dda3b80866ff/subscription
  • With the headers:

    Host: http://localhost:8082
    Content-Type: application/vnd.kafka.v2+json
    
    And the request body:

     {
       "format": "json",
       "auto.offset.reset": "earliest",
       "auto.commit.enable": "false"
     }
    
    {
        "topics": [
          "test3"
        ]
    }
    
    This returns a 204 No Content response.

  • Read the records with a
    GET
    request to http://localhost:8082/consumers/rested/instances/rest-consumer-dfa6ee0e-4f24-46dc-b0dc-dda3b80866ff/records
  • With the headers:

    Host: http://localhost:8082
    Accept: application/vnd.kafka.json.v2+json
    
    This returns the response:

    {
        "error_code": 50002,
        "message": "Kafka error: com.fasterxml.jackson.core.JsonParseException: Unrecognized token 'key': was expecting ('true', 'false' or 'null')\n at [Source: (byte[])\"key\"; line: 1, column: 7]"
    }
    
    How do we resolve this issue and make sure we receive the data?

    Exception (on Kafka)

    The logs of the running Kafka REST Proxy server show the following exception:

    rest-proxy         | [2018-12-31 03:09:27,232] INFO 172.25.0.1 - - [31/Dec/2018:03:09:26 +0000] "GET /consumers/rest-consumer/instances/rest-consumer-8e49873e-13ce-46a5-be1f-0237a0369efe/records HTTP/1.1" 500 211  341 (io.confluent.rest-utils.requests)
    rest-proxy         | [2018-12-31 03:09:27,235] ERROR Unexpected exception in consumer read task id=io.confluent.kafkarest.v2.KafkaConsumerReadTask@59611e28  (io.confluent.kafkarest.v2.KafkaConsumerReadTask)
    rest-proxy         | org.apache.kafka.common.errors.SerializationException: com.fasterxml.jackson.core.JsonParseException: Unrecognized token 'key': was expecting ('true', 'false' or 'null')
    rest-proxy         |  at [Source: (byte[])"key"; line: 1, column: 7]
    rest-proxy         | Caused by: com.fasterxml.jackson.core.JsonParseException: Unrecognized token 'key': was expecting ('true', 'false' or 'null')
    rest-proxy         |  at [Source: (byte[])"key"; line: 1, column: 7]
    rest-proxy         |    at com.fasterxml.jackson.core.JsonParser._constructError(JsonParser.java:1804)
    rest-proxy         |    at com.fasterxml.jackson.core.base.ParserMinimalBase._reportError(ParserMinimalBase.java:679)
    rest-proxy         |    at com.fasterxml.jackson.core.json.UTF8StreamJsonParser._reportInvalidToken(UTF8StreamJsonParser.java:3526)
    rest-proxy         |    at com.fasterxml.jackson.core.json.UTF8StreamJsonParser._handleUnexpectedValue(UTF8StreamJsonParser.java:2621)
    rest-proxy         |    at com.fasterxml.jackson.core.json.UTF8StreamJsonParser._nextTokenNotInObject(UTF8StreamJsonParser.java:826)
    rest-proxy         |    at com.fasterxml.jackson.core.json.UTF8StreamJsonParser.nextToken(UTF8StreamJsonParser.java:723)
    rest-proxy         |    at com.fasterxml.jackson.databind.ObjectMapper._initForReading(ObjectMapper.java:4141)
    rest-proxy         |    at com.fasterxml.jackson.databind.ObjectMapper._readMapAndClose(ObjectMapper.java:4000)
    rest-proxy         |    at com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:3091)
    rest-proxy         |    at io.confluent.kafkarest.v2.JsonKafkaConsumerState.deserialize(JsonKafkaConsumerState.java:79)
    rest-proxy         |    at io.confluent.kafkarest.v2.JsonKafkaConsumerState.createConsumerRecord(JsonKafkaConsumerState.java:64)
    rest-proxy         |    at io.confluent.kafkarest.v2.KafkaConsumerReadTask.maybeAddRecord(KafkaConsumerReadTask.java:158)
    rest-proxy         |    at io.confluent.kafkarest.v2.KafkaConsumerReadTask.addRecords(KafkaConsumerReadTask.java:142)
    rest-proxy         |    at io.confluent.kafkarest.v2.KafkaConsumerReadTask.doPartialRead(KafkaConsumerReadTask.java:99)
    rest-proxy         |    at io.confluent.kafkarest.v2.KafkaConsumerManager$RunnableReadTask.run(KafkaConsumerManager.java:370)
    rest-proxy         |    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    rest-proxy         |    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    rest-proxy         |    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    rest-proxy         |    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    rest-proxy         |    at java.lang.Thread.run(Thread.java:748)
    
    Consumer Groups CLI

    I can see the consumer group on the CLI, but it has no active members:

    $ kafka-consumer-groups --bootstrap-server broker:9092 --list
    
    The result is:

    console-consumer-60695
    console-consumer-62259
    console-consumer-19307
    console-consumer-47906
    console-consumer-40838
    rested
    
    However, when I try to retrieve the members:

    $ kafka-consumer-groups --bootstrap-server localhost:29092 --group rest-consumer --describe --members
    
    Consumer group 'rested' has no active members.
    
    TL;DR

    You need to wrap the key in double quotes: not because all keys must be quoted, but because with the JSON consumer the key itself must be valid JSON, and a double-quoted string is valid JSON.

    If you really need to process this message as-is, you will have to read it in some format other than JSON.

    Long answer

    You have a record whose key is not quoted, which makes it invalid JSON. So when the Jackson JSON parser tries to parse the key, it fails. The error message does not spell this out, but when Jackson sees neither quotes, brackets, nor braces, it assumes the token must be a boolean or null.
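    The failure is easy to see with any strict JSON parser, not just Jackson. As a quick illustration (using Python's standard `json` module rather than Jackson, so the error text differs, but the rule is the same):

```python
import json

# A double-quoted string is a valid top-level JSON document...
print(json.loads('"key"'))  # prints: key

# ...but the bare token `key` is not valid JSON at all, which is
# exactly what the proxy's deserializer trips over.
try:
    json.loads('key')
except json.JSONDecodeError as exc:
    print("invalid JSON:", exc)
```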

    Here you can see where the proxy grabs the key and tries to decode it as JSON.

    I can reproduce your error like this:

    curl -X POST -H "Content-Type: application/vnd.kafka.v2+json" \
          --data '{"name": "my_consumer_instance", "format": "json", "auto.offset.reset": "latest"}' \
          http://localhost:8082/consumers/my_json_consumer
    
    curl -X POST -H "Content-Type: application/vnd.kafka.v2+json" --data '{"topics":["testjsontopic"]}' \
     http://localhost:8082/consumers/my_json_consumer/instances/my_consumer_instance/subscription
    
    
    ./bin/kafka-console-producer \
      --broker-list :9092 \
      --topic testjsontopic \
      --property parse.key=true \
      --property key.separator="&"
    
    >"key"&{"foo":"bar"}
    
    (Ctrl-C to stop the producer)
    
    curl -X GET -H "Accept: application/vnd.kafka.json.v2+json" \
          http://localhost:8082/consumers/my_json_consumer/instances/my_consumer_instance/records
    
    At this point I can read the records. But when I produce a record whose key has no quotes, I get the same error as you:

    ./bin/kafka-console-producer \
      --broker-list :9092 \
      --topic testjsontopic \
      --property parse.key=true \
      --property key.separator="&"
    
    >key&{"foo":"bar"}
    
    Now when I make this request:

    curl -X GET -H "Accept: application/vnd.kafka.json.v2+json" \
          http://localhost:8082/consumers/my_json_consumer/instances/my_consumer_instance/records
    
    I now get this error:

    com.fasterxml.jackson.core.JsonParseException: Unrecognized token 'key': was expecting ('true', 'false' or 'null')

    You can also read the topic's keys with this:

    ./bin/kafka-console-consumer --bootstrap-server localhost:9092 --topic testjsontopic --property print.key=true --from-beginning
    

    When we shut down the topics, the confluent 5.1.0 docker container was still tracking them. It may have been the latter rather than the former. So I deleted those topics with the kafka-topics CLI, and then it worked fine. I had the same problem.

    If I want to consume logs whose keys are not quoted through kafka-rest, I would read them in the binary format (application/vnd.kafka.binary.v2+json), but how do I get the real values back out of the binary format? (Note: I query kafka-rest with Python requests.)
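    In the binary format, the proxy base64-encodes each record's key and value, so after the GET you decode them yourself. A minimal sketch of that decoding step, assuming the usual v2 records-response shape; the sample payload below is made up for illustration rather than fetched from a live proxy:

```python
import base64
import json

def decode_binary_records(body):
    """Decode a Kafka REST Proxy v2 binary-format records response.

    With application/vnd.kafka.binary.v2+json, each record's key and
    value arrive as base64 strings (or null); decode both fields.
    """
    records = []
    for rec in json.loads(body):
        decoded = dict(rec)
        for field in ("key", "value"):
            if rec.get(field) is not None:
                decoded[field] = base64.b64decode(rec[field]).decode("utf-8")
        records.append(decoded)
    return records

# Hypothetical response body for one record with the unquoted key `key`
# and the value {"foo":"bar"}:
sample = json.dumps([{
    "topic": "test3",
    "partition": 0,
    "offset": 0,
    "key": base64.b64encode(b"key").decode("ascii"),
    "value": base64.b64encode(b'{"foo":"bar"}').decode("ascii"),
}])

print(decode_binary_records(sample)[0]["value"])  # prints: {"foo":"bar"}
```

    With requests, the same decoding applies to `response.text` from the records endpoint, as long as the `Accept: application/vnd.kafka.binary.v2+json` header is set and the consumer was created with `"format": "binary"`.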