
Logstash unable to read Kafka-Avro schema messages


Is there any solution to this problem? I am unable to read Kafka-Avro schema messages. I am trying to send messages from Logstash to Kafka to HDFS.

Here is the technology stack:

  • Logstash 2.3 - the current production version
  • Confluent 3.0
  • Plugins: a. Logstash Kafka output plugin, b. Logstash codec avro
  • Zookeeper: 3.4.6
  • Kafka: 0.10.0.0

The Logstash config file looks like this:

    input {
      stdin {}
    }

    filter {
      mutate {
        remove_field => ["@timestamp", "@version"]
      }
    }

    output {
      kafka {
        topic_id => 'logstash_logs14'
        codec => avro {
          schema_uri => "/opt/logstash/bin/schema.avsc"
        }
      }
    }
    
The schema.avsc file looks like this:

    {
        "type": "record",
        "name": "myrecord",
        "fields": [
            { "name": "message", "type": "string" },
            { "name": "host", "type": "string" }
        ]
    }
    
Ran the following commands:

  • 1. Start Zookeeper in its own terminal:

    ./bin/zookeeper-server-start ./etc/kafka/zookeeper.properties

  • 2. Start Kafka in its own terminal:

    ./bin/kafka-server-start ./etc/kafka/server.properties

  • 3. Start the schema registry in its own terminal:

    ./bin/schema-registry-start ./etc/schema-registry/schema-registry.properties

  • 4. From the Logstash directory, run the following command:

    bin/logstash -f ./bin/logstash.conf

  • 5. Once the above command is running, type the log message to send to Kafka, e.g. "Hello World"

  • 6. Read from the Kafka topic:

    ./bin/kafka-avro-console-consumer --zookeeper localhost:2181 --topic logstash_logs14 --from-beginning

While consuming, we get the following error:
    
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:file:/root/confluent-3.0.0/share/java/kafka-serde-tools/slf4j-log4j12-1.7.6.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: Found binding in [jar:file:/root/confluent-3.0.0/share/java/confluent-common/slf4j-log4j12-1.7.6.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: Found binding in [jar:file:/root/confluent-3.0.0/share/java/schema-registry/slf4j-log4j12-1.7.6.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
    Processed a total of 1 messages
    [2016-06-08 18:42:41,627] ERROR Unknown error when running consumer:  (kafka.tools.ConsoleConsumer$:103)
    org.apache.kafka.common.errors.SerializationException: Error deserializing Avro message for id -1
    Caused by: org.apache.kafka.common.errors.SerializationException: Unknown magic byte!
    
Please tell me how to solve this problem.

Thanks,
    Upendra

How are you writing/publishing to Kafka? You are seeing the SerializationException because the data was not written using the schema registry (or the KafkaAvroSerializer), while kafka-avro-console-consumer internally uses the schema registry (or the KafkaAvroDeserializer), which expects the data in a specific format (specifically, a magic byte, then a 4-byte schema id, then the Avro-encoded data). If you use kafka-avro-console-producer to write the Avro data you should not get this exception; alternatively, you can set the key and value serializers in your producer properties and also set the schema registry URL:

    // Producer properties: serialize keys and values with Confluent's
    // KafkaAvroSerializer and point it at the schema registry.
    Properties props = new Properties();
    props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
    props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG,
          io.confluent.kafka.serializers.KafkaAvroSerializer.class);
    props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG,
          io.confluent.kafka.serializers.KafkaAvroSerializer.class);
    props.put("schema.registry.url", "http://localhost:8081");
    

Perhaps this answer comes too late, but I was facing the same problem recently.

Logstash uses the default serializer here, 'org.apache.kafka.common.serialization.StringSerializer'.

So if you want to read Avro messages from the event bus, you must serialize them on the Logstash output with the KafkaAvroSerializer, 'io.confluent.kafka.serializers.KafkaAvroSerializer',

and then use the matching deserializer on the consumer side (see the sketch after the listing below).
The problem is that Logstash does not recognize io.confluent at all, so you have to do something tricky to add it as dependencies and jars. I solved the KafkaAvroSerializer problem by dropping the jars into LS_HOME/vendor/lib and adding each jar to the classpath; vendor/lib ends up looking like this:

    avro-1.10.0.jar
    common-config-5.5.1.jar
    jni/
    jruby.jar
    kafka-avro-serializer-5.3.0.jar
    kafka-clients-2.5.0.jar
    kafka-schema-registry-client-5.3.0.jar
    kafka-streams-avro-serde-5.5.1.jar
    ruby/
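
As a sketch of what the matching deserializer looks like on the consumer side, assuming a plain Java consumer instead of the console tool (the topic name and registry URL are the ones used earlier in this thread; the group id is arbitrary):

    import java.time.Duration;
    import java.util.Collections;
    import java.util.Properties;

    import org.apache.avro.generic.GenericRecord;
    import org.apache.kafka.clients.consumer.ConsumerConfig;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;

    public class AvroConsumerExample {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
            props.put(ConsumerConfig.GROUP_ID_CONFIG, "avro-example");
            props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
            // The deserializer that matches KafkaAvroSerializer: it reads the
            // schema id from the message framing and fetches the schema from
            // the registry.
            props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG,
                    io.confluent.kafka.serializers.KafkaAvroDeserializer.class);
            props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG,
                    io.confluent.kafka.serializers.KafkaAvroDeserializer.class);
            props.put("schema.registry.url", "http://localhost:8081");

            try (KafkaConsumer<Object, Object> consumer = new KafkaConsumer<>(props)) {
                consumer.subscribe(Collections.singletonList("logstash_logs14"));
                ConsumerRecords<Object, Object> records = consumer.poll(Duration.ofSeconds(5));
                for (ConsumerRecord<Object, Object> rec : records) {
                    // Without a specific reader configured, values come back
                    // as GenericRecord.
                    GenericRecord value = (GenericRecord) rec.value();
                    System.out.println(value.get("message") + " from " + value.get("host"));
                }
            }
        }
    }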