
elasticsearch Logstash multiline custom JSON parsing fails


I have a Kafka queue containing JSON objects, which I populate with an offline Java-based producer. The structure of the JSON objects is shown in this sample:

{ 
    "key": "999998", 
    "message"  : "dummy \n Messages \n Line 1 ", 
    "type"  : "app_event", 
    "stackTrace" : "dummyTraces", 
    "tags" : "dummyTags" 
}
Note the \n in the message.
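For context, a \n inside a JSON string literal is an escape sequence, not a raw line break, so a standards-compliant JSON parser decodes it without any line joining. A minimal sketch in Python (Python is used here only for illustration; the producer in the question is Java-based):

```python
import json

# The sample object, serialized on a single line as a producer might emit it.
# The \\n in this source string is a literal backslash-n escape in the JSON text.
raw = '{"key": "999998", "message": "dummy \\n Messages \\n Line 1 ", "type": "app_event"}'

event = json.loads(raw)
# The escape arrives as a real newline inside the decoded string value,
# but the JSON document itself was a single parseable unit all along.
print(event["key"])
print(event["message"])
```

In other words, as long as each Kafka message holds one complete JSON document, the escaped newlines inside field values never require stitching lines back together.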

I loaded the queue with a million objects and started Logstash with the following configuration:

input {
        kafka {
                zk_connect => "localhost:2181"
                topic_id => "MemoryTest"
                type => "app_event"
                group_id => "dash_prod"
        } 
}

filter{
        if [type] == "app_event" {
                multiline {
                         pattern => "^\s"
                         what => "previous"
                }
        }
}

output {
    if [type] == "app_event" {
        stdout { 
         codec => rubydebug 
         }

          elasticsearch {
                  host => "localhost"
                  protocol => "http"
                  port => "9200"
                  index => "app_events"
                  index_type => "event"     
          }
    }
}
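Since each Kafka message already carries one complete JSON document, one alternative worth trying (a sketch only, assuming the same legacy Logstash 1.x kafka input options used above) is to let a json codec decode each message at the input and drop the multiline filter entirely:

```
input {
        kafka {
                zk_connect => "localhost:2181"
                topic_id => "MemoryTest"
                type => "app_event"
                group_id => "dash_prod"
                codec => json
        }
}
```

With the codec doing the decoding, each queue entry becomes exactly one event, and the escaped \n sequences inside the message field are turned into real newlines in the decoded string.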
The multiline filter is supposed to remove the \n from the message field. When I started Logstash, I ran into two problems:

None of the events are pushed to Elasticsearch; I get a _jsonparsefailure error. Note also that one event's message has "swallowed" the subsequent events:

{
       "message" => "{\n\t\"key\" : \"146982\",\n\t\"message\" : \"dummy \\n Messages \\n Line 1 \",\n\t\"type\" : \"app_event\",\n\t\"stackTrace\" : \"dummyTraces\",\n\t\"tags\" : \"dummyTags\" \n}\n{\n\t\"key\" : \"146983\",\n\t\"message\" : \"dummy \\n Messages \\n Line 1 \",\n\t\"type\" : \"app_event\",\n\t\"stackTrace\" : \"dummyTraces\",\n\t\"tags\" : \"dummyTags\" \n}\n{\n\t\"key\" : \"146984\",\n\t\"message\" : \"dummy \\n Messages \\n Line 1 \",\n\t\"type\" : \"app_event\",\n\t\"stackTrace\" : \"dummyTraces\",\n\t\"tags\" : \"dummyTags\" \n}",
          "tags" => [
        [0] "_jsonparsefailure",
        [1] "multiline"
    ],
      "@version" => "1",
    "@timestamp" => "2015-09-21T18:38:32.005Z",
          "type" => "app_event"
}
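The merging follows directly from the multiline settings in the configuration: every line of a pretty-printed JSON body starts with whitespace, so pattern => "^\s" matches it, and what => "previous" appends it to the preceding event. The closing "}" and the next document's opening "{" do not match, which is how consecutive documents end up fused into one event. A quick way to see which lines the pattern catches (Python used only for illustration):

```python
import re

# One pretty-printed JSON document, as Logstash would receive it line by line.
doc = '{\n\t"key": "999998",\n\t"message": "dummy \\n Messages \\n Line 1 ",\n\t"type": "app_event"\n}'

# multiline's pattern => "^\s" with what => "previous" appends every
# line that matches onto the line before it.
flags = [bool(re.match(r"^\s", line)) for line in doc.split("\n")]
print(flags)  # -> [False, True, True, True, False]
```

Only the opening and closing braces fail the pattern, so the body lines are folded upward, and any "{" line that follows a "}" line simply starts accumulating into the same buffer.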

After a few minutes, available heap memory hits its ceiling and Logstash stops.

A memory profile accompanied this question: after 13 minutes, Logstash hits the memory ceiling and stops responding.


I am trying to understand how multiline should be used in this scenario, and what is causing the memory crash.

As you have discovered, multiline is for combining multiple events into one.

What do you want the "message" field to become? — I want the \n removed from the message.

To replace part of a string, use the mutate filter's gsub option:
filter {
  mutate {
    gsub => [
      # replace all forward slashes with underscore
      "fieldname", "/", "_",
      # replace backslashes, question marks, hashes, and minuses
      # with a dot "."
      "fieldname2", "[\\?#-]", "."
    ]
  }
}
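Applied to this question, a sketch of the same idea that replaces the newlines in the message field with spaces might look like this (assuming the field name "message", which matches the rubydebug output shown earlier):

```
filter {
  mutate {
    gsub => [
      # replace literal newlines in the message with single spaces
      "message", "\n", " "
    ]
  }
}
```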