
elasticsearch: parse my json file in logstash using a grok pattern?


I am trying to parse a json file into elasticsearch using logstash, but I can't get it to work. I think I need to write some grok pattern, but I don't know how. How can I send the json below to elasticsearch with logstash?

{ "machinename": "test1",
  "longdate": "2019-01-29 13:19:32",
  "level": "Error",
  "mysite": "test1",
  "message": "test2",
  "exception": "test3",
  "TimeStamp": "2019-01-29T13:19:32.257Z" }

My logstash config file:


input {
  file {
    path => ["P:/logs/*.txt"]
    start_position => "beginning"
    discover_interval => 10
    stat_interval => 10
    sincedb_write_interval => 10
    close_older => 10
    codec => multiline {
      negate => true
      what => "previous"
    }
  }
}

filter {
  date {
    match => ["TimeStamp", "ISO8601"]
  }
  json {
    source => "request"
    target => "parsedJson"
  }
}

output {
  stdout {
    codec => rubydebug
  }
  elasticsearch {
    hosts => [ "http://localhost:9200" ]
    index => "log-%{+YYYY.MM}"
  }
}



Error:

[2019-01-29T14:30:54,907][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2019-01-29T14:30:56,929][INFO ][logstash.runner] Starting Logstash {"logstash.version"=>"6.3.2"}
[2019-01-29T14:30:59,167][ERROR][logstash.agent] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of {, } at line 12, column 18 (byte 281) after input {\n file {\n\t path => [\"P:/logs/*.txt\"]\n\t\tstart_position => \"beginning\"\n\t\tdiscover_interval => 10\n\t\tstat_interval => 10\n\t\tsincedb_write_interval => 10\n\t\tclose_older => 10\n codec => multiline {\n\t\tpattern => \"^%{TIMESTAMP_ISO8601}\\|\"\n\t\tnegate => true\n what => ", :backtrace=>["P:/elk/logstash/logstash-core/lib/logstash/compiler.rb:50:in `compile_graph'", "P:/elk/logstash/logstash-core/lib/logstash/compiler.rb:12:in `block in compile_sources'", "org/jruby/RubyArray.java:2486:in `map'", "P:/elk/logstash/logstash-core/lib/logstash/compiler.rb:11:in `compile_sources'", "P:/elk/logstash/logstash-core/lib/logstash/pipeline.rb:49:in `initialize'", "P:/elk/logstash/logstash-core/lib/logstash/pipeline.rb:167:in `initialize'", "P:/elk/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:40:in `execute'", "P:/elk/logstash/logstash-core/lib/logstash/agent.rb:305:in `block in converge_state'"]}
[2019-01-29T14:31:00,417][INFO ][logstash.agent] Successfully started Logstash API endpoint {:port=>9600}
[2019-01-29T14:34:23,554][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2019-01-29T14:34:24,554][INFO ][logstash.runner] Starting Logstash {"logstash.version"=>"6.3.2"}
[2019-01-29T14:34:27,486][ERROR][logstash.codecs.multiline] Missing a required setting for the multiline codec plugin:

  codec {
    multiline {
      pattern => # SETTING MISSING
      ...
    }
  }

[2019-01-29T14:34:27,502][ERROR][logstash.agent] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Something is wrong with your configuration.", :backtrace=>["P:/elk/logstash/logstash-core/lib/logstash/config/mixin.rb:89:in `config_init'", "P:/elk/logstash/logstash-core/lib/logstash/codecs/base.rb:19:in `initialize'", "P:/elk/logstash/logstash-core/lib/logstash/plugins/plugin_factory.rb:97:in `plugin'", "P:/elk/logstash/logstash-core/lib/logstash/pipeline.rb:110:in `plugin'", "(eval):8:in `<eval>'", "org/jruby/RubyKernel.java:994:in `eval'", "P:/elk/logstash/logstash-core/lib/logstash/pipeline.rb:82:in `initialize'", "P:/elk/logstash/logstash-core/lib/logstash/pipeline.rb:167:in `initialize'", "P:/elk/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:40:in `execute'", "P:/elk/logstash/logstash-core/lib/logstash/agent.rb:305:in `block in converge_state'"]}
[2019-01-29T14:34:27,971][INFO ][logstash.agent] Successfully started Logstash API endpoint {:port=>9600}

You can try using the json filter plugin for logstash.

This way, the filter plugin in logstash will parse the json:

filter {
  json {
    source => "message"
  }
}
Another good addition is a tag on failure. This way, if the json is invalid or gets misinterpreted, you will still see the message in elasticsearch/kibana, but tagged with _jsonparsefailure:

  filter {
      json {
        source => "message"
        tag_on_failure => [ "_jsonparsefailure" ]
      }
    }
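For this question's events, the json filter above can be combined with the date filter from the original config. This is only a sketch: it assumes the json filter writes the parsed fields to the top level (no target set), so the TimeStamp field from the sample event becomes available to the date filter; field names are taken from the sample and may need adjusting:

    filter {
      # parse the raw event line as json; invalid lines get tagged instead of dropped
      json {
        source => "message"
        tag_on_failure => [ "_jsonparsefailure" ]
      }
      # use the event's own TimeStamp field as @timestamp
      date {
        match => ["TimeStamp", "ISO8601"]
      }
    }

With no target set, the parsed fields land at the event root, which is why the date filter can match on TimeStamp directly.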

I checked my json with "". It looks fine. Do I need a grok pattern?

No, you don't. grok is a filter plugin and json is also a filter plugin. In this example you don't need both of them; just use json. Then, if everything goes well, you can see the result in elasticsearch.

I think pattern => "^%{TIMESTAMP_ISO8601}\|" was causing a problem. I had to remove it, but that produced the "missing setting" error. I removed the grok; it produced the error above. Please see the "Error:" section.

Your multiline codec is missing the pattern option. In your use case it should be something like "%{TIMESTAMP_ISO8601}\|".
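Putting that conclusion back into the input block, the multiline codec would look something like the sketch below. It assumes, as the pattern in the error log suggests, that each new event starts with an ISO8601 timestamp followed by a pipe character:

    input {
      file {
        path => ["P:/logs/*.txt"]
        start_position => "beginning"
        codec => multiline {
          # lines NOT starting with "timestamp|" belong to the previous event
          pattern => "^%{TIMESTAMP_ISO8601}\|"
          negate => true
          what => "previous"
        }
      }
    }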