
Logstash starts, but does not load data into Elasticsearch


I am trying to read JSON-formatted entries from a log file and load them into Elasticsearch. I am using Logstash.

OS: Windows 10. Elasticsearch version: 7.6.2. Logstash version: 7.6.2.

The log file content looks like this:

{"@timestamp":"2020-05-03T15:09:38.255+02:00","@version":1,"message":"The following profiles are active: default","logger_name":"payroll.employee.EmployeeApplication","thread_name":"main","level":"INFO","level_value":20000,"springAppName":"employee"}
{"@timestamp":"2020-05-03T15:09:59.136+02:00","@version":1,"message":"Started EmployeeApplication in 24.892 seconds (JVM running for 27.193)","logger_name":"payroll.employee.EmployeeApplication","thread_name":"main","level":"INFO","level_value":20000,"springAppName":"employee"}
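Each line of the log file is a self-contained JSON object (newline-delimited JSON), which is the shape a JSON codec in Logstash expects. As a quick offline sanity check (a minimal Python sketch, not part of the original question), each line should parse independently:

```python
import json

# Two sample lines copied verbatim from the log file in the question.
sample_lines = [
    '{"@timestamp":"2020-05-03T15:09:38.255+02:00","@version":1,"message":"The following profiles are active: default","logger_name":"payroll.employee.EmployeeApplication","thread_name":"main","level":"INFO","level_value":20000,"springAppName":"employee"}',
    '{"@timestamp":"2020-05-03T15:09:59.136+02:00","@version":1,"message":"Started EmployeeApplication in 24.892 seconds (JVM running for 27.193)","logger_name":"payroll.employee.EmployeeApplication","thread_name":"main","level":"INFO","level_value":20000,"springAppName":"employee"}',
]

# Parse each line on its own, the way a per-line JSON codec would.
events = [json.loads(line) for line in sample_lines]
for event in events:
    print(event["@timestamp"], event["level"], event["message"])
```

If every line parses without error, the file itself is well-formed and the problem is further down the pipeline.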
Here is the logstash.conf file:

input {
    file {
        path => "C:/Users/User/Desktop/Git-Repos/Microservice/elk-logs/employee.log"
        start_position => "beginning"
        sincedb_path => "NUL"
    }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "logback-%{+YYYY.MM.dd}"
    #user => "elastic"
    #password => "changeme"
  }
}
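As a debugging variant of the configuration above (a sketch, not the configuration from the question), a stdout output can be added alongside the elasticsearch output so events are echoed to the console, and a json codec on the file input parses each log line into fields instead of leaving it as a raw message string:

```
input {
    file {
        path => "C:/Users/User/Desktop/Git-Repos/Microservice/elk-logs/employee.log"
        start_position => "beginning"
        sincedb_path => "NUL"
        # Optional: parse each JSON line into event fields
        codec => "json"
    }
}

output {
    # Echo events to the console to confirm the file is being read
    stdout { codec => json }
    elasticsearch {
        hosts => ["http://localhost:9200"]
        index => "logback-%{+YYYY.MM.dd}"
    }
}
```

If events appear on stdout but not in Elasticsearch, the file input is working and the problem is on the Elasticsearch (or Kibana) side.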
I run Logstash as follows:

logstash.bat -f C:\Users\User\Desktop\Apps\logstash-7.6.2\config\logstash.conf
The output is as follows:

{"@timestamp":"2020-05-03T15:09:38.255+02:00","@version":1,"message":"The following profiles are active: default","logger_name":"payroll.employee.EmployeeApplication","thread_name":"main","level":"INFO","level_value":20000,"springAppName":"employee"}
{"@timestamp":"2020-05-03T15:09:59.136+02:00","@version":1,"message":"Started EmployeeApplication in 24.892 seconds (JVM running for 27.193)","logger_name":"payroll.employee.EmployeeApplication","thread_name":"main","level":"INFO","level_value":20000,"springAppName":"employee"}
[2020-05-03T17:52:08,394][INFO ][logstash.outputs.elasticsearch][main] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["http://localhost:9200"]}
[2020-05-03T17:52:08,456][INFO ][logstash.outputs.elasticsearch][main] Using default mapping template
[2020-05-03T17:52:08,519][WARN ][org.logstash.instrument.metrics.gauge.LazyDelegatingGauge][main] A gauge metric of an unknown type (org.jruby.specialized.RubyArrayOneObject) has been created for key: cluster_uuids. This may result in invalid serialization.  It is recommended to log an issue to the responsible developer/development team.
[2020-05-03T17:52:08,534][INFO ][logstash.javapipeline    ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>500, "pipeline.sources"=>["C:/Users/User/Desktop/Apps/logstash-7.6.2/config/logstash.conf"], :thread=>"#<Thread:0x6e5e96fc run>"}
[2020-05-03T17:52:08,597][INFO ][logstash.outputs.elasticsearch][main] Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
[2020-05-03T17:52:10,327][INFO ][logstash.javapipeline    ][main] Pipeline started {"pipeline.id"=>"main"}
[2020-05-03T17:52:10,380][INFO ][filewatch.observingtail  ][main] START, creating Discoverer, Watch with file and sincedb collections
[2020-05-03T17:52:10,411][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2020-05-03T17:52:10,929][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}

Now I am even more confused. Is the problem in Logstash, Elasticsearch, or Kibana? I cannot see any of the information discussed here in Kibana. The problem turned out to be the index name pattern.


Changing the pattern to logback-* worked perfectly.
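The index => "logback-%{+YYYY.MM.dd}" setting in the output block makes Logstash write each event to a daily index named from the event's timestamp (in UTC), so the documents here land in an index such as logback-2020.05.03. A rough Python illustration of that naming (an illustration only, assuming the Joda pattern YYYY.MM.dd maps to year.month.day):

```python
from datetime import datetime, timezone

def daily_index(timestamp_iso: str, prefix: str = "logback-") -> str:
    """Mimic the daily index naming of index => "logback-%{+YYYY.MM.dd}" (UTC-based)."""
    ts = datetime.fromisoformat(timestamp_iso).astimezone(timezone.utc)
    return prefix + ts.strftime("%Y.%m.%d")

print(daily_index("2020-05-03T15:09:38.255+02:00"))  # logback-2020.05.03
```

This is why the Kibana index pattern must start with logback-, not the default logstash- prefix.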

Try the following to see whether Logstash is actually reading the file, so you can confirm the fault is on the Elasticsearch side: output { stdout { codec => json } }

Thanks for the reply. I added that line, and it seems Logstash is reading the log file but not adding anything to Elasticsearch. Is this a fault of Elasticsearch or of Kibana?

Most likely Elasticsearch. Try hitting the following endpoint:
curl -XGET "localhost:9200/_search?q=*:*"
Can you find the inserted records? How are you querying in Kibana? Which index are you searching? Your index name is logback-*; maybe that is the problem.

After this change, everything works. Thank you very much.
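The fix makes sense: a Kibana index pattern only surfaces indices whose names match its glob, so a pattern like the default logstash-* (an assumption about what was configured originally) would never see the logback-2020.05.03 index that Logstash actually created. A small glob-matching sketch, using Python's fnmatch as a stand-in for Kibana's pattern matching:

```python
from fnmatch import fnmatch

index_name = "logback-2020.05.03"  # what the output block in logstash.conf creates

# A logstash-* pattern misses the index entirely...
print(fnmatch(index_name, "logstash-*"))  # False
# ...while the corrected pattern matches it.
print(fnmatch(index_name, "logback-*"))   # True
```

So the data was in Elasticsearch all along; Kibana was simply looking at the wrong set of indices.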