
Kibana does not create an index for my data. What do I need?

Tags: elasticsearch, logstash, kibana


I need to be able to create an index containing the data from my log file.

My pipeline configuration is /etc/logstash/conf.d/apache-01.conf. I have already tried setting sincedb_path to /dev/null, and also deleting the .sincedb_xxx files under /var/lib/logstash/plugins/inputs/file.
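A minimal sketch of what such a pipeline typically looks like (the path, the grok pattern, and the geoip filter here are assumptions inferred from the Logstash log shown below, not the actual file):

input {
  file {
    # path taken from the "No sincedb_path set" line in the log below
    path => "/test/domainname*"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  # assumed: typical Apache access-log parsing
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
  # the log below shows the geoip filter being loaded
  geoip {
    source => "clientip"
  }
}
output {
  elasticsearch {
    # host taken from the elasticsearch output lines in the log below
    hosts => ["localhost:9200"]
  }
}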

When I run the command:

>> curl http://localhost:9200/_cat/indices
green  open .kibana                     N2gR01kcSMaT74Pj93NqwA 1 0     1 0   4kb   4kb
yellow open metricbeat-6.4.3-2018.11.08 rpBMeq-XS7yGeOd49Wakhw 1 1 14285 0 7.9mb 7.9mb
it should normally also return an index such as logstash-2018.11.01, but none appears.

The log file /var/log/logstash/logstash-plain.log shows:

[2018-11-08T10:05:15,808][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"6.4.3"}
[2018-11-08T10:05:17,493][INFO ][logstash.pipeline        ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>8, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50}
[2018-11-08T10:05:17,825][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2018-11-08T10:05:17,834][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://localhost:9200/, :path=>"/"}
[2018-11-08T10:05:17,997][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2018-11-08T10:05:18,048][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>6}
[2018-11-08T10:05:18,051][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>6}
[2018-11-08T10:05:18,075][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost:9200"]}
[2018-11-08T10:05:18,093][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2018-11-08T10:05:18,109][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2018-11-08T10:05:18,212][INFO ][logstash.filters.geoip   ] Using geoip database {:path=>"/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-filter-geoip-5.0.3-java/vendor/GeoLite2-City.mmdb"}
[2018-11-08T10:05:18,440][INFO ][logstash.inputs.file     ] No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"/var/lib/logstash/plugins/inputs/file/.sincedb_ae5e62ef229d5a1776eda86789823900", :path=>["/test/domainname*"]}
[2018-11-08T10:05:18,528][INFO ][logstash.pipeline        ] Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x45edc3cf run>"}
[2018-11-08T10:05:18,566][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2018-11-08T10:05:18,576][INFO ][filewatch.observingtail  ] START, creating Discoverer, Watch with file and sincedb collections
[2018-11-08T10:05:18,784][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
What have I tried?

- using /dev/null instead of the default .sincedb path
- deleting the .sincedb files
- changing the id in the input {} block
- changing the index name in the output {} block
- running curl -XDELETE localhost:9200/*
- changing file and folder permissions
- running killall -9 java
- stopping and restarting all related Kibana services
- rebooting the machine
- reading many threads about this problem

Important: the first time I ran all of this it worked fine, but now it does not.

Note: I am a beginner on this topic.
Thank you.

When debugging a Logstash pipeline that uses a file input, I like to simplify things by using stdin and stdout:
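input {
  stdin {}
}
filter {
  # your existing filters here
}
output {
  stdout { codec => rubydebug }
}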

Then run it, piping your log file in:
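cat mylogfile | logstash -f mypipeline.conf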

The goal is to see whether we can get data through the pipeline at all. If data comes through here, then the problem is in either the file input configuration or the Elasticsearch output. Swap each piece back in, one at a time, until you find the one that does not work. Also make sure the file can actually be read, for example by running something like stat /path/to/your/file.
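For example, to test the file input on its own, you might keep it and send the events to stdout instead of Elasticsearch (a sketch; the path is the one shown in the question's log):

input {
  file {
    path => "/test/domainname*"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
output {
  stdout { codec => rubydebug }
}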

Usually, problems with the file input come down to permissions or sincedb, but it sounds like you have already been able to rule both of those out. In that case, I would expect the stdin test to succeed.
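Two quick checks for those usual suspects (the file name here is hypothetical; the sincedb directory is the one from the log above):

# can the logstash user actually read the file? (hypothetical file name)
sudo -u logstash head -n 1 /test/domainname.log

# remove the generated sincedb files and restart Logstash
sudo rm /var/lib/logstash/plugins/inputs/file/.sincedb_*
sudo systemctl restart logstash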

Since your curl to _cat/indices does not show the index, you need to dig into what Logstash is doing. Adding debug output, as suggested above, is a good first step.
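For instance, you can keep the Elasticsearch output and add a stdout output alongside it, so every event Logstash emits is also printed to the console (a sketch):

output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
  stdout { codec => rubydebug }
}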
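The index-graveyard section of the cluster state (visible via, for example, curl localhost:9200/_cluster/state?pretty; the output below is truncated) lists tombstones for the indices that were deleted earlier: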
"index-graveyard" : {
  "tombstones" : [
    {
      "index" : {
        "index_name" : "logstash-2018.11.01",
        "index_uuid" : "K8CPa4gYTSO-l4NfnrtTog"
      },
      "delete_date_in_millis" : 1541649663075
    },
    {
      "index" : {
        "index_name" : "logstash-2018.09.01",
        "index_uuid" : "-thB_LnfQlax6tLcS11Srg"
      },
      "delete_date_in_millis" : 1541649663075
    },
    {
      "index" : {
        "index_name" : "logstash-2018.10.31",
        "index_uuid" : "Fm8XcdcTTT2U-Xm1Vw0Gbw"
      },
      "delete_date_in_millis" : 1541649663075
    },
    {
      "index" : {
        "index_name" : "logstash-2018.08.31",
        "index_uuid" : "_FqmkcRNTKOx1oJbnpeyjw"
      },
      "delete_date_in_millis" : 1541649663075
    },
    {
      "index" : {
        "index_name" : "logstash-2018.11.02",
        "index_uuid" : "ZU04EZDaS_eeqD0auI9o5Q"
      },
      "delete_date_in_millis" : 1541649663075
    },
    {
      "index" : {
        "index_name" : ".kibana",
        "index_uuid" : "sZEoKhVlRRy7e8gAAnAEZw"
      },
      "delete_date_in_millis" : 1541653339359
    },
    {
      "index" : {
        "index_name" : "metricbeat-6.4.1-2018.11.06",
        "index_uuid" : "T5UZFMHiRJSMsBjTw40ztA"
      },