
Logstash is not creating an index in Elasticsearch (elasticsearch, logstash, filebeat)


I am trying to parse a log file with Logstash. Filebeat reads sample logs from a directory and ships them through Logstash to be indexed into Elasticsearch. (Filebeat reads the input files from a directory; filebeat.yml specifies Logstash as the output; the Logstash config file parses the log lines and writes the result to an index in ES.)

Filebeat.yml

#=========================== Filebeat prospectors =============================

filebeat.prospectors:

  #input_type: log
  document_type: my_log
paths:
  - C:\logsa\elast.log

    #----------------------------- Logstash output --------------------------------
    output.logstash:
      # The Logstash hosts
      hosts: ["localhost:5044"]



elast.log (I am trying to parse this single log line in the file):

    [2016-11-03 07:30:05,987] [INFO] [o.e.p.PluginsService     ] [hTYKFFt] initializing...
Logstash config file:

input {
  beats {
    port => "5044"
  }
}
filter {
  if [type] == "my_log" {
    grok {
      match => { "message" => "\[%{TIMESTAMP_ISO8601:timestamp}\] \[%{DATA:LOGLEVEL}\] \[%{DATA:MESSAGE}\] \[%{GREEDYDATA:message}\] %{GREEDYDATA:message1}"}
    }
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
}
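As a quick sanity check outside Logstash, the grok pattern above can be approximated with a plain regex against the sample line. This is only a rough Python sketch: grok's DATA/GREEDYDATA semantics differ slightly, and the capture names here are illustrative, not the actual fields Logstash would emit:

```python
import re

# Rough regex equivalent of the grok pattern:
# \[%{TIMESTAMP_ISO8601:timestamp}\] \[%{DATA:LOGLEVEL}\] \[%{DATA:MESSAGE}\] \[%{GREEDYDATA:message}\] %{GREEDYDATA:message1}
pattern = re.compile(
    r"\[(?P<timestamp>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2},\d{3})\] "
    r"\[(?P<loglevel>[^\]]*)\] "
    r"\[(?P<logger>[^\]]*)\] "
    r"\[(?P<node>[^\]]*)\] "
    r"(?P<msg>.*)"
)

line = "[2016-11-03 07:30:05,987] [INFO] [o.e.p.PluginsService     ] [hTYKFFt] initializing..."
m = pattern.match(line)
print(m.groupdict())
```

If the regex matches the sample line, the grok pattern's bracket structure is at least plausible; a non-match would point to a pattern problem rather than a shipping problem.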
I am running filebeat.exe, the Logstash config file, and Elasticsearch.

I get no errors when running the Logstash config file.

Console output when running the Logstash conf:

C:\logstash-5.0.0\logstash-5.0.0\bin>logstash -f log-h.conf
JAVA_OPTS was set to [ -Xmx1g -XX:+UseParNewGC -XX:+UseConcMarkSweepGC -XX:+CMSParallelRemarkEnabled -XX:SurvivorRatio=8 -XX:MaxTenuringThreshold=1 -XX:CMSInitiatingOccupancyFraction=75 -XX:+UseCMSInitiatingOccupancyOnly -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath="$LS_HOME/heapdump.hprof"]. Logstash will trust these options, and not set any defaults that it might usually set
Sending Logstash logs to C:/logstash-5.0.0/logstash-5.0.0/logs which is now configured via log4j2.properties.
[2016-11-08T17:38:02,452][INFO ][logstash.inputs.beats    ] Beats inputs: Starting input listener {:address=>"0.0.0.0:5044"}
[2016-11-08T17:38:02,728][INFO ][org.logstash.beats.Server] Starting server on port: 5044
[2016-11-08T17:38:03,082][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>["http://localhost:9200"]}}
[2016-11-08T17:38:03,089][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2016-11-08T17:38:03,324][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>50001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"_all"=>{"enabled"=>true, "norms"=>false}, "dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword"}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date", "include_in_all"=>false}, "@version"=>{"type"=>"keyword", "include_in_all"=>false}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2016-11-08T17:38:03,359][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["localhost:9200"]}
[2016-11-08T17:38:03,596][INFO ][logstash.pipeline        ] Starting pipeline {"id"=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>500}
[2016-11-08T17:38:03,612][INFO ][logstash.pipeline        ] Pipeline main started
[2016-11-08T17:38:03,783][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
It does not create an index in ES, and no errors appear, as seen in the console output above.


Can anyone help? Thanks in advance.

Your Filebeat configuration has some indentation problems. For Filebeat 5.x it should look like this:

filebeat.prospectors:
- paths:
    - C:/logsa/elast.log
  document_type: my_log

output.logstash:
  hosts: ["localhost:5044"]
The Beats documentation provides an example Logstash config showing how to configure the Elasticsearch output for Beats data. It writes the data to filebeat-YYYY.MM.DD indices:

input {
  beats {
    port => "5044"
  }
}

filter {
  if [type] == "my_log" {
    grok {
      match => { "message" => "\[%{TIMESTAMP_ISO8601:timestamp}\] \[%{DATA:LOGLEVEL}\] \[%{DATA:MESSAGE}\] \[%{GREEDYDATA:message}\] %{GREEDYDATA:message1}"}
    }
  }
}

output {
  elasticsearch {
    hosts => "localhost:9200"
    manage_template => false
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  }
}
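The index setting above uses Logstash's sprintf/date formatting: %{[@metadata][beat]} resolves to the Beat's name (here "filebeat") and %{+YYYY.MM.dd} to the date of the event's @timestamp, so each day gets its own index. A rough Python illustration of the naming scheme (the helper is hypothetical, not part of Logstash, and strftime's %Y is only an approximation of the Joda-style YYYY):

```python
from datetime import datetime, timezone

def daily_index(beat_name, ts):
    """Approximate Logstash's "%{[@metadata][beat]}-%{+YYYY.MM.dd}" naming.

    Hypothetical helper for illustration; Logstash itself uses
    Joda-style date patterns, of which strftime %Y.%m.%d is a close
    stand-in here.
    """
    return "{0}-{1}".format(beat_name, ts.strftime("%Y.%m.%d"))

# An event with a @timestamp on 2016-11-08 would land in:
print(daily_index("filebeat", datetime(2016, 11, 8, tzinfo=timezone.utc)))
# filebeat-2016.11.08
```

This is why, after the pipeline works, you should look for daily filebeat-* indices in ES rather than a single fixed index name.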
When using Logstash, you must also load the Filebeat index template into Elasticsearch.

For Windows:

PS C:\Program Files\Filebeat> Invoke-WebRequest -Method Put -InFile filebeat.template.json -Uri http://localhost:9200/_template/filebeat?pretty

For Unix:


curl -XPUT 'http://localhost:9200/_template/filebeat' -d@/etc/filebeat/filebeat.template.json
