
elasticsearch logstash+elasticsearch | bug?


Can you help me with the following problem: what does the output below mean? It looks as if Logstash cannot connect to the local Elasticsearch node. But why?

logstash]# bin/logstash -f logstash_exabgp.cfg --debug --verbose
Using milestone 2 input plugin 'file'. This plugin should be stable, but if you see strange behavior, please let us know! For more information on plugin milestones, see http://logstash.net/docs/1.4.2-modified/plugin-milestones {:level=>:warn}
Registering file input {:path=>["/var/log/messages"], :level=>:info}
No sincedb_path set, generating one based on the file path {:sincedb_path=>"/root/.sincedb_452905a167cf4509fd08acb964fdb20c", :path=>["/var/log/messages"], :level=>:info}
Grok patterns path {:patterns_dir=>["/opt/logstash/patterns/*"], :level=>:info}
Grok loading patterns from file {:path=>"/opt/logstash/patterns/firewalls", :level=>:info}
Grok loading patterns from file {:path=>"/opt/logstash/patterns/grok-patterns", :level=>:info}
Grok loading patterns from file {:path=>"/opt/logstash/patterns/haproxy", :level=>:info}
Grok loading patterns from file {:path=>"/opt/logstash/patterns/java", :level=>:info}
Grok loading patterns from file {:path=>"/opt/logstash/patterns/junos", :level=>:info}
Grok loading patterns from file {:path=>"/opt/logstash/patterns/linux-syslog", :level=>:info}
Grok loading patterns from file {:path=>"/opt/logstash/patterns/mcollective", :level=>:info}
Grok loading patterns from file {:path=>"/opt/logstash/patterns/mcollective-patterns", :level=>:info}
Grok loading patterns from file {:path=>"/opt/logstash/patterns/mongodb", :level=>:info}
Grok loading patterns from file {:path=>"/opt/logstash/patterns/nagios", :level=>:info}
Grok loading patterns from file {:path=>"/opt/logstash/patterns/postgresql", :level=>:info}
Grok loading patterns from file {:path=>"/opt/logstash/patterns/redis", :level=>:info}
Grok loading patterns from file {:path=>"/opt/logstash/patterns/ruby", :level=>:info}
Match data {:match=>{"message"=>"%{SYSLOGTIMESTAMP:timestamp}%{GREEDYDATA}ExaBGP:%{SPACE}%{GREEDYDATA:msg}"}, :level=>:info}
Grok compile {:field=>"message", :patterns=>["%{SYSLOGTIMESTAMP:timestamp}%{GREEDYDATA}ExaBGP:%{SPACE}%{GREEDYDATA:msg}"], :level=>:info}
Pipeline started {:level=>:info}
New Elasticsearch output {:cluster=>nil, :host=>"127.0.0.1", :port=>"9200", :embedded=>false, :protocol=>"http", :level=>:info}
Automatic template management enabled {:manage_template=>"true", :level=>:info}
Using mapping template {:template=>"{  \"template\" : \"logstash-*\",  \"settings\" : {    \"index.refresh_interval\" : \"5s\"  },  \"mappings\" : {    \"_default_\" : {       \"_all\" : {\"enabled\" : true},       \"dynamic_templates\" : [ {         \"string_fields\" : {           \"match\" : \"*\",           \"match_mapping_type\" : \"string\",           \"mapping\" : {             \"type\" : \"string\", \"index\" : \"analyzed\", \"omit_norms\" : true,               \"fields\" : {                 \"raw\" : {\"type\": \"string\", \"index\" : \"not_analyzed\", \"ignore_above\" : 256}               }           }         }       } ],       \"properties\" : {         \"@version\": { \"type\": \"string\", \"index\": \"not_analyzed\" },         \"geoip\"  : {           \"type\" : \"object\",             \"dynamic\": true,             \"path\": \"full\",             \"properties\" : {               \"location\" : { \"type\" : \"geo_point\" }             }         }       }    }  }}", :level=>:info}
NoMethodError: undefined method `tv_sec' for nil:NilClass
        sprintf at /opt/logstash/lib/logstash/event.rb:230
           gsub at org/jruby/RubyString.java:3041
        sprintf at /opt/logstash/lib/logstash/event.rb:216
        receive at /opt/logstash/lib/logstash/outputs/elasticsearch.rb:308
         handle at /opt/logstash/lib/logstash/outputs/base.rb:86
     initialize at (eval):72
           call at org/jruby/RubyProc.java:271
         output at /opt/logstash/lib/logstash/pipeline.rb:266
   outputworker at /opt/logstash/lib/logstash/pipeline.rb:225
  start_outputs at /opt/logstash/lib/logstash/pipeline.rb:152
The configuration file looks like this:

logstash]# cat logstash_exabgp.cfg 
input   {
    file    {
        path    =>  ["/var/log/messages"]
    }
}
filter  {
    if [message] !~ /ExaBGP/ { 
            drop { } 
    }
    grok    {
        match   =>  [ "message", "%{SYSLOGTIMESTAMP:timestamp}%{GREEDYDATA}ExaBGP:%{SPACE}%{GREEDYDATA:msg}"]
        remove_field    =>  [ "message", "host", "path", "@timestamp", "@version" ]
    }
    date    {
        match   =>  ["logdate", "MMM dd HH:mm:ss"]
    }
}
output  {
#   file    {
#       path    =>  "NIKOS.txt"
#   }
#   stdout { codec => rubydebug }
    elasticsearch { 
        host    =>  "127.0.0.1"
        protocol    =>  http    
    }
}

I guess this is the first time you are running Logstash. The problem here is that Logstash cannot find any information about the file you are referencing.

Use the following and try giving the absolute path of the file you want to parse:

file {
       path =>  ["/var/log/messages"]
       start_position => "beginning"
}

Logstash uses all of the @-prefixed fields internally; removing them tends to cause errors.
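
Following that advice, a sketch of the same grok block from the question with the @-prefixed fields left alone (only the remove_field list differs from the original). In Logstash 1.4 the elasticsearch output builds its index name from @timestamp (logstash-%{+YYYY.MM.dd}), which is the sprintf call in the stack trace above, so removing @timestamp is what leads to the tv_sec error:

    grok    {
        match   =>  [ "message", "%{SYSLOGTIMESTAMP:timestamp}%{GREEDYDATA}ExaBGP:%{SPACE}%{GREEDYDATA:msg}" ]
        # keep @timestamp and @version; drop only the fields that are safe to lose
        remove_field    =>  [ "message", "host", "path" ]
    }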

I had this problem too, so I removed file from the input and used:

input {
    stdin {
    }
}
. . .
You have to run Logstash this way:

bin/logstash --config /home/logstash/conf/ex.conf

because file in the input does not work any more.
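
For completeness, a minimal sketch of such a stdin-based configuration (the stdout output and the file name ex.conf are illustrative, not part of the answer):

input {
    stdin {
    }
}
output {
    # print parsed events so you can check the pipeline interactively
    stdout { codec => rubydebug }
}

The log file can then be piped in, e.g. cat /var/log/messages | bin/logstash --config /home/logstash/conf/ex.conf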

Right, this is the first time I am running Logstash. No, that is not the cause: after removing the grok filter it works, so the fix has to do with that part of the filter. Any ideas?

What do you intend to achieve with the condition if [message] !~ /ExaBGP/ ?

There is a process that writes to syslog through the ExaBGP tool, and I am only interested in the messages generated by that facility.

That part is fine as well; it is the section immediately after it that is problematic.
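
For reference, since the discussion points at the filter section as the culprit, here is a possible cleaned-up version of it, assuming the goal is to parse the captured syslog timestamp into @timestamp rather than deleting it. Note that the question's date filter looks for a logdate field that the grok pattern never creates; timestamp is the field it actually captures. This is a sketch under those assumptions, not a tested configuration:

filter  {
    if [message] !~ /ExaBGP/ {
            drop { }
    }
    grok    {
        match   =>  [ "message", "%{SYSLOGTIMESTAMP:timestamp}%{GREEDYDATA}ExaBGP:%{SPACE}%{GREEDYDATA:msg}" ]
    }
    date    {
        # syslog timestamps use either one or two spaces before the day of month
        match   =>  [ "timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
    }
    mutate  {
        # @timestamp and @version stay; drop only the redundant fields
        remove_field    =>  [ "message", "host", "path", "timestamp" ]
    }
}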