
Logstash, Elasticsearch, and Kibana setup on Ubuntu


I am trying to set up the ELK stack on my Ubuntu sandbox and have run into a problem: Logstash is not sending any data to Elasticsearch. I followed the Elasticsearch documentation.

The Kibana-to-Elasticsearch connection appears to be working; as far as I can tell, Kibana is reporting that it cannot find any data. I have spent a few hours trying to figure this out, with no luck.

Any help with this issue would be appreciated. Thank you.

Here are my setup details.

Logstash setup:

sirishg@sirishg-vm:/u02/app/logstash-2.1.1/bin$ ./logstash -f /u02/app/logstash-2.1.1/first-pipeline.conf 
Settings: Default filter workers: 1
Logstash startup completed
first-pipeline.conf:

    # The "#" character at the beginning of a line indicates a comment.
    # Use comments to describe your configuration.
    input {
        file {
            path => "/u02/app/logstash-tutorial-dataset.log"
            start_position => "beginning"
        }
    }
    filter {
        grok {
            match => { "message" => "%{COMBINEDAPACHELOG}" }
        }
        geoip {
            source => "clientip"
        }
    }
    output {
        elasticsearch {
            hosts => ["localhost:9200"]
        }
        stdout {
            codec => rubydebug
        }
    }
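(Note: one well-known cause of this symptom is the file input's sincedb bookmark. Logstash records how far it has read into each file, so re-running against an unchanged log file produces no new events, even with `start_position => beginning`, which only applies to files seen for the first time. The variant below is a debugging sketch, not part of the original config; pointing `sincedb_path` at /dev/null makes Logstash re-read the whole file on every restart.)

```
input {
    file {
        path => "/u02/app/logstash-tutorial-dataset.log"
        start_position => "beginning"
        # Debugging only: discard the read-position bookmark so the
        # entire file is re-read each time Logstash starts.
        sincedb_path => "/dev/null"
    }
}
```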
Kibana setup:

sirishg@sirishg-vm:/u02/app/kibana-4.3.1-linux-x86/bin$ ./kibana 
  log   [18:18:36.697] [info][status][plugin:kibana] Status changed from uninitialized to green - Ready
  log   [18:18:36.786] [info][status][plugin:elasticsearch] Status changed from uninitialized to yellow - Waiting for Elasticsearch
  log   [18:18:36.852] [info][status][plugin:kbn_vislib_vis_types] Status changed from uninitialized to green - Ready
  log   [18:18:36.875] [info][status][plugin:markdown_vis] Status changed from uninitialized to green - Ready
  log   [18:18:36.883] [info][status][plugin:metric_vis] Status changed from uninitialized to green - Ready
  log   [18:18:36.907] [info][status][plugin:spyModes] Status changed from uninitialized to green - Ready
  log   [18:18:36.936] [info][status][plugin:statusPage] Status changed from uninitialized to green - Ready
  log   [18:18:36.950] [info][status][plugin:table_vis] Status changed from uninitialized to green - Ready
  log   [18:18:37.078] [info][listening] Server running at http://0.0.0.0:5601
  log   [18:18:37.446] [info][status][plugin:elasticsearch] Status changed from yellow to green - Kibana index ready
Error: Please specify a default index pattern
KbnError@http://localhost:5601/bundles/commons.bundle.js:58172:21
NoDefaultIndexPattern@http://localhost:5601/bundles/commons.bundle.js:58325:6
loadDefaultIndexPattern/<@http://localhost:5601/bundles/kibana.bundle.js:97911:1
processQueue@http://localhost:5601/bundles/commons.bundle.js:42358:29
scheduleProcessQueue/<@http://localhost:5601/bundles/commons.bundle.js:42374:28
$RootScopeProvider/this.$get</Scope.prototype.$eval@http://localhost:5601/bundles/commons.bundle.js:43602:17
$RootScopeProvider/this.$get</Scope.prototype.$digest@http://localhost:5601/bundles/commons.bundle.js:43413:16
$RootScopeProvider/this.$get</Scope.prototype.$apply@http://localhost:5601/bundles/commons.bundle.js:43710:14
$LocationProvider/this.$get</<@http://localhost:5601/bundles/commons.bundle.js:39839:14
jQuery.event.dispatch@http://localhost:5601/bundles/commons.bundle.js:22720:16
jQuery.event.add/elemData.handle@http://localhost:5601/bundles/commons.bundle.js:22407:7
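(The "Please specify a default index pattern" error generally means Kibana found no index matching `logstash-*` in Elasticsearch, which is consistent with Logstash never having indexed anything. A quick way to confirm, assuming Elasticsearch is listening on localhost:9200 as in the logs below:)

```shell
# List every index in the cluster; a working pipeline would show an
# index named like logstash-2016.01.17 in this output.
curl -s 'http://localhost:9200/_cat/indices?v'

# Count documents in any logstash-* indices (returns an error when none exist).
curl -s 'http://localhost:9200/logstash-*/_count?pretty'
```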
Logstash debug logs:

{:timestamp=>"2016-01-17T11:07:06.287000-0500", :message=>"Reading config file", :config_file=>"/u02/app/logstash-2.1.1/first-pipeline.conf", :level=>:debug, :file=>"logstash/agent.rb", :line=>"325", :method=>"local_config"}
{:timestamp=>"2016-01-17T11:07:06.420000-0500", :message=>"Compiled pipeline code:\n        @inputs = []\n        @filters = []\n        @outputs = []\n        @periodic_flushers = []\n        @shutdown_flushers = []\n\n          @input_file_1 = plugin(\"input\", \"file\", LogStash::Util.hash_merge_many({ \"path\" => (\"/u02/app/logstash-tutorial-dataset.log\") }, { \"start_position\" => (\"beginning\") }))\n\n          @inputs << @input_file_1\n\n          @filter_grok_2 = plugin(\"filter\", \"grok\", LogStash::Util.hash_merge_many({ \"match\" => {(\"message\") => (\"%{COMBINEDAPACHELOG}\")} }))\n\n          @filters << @filter_grok_2\n\n            @filter_grok_2_flush = lambda do |options, &block|\n              @logger.debug? && @logger.debug(\"Flushing\", :plugin => @filter_grok_2)\n\n              events = @filter_grok_2.flush(options)\n\n              return if events.nil? || events.empty?\n\n              @logger.debug? && @logger.debug(\"Flushing\", :plugin => @filter_grok_2, :events => events)\n\n                          events = @filter_geoip_3.multi_filter(events)\n  \n\n\n              events.each{|e| block.call(e)}\n            end\n\n            if @filter_grok_2.respond_to?(:flush)\n              @periodic_flushers << @filter_grok_2_flush if @filter_grok_2.periodic_flush\n              @shutdown_flushers << @filter_grok_2_flush\n            end\n\n          @filter_geoip_3 = plugin(\"filter\", \"geoip\", LogStash::Util.hash_merge_many({ \"source\" => (\"clientip\") }))\n\n          @filters << @filter_geoip_3\n\n            @filter_geoip_3_flush = lambda do |options, &block|\n              @logger.debug? && @logger.debug(\"Flushing\", :plugin => @filter_geoip_3)\n\n              events = @filter_geoip_3.flush(options)\n\n              return if events.nil? || events.empty?\n\n              @logger.debug? 
&& @logger.debug(\"Flushing\", :plugin => @filter_geoip_3, :events => events)\n\n                \n\n              events.each{|e| block.call(e)}\n            end\n\n            if @filter_geoip_3.respond_to?(:flush)\n              @periodic_flushers << @filter_geoip_3_flush if @filter_geoip_3.periodic_flush\n              @shutdown_flushers << @filter_geoip_3_flush\n            end\n\n          @output_elasticsearch_4 = plugin(\"output\", \"elasticsearch\", LogStash::Util.hash_merge_many({ \"hosts\" => [(\"localhost:9200\")] }))\n\n          @outputs << @output_elasticsearch_4\n\n          @output_stdout_5 = plugin(\"output\", \"stdout\", LogStash::Util.hash_merge_many({ \"codec\" => (\"rubydebug\") }))\n\n          @outputs << @output_stdout_5\n\n  def filter_func(event)\n    events = [event]\n    @logger.debug? && @logger.debug(\"filter received\", :event => event.to_hash)\n              events = @filter_grok_2.multi_filter(events)\n              events = @filter_geoip_3.multi_filter(events)\n    \n    events\n  end\n  def output_func(event)\n    @logger.debug? && @logger.debug(\"output received\", :event => event.to_hash)\n    @output_elasticsearch_4.handle(event)\n    @output_stdout_5.handle(event)\n    \n  end", :level=>:debug, :file=>"logstash/pipeline.rb", :line=>"38", :method=>"initialize"}
{:timestamp=>"2016-01-17T11:07:06.426000-0500", :message=>"Plugin not defined in namespace, checking for plugin file", :type=>"input", :name=>"file", :path=>"logstash/inputs/file", :level=>:debug, :file=>"logstash/plugin.rb", :line=>"76", :method=>"lookup"}
{:timestamp=>"2016-01-17T11:07:06.451000-0500", :message=>"Plugin not defined in namespace, checking for plugin file", :type=>"codec", :name=>"plain", :path=>"logstash/codecs/plain", :level=>:debug, :file=>"logstash/plugin.rb", :line=>"76", :method=>"lookup"}
{:timestamp=>"2016-01-17T11:07:06.465000-0500", :message=>"config LogStash::Codecs::Plain/@charset = \"UTF-8\"", :level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"122", :method=>"config_init"}
{:timestamp=>"2016-01-17T11:07:06.468000-0500", :message=>"config LogStash::Inputs::File/@path = [\"/u02/app/logstash-tutorial-dataset.log\"]", :level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"122", :method=>"config_init"}
{:timestamp=>"2016-01-17T11:07:06.469000-0500", :message=>"config LogStash::Inputs::File/@start_position = \"beginning\"", :level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"122", :method=>"config_init"}
{:timestamp=>"2016-01-17T11:07:06.472000-0500", :message=>"config LogStash::Inputs::File/@codec = <LogStash::Codecs::Plain charset=>\"UTF-8\">", :level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"122", :method=>"config_init"}
{:timestamp=>"2016-01-17T11:07:06.480000-0500", :message=>"config LogStash::Inputs::File/@add_field = {}", :level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"122", :method=>"config_init"}
{:timestamp=>"2016-01-17T11:07:06.481000-0500", :message=>"config LogStash::Inputs::File/@stat_interval = 1", :level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"122", :method=>"config_init"}
{:timestamp=>"2016-01-17T11:07:06.492000-0500", :message=>"config LogStash::Inputs::File/@discover_interval = 15", :level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"122", :method=>"config_init"}
{:timestamp=>"2016-01-17T11:07:06.493000-0500", :message=>"config LogStash::Inputs::File/@sincedb_write_interval = 15", :level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"122", :method=>"config_init"}
{:timestamp=>"2016-01-17T11:07:06.496000-0500", :message=>"config LogStash::Inputs::File/@delimiter = \"\\n\"", :level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"122", :method=>"config_init"}
{:timestamp=>"2016-01-17T11:07:06.498000-0500", :message=>"Plugin not defined in namespace, checking for plugin file", :type=>"filter", :name=>"grok", :path=>"logstash/filters/grok", :level=>:debug, :file=>"logstash/plugin.rb", :line=>"76", :method=>"lookup"}
{:timestamp=>"2016-01-17T11:07:06.515000-0500", :message=>"config LogStash::Filters::Grok/@match = {\"message\"=>\"%{COMBINEDAPACHELOG}\"}", :level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"122", :method=>"config_init"}
{:timestamp=>"2016-01-17T11:07:06.524000-0500", :message=>"config LogStash::Filters::Grok/@add_tag = []", :level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"122", :method=>"config_init"}
{:timestamp=>"2016-01-17T11:07:06.532000-0500", :message=>"config LogStash::Filters::Grok/@remove_tag = []", :level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"122", :method=>"config_init"}
{:timestamp=>"2016-01-17T11:07:06.535000-0500", :message=>"config LogStash::Filters::Grok/@add_field = {}", :level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"122", :method=>"config_init"}
{:timestamp=>"2016-01-17T11:07:06.536000-0500", :message=>"config LogStash::Filters::Grok/@remove_field = []", :level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"122", :method=>"config_init"}
Elasticsearch setup:

sirishg@sirishg-vm:/u02/app/elasticsearch-2.1.1/bin$ ./elasticsearch
[2016-01-17 11:00:23,467][INFO ][node                     ] [node-1] version[2.1.1], pid[3418], build[40e2c53/2015-12-15T13:05:55Z]
[2016-01-17 11:00:23,470][INFO ][node                     ] [node-1] initializing ...
[2016-01-17 11:00:23,698][INFO ][plugins                  ] [node-1] loaded [], sites []
[2016-01-17 11:00:23,853][INFO ][env                      ] [node-1] using [1] data paths, mounts [[/ (/dev/sda1)]], net usable_space [12.6gb], net total_space [45.1gb], spins? [possibly], types [ext4]
[2016-01-17 11:00:27,412][INFO ][node                     ] [node-1] initialized
[2016-01-17 11:00:27,412][INFO ][node                     ] [node-1] starting ...
[2016-01-17 11:00:27,605][INFO ][transport                ] [node-1] publish_address {localhost/127.0.0.1:9300}, bound_addresses {127.0.0.1:9300}
[2016-01-17 11:00:27,616][INFO ][discovery                ] [node-1] my-application/rd4S1ZOdQXOj3_g-N22NnQ
[2016-01-17 11:00:31,121][INFO ][cluster.service          ] [node-1] new_master {node-1}{rd4S1ZOdQXOj3_g-N22NnQ}{127.0.0.1}{localhost/127.0.0.1:9300}, reason: zen-disco-join(elected_as_master, [0] joins received)
[2016-01-17 11:00:31,259][INFO ][http                     ] [node-1] publish_address {localhost/127.0.0.1:9200}, bound_addresses {127.0.0.1:9200}
[2016-01-17 11:00:31,260][INFO ][node                     ] [node-1] started
[2016-01-17 11:00:31,830][INFO ][gateway                  ] [node-1] recovered [2] indices into cluster_state
[2016-01-17 11:00:23,470][INFO ][node                     ] [node-1] initializing ...
[2016-01-17 11:00:23,698][INFO ][plugins                  ] [node-1] loaded [], sites []
[2016-01-17 11:00:23,853][INFO ][env                      ] [node-1] using [1] data paths, mounts [[/ (/dev/sda1)]], net usable_space [12.6gb], net total_space [45.1gb], spins? [possibly], types [ext4]
[2016-01-17 11:00:27,412][INFO ][node                     ] [node-1] initialized
[2016-01-17 11:00:27,412][INFO ][node                     ] [node-1] starting ...
[2016-01-17 11:00:27,605][INFO ][transport                ] [node-1] publish_address {localhost/127.0.0.1:9300}, bound_addresses {127.0.0.1:9300}
[2016-01-17 11:00:27,616][INFO ][discovery                ] [node-1] my-application/rd4S1ZOdQXOj3_g-N22NnQ
[2016-01-17 11:00:31,121][INFO ][cluster.service          ] [node-1] new_master {node-1}{rd4S1ZOdQXOj3_g-N22NnQ}{127.0.0.1}{localhost/127.0.0.1:9300}, reason: zen-disco-join(elected_as_master, [0] joins received)
[2016-01-17 11:00:31,259][INFO ][http                     ] [node-1] publish_address {localhost/127.0.0.1:9200}, bound_addresses {127.0.0.1:9200}
[2016-01-17 11:00:31,260][INFO ][node                     ] [node-1] started
[2016-01-17 11:00:31,830][INFO ][gateway                  ] [node-1] recovered [2] indices into cluster_state
Health check report:

{"cluster_name":"my-application","status":"yellow","timed_out":false,"number_of_nodes":1,"number_of_data_nodes":1,"active_primary_shards":1,"active_shards":1,"relocating_shards":0,"initializing_shards":0,"unassigned_shards":1,"delayed_unassigned_shards":0,"number_of_pending_tasks":0,"number_of_in_flight_fetch":0,"task_max_waiting_in_queue_millis":0,"active_shards_percent_as_number":50.0}
Startup log:

sirishg@sirishg-vm:/u02/app/elasticsearch-2.1.1/bin$ ./elasticsearch
[2016-01-16 18:17:36,591][INFO ][node                     ] [node-1] version[2.1.1], pid[3596], build[40e2c53/2015-12-15T13:05:55Z]
[2016-01-16 18:17:36,594][INFO ][node                     ] [node-1] initializing ...
[2016-01-16 18:17:36,798][INFO ][plugins                  ] [node-1] loaded [], sites []
[2016-01-16 18:17:36,907][INFO ][env                      ] [node-1] using [1] data paths, mounts [[/ (/dev/sda1)]], net usable_space [12.6gb], net total_space [45.1gb], spins? [possibly], types [ext4]
[2016-01-16 18:17:43,349][INFO ][node                     ] [node-1] initialized
[2016-01-16 18:17:43,350][INFO ][node                     ] [node-1] starting ...
[2016-01-16 18:17:43,693][INFO ][transport                ] [node-1] publish_address {localhost/127.0.0.1:9300}, bound_addresses {127.0.0.1:9300}
[2016-01-16 18:17:43,713][INFO ][discovery                ] [node-1] my-application/8bfTdwZcSzaNC9_P2VYYvw
[2016-01-16 18:17:46,878][INFO ][cluster.service          ] [node-1] new_master {node-1}{8bfTdwZcSzaNC9_P2VYYvw}{127.0.0.1}{localhost/127.0.0.1:9300}, reason: zen-disco-join(elected_as_master, [0] joins received)
[2016-01-16 18:17:46,980][INFO ][http                     ] [node-1] publish_address {localhost/127.0.0.1:9200}, bound_addresses {127.0.0.1:9200}
[2016-01-16 18:17:46,991][INFO ][node                     ] [node-1] started
[2016-01-16 18:17:47,318][INFO ][gateway                  ] [node-1] recovered [1] indices into cluster_state
[2016-01-16 18:20:03,866][INFO ][rest.suppressed          ] /logstash-*/_mapping/field/* Params: {ignore_unavailable=false, allow_no_indices=false, index=logstash-*, include_defaults=true, fields=*, _=1452986403826}
[logstash-*] IndexNotFoundException[no such index]
    at org.elasticsearch.cluster.metadata.IndexNameExpressionResolver$WildcardExpressionResolver.resolve(IndexNameExpressionResolver.java:636)
    at org.elasticsearch.cluster.metadata.IndexNameExpressionResolver.concreteIndices(IndexNameExpressionResolver.java:133)
    at org.elasticsearch.cluster.metadata.IndexNameExpressionResolver.concreteIndices(IndexNameExpressionResolver.java:77)
    at org.elasticsearch.action.admin.indices.mapping.get.TransportGetFieldMappingsAction.doExecute(TransportGetFieldMappingsAction.java:57)
    at org.elasticsearch.action.admin.indices.mapping.get.TransportGetFieldMappingsAction.doExecute(TransportGetFieldMappingsAction.java:40)
    at org.elasticsearch.action.support.TransportAction.execute(TransportAction.java:70)
    at org.elasticsearch.client.node.NodeClient.doExecute(NodeClient.java:58)
Kibana status:

sirishg@sirishg-vm:/u02/app/kibana-4.3.1-linux-x86/bin$ ./kibana 
  log   [18:18:36.697] [info][status][plugin:kibana] Status changed from uninitialized to green - Ready
  log   [18:18:36.786] [info][status][plugin:elasticsearch] Status changed from uninitialized to yellow - Waiting for Elasticsearch
  log   [18:18:36.852] [info][status][plugin:kbn_vislib_vis_types] Status changed from uninitialized to green - Ready
  log   [18:18:36.875] [info][status][plugin:markdown_vis] Status changed from uninitialized to green - Ready
  log   [18:18:36.883] [info][status][plugin:metric_vis] Status changed from uninitialized to green - Ready
  log   [18:18:36.907] [info][status][plugin:spyModes] Status changed from uninitialized to green - Ready
  log   [18:18:36.936] [info][status][plugin:statusPage] Status changed from uninitialized to green - Ready
  log   [18:18:36.950] [info][status][plugin:table_vis] Status changed from uninitialized to green - Ready
  log   [18:18:37.078] [info][listening] Server running at http://0.0.0.0:5601
  log   [18:18:37.446] [info][status][plugin:elasticsearch] Status changed from yellow to green - Kibana index ready
Kibana UI error:

Error: Please specify a default index pattern
KbnError@http://localhost:5601/bundles/commons.bundle.js:58172:21
NoDefaultIndexPattern@http://localhost:5601/bundles/commons.bundle.js:58325:6
loadDefaultIndexPattern/<@http://localhost:5601/bundles/kibana.bundle.js:97911:1
processQueue@http://localhost:5601/bundles/commons.bundle.js:42358:29
scheduleProcessQueue/<@http://localhost:5601/bundles/commons.bundle.js:42374:28
$RootScopeProvider/this.$get</Scope.prototype.$eval@http://localhost:5601/bundles/commons.bundle.js:43602:17
$RootScopeProvider/this.$get</Scope.prototype.$digest@http://localhost:5601/bundles/commons.bundle.js:43413:16
$RootScopeProvider/this.$get</Scope.prototype.$apply@http://localhost:5601/bundles/commons.bundle.js:43710:14
$LocationProvider/this.$get</<@http://localhost:5601/bundles/commons.bundle.js:39839:14
jQuery.event.dispatch@http://localhost:5601/bundles/commons.bundle.js:22720:16
jQuery.event.add/elemData.handle@http://localhost:5601/bundles/commons.bundle.js:22407:7

Did you get it to work? A few comments:

1) The fact that Kibana is listening on "0.0.0.0" can sometimes be a sign that something is off; double-check the configuration and the connection to Elasticsearch.

2) Which index are you writing the data to? logstash-*?

3) If all else fails, upgrade to the current 2.3.* (Elasticsearch) and 4.4.* (Kibana).

4) For Logstash to actually pick up and read the file (and therefore send data to Elasticsearch), you should write to the file again (changing its modification timestamp). This part is not always straightforward, because the Logstash file input effectively keeps a pointer to the last line that was appended to the file.

You may already have it working by now, so perhaps I'm late here, but on the other hand this might help someone else.
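Point 4 can be sketched concretely. This is only an illustration: the sincedb filename pattern, the sample Apache log line, and the /tmp scratch path below are assumptions, not taken from the original post.

```shell
# The file input remembers its read offset in a sincedb file under $HOME.
# Deleting it before restarting Logstash makes the input re-read the file
# from start_position again.
rm -f ~/.sincedb_*

# Alternatively, append a fresh line: the input only emits data written
# after the recorded offset, so new writes are always picked up.
# (A /tmp scratch copy is used here purely for illustration.)
log=/tmp/logstash-tutorial-dataset.log
echo '83.149.9.216 - - [04/Jan/2015:05:13:42 +0000] "GET / HTTP/1.1" 200 24 "-" "-"' >> "$log"
```

Either approach gives the pipeline something new to read; deleting the sincedb replays the whole file, appending replays only the new lines.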

Sorry, I forgot to provide the version information: Kibana 4.3.1, Logstash 2.1.1, and Elasticsearch 2.1.1.

Is there any output in the Logstash console? If you have processed /u02/app/logstash-tutorial-dataset.log before, you may need to delete the Logstash .sincedb file (ls ~/.sincedb*). Could you run Logstash with the --debug flag and show some of the relevant logs?

@龙华 Yes, I found a .sincedb file and will delete it as you suggested: /home/sirishg/.sincedb_9dbbe1dd488b6b538eecc653ee954022. Let me try again and I'll post an update. @Val I have attached the Logstash debug logs to the original post. Please advise.
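For repeated testing, the sincedb dance can be avoided entirely. A variant of the file input from first-pipeline.conf above — a sketch, assuming the file input's sincedb_path option (it points the offset bookmark at /dev/null so nothing is persisted):

```conf
input {
    file {
        path => "/u02/app/logstash-tutorial-dataset.log"
        start_position => "beginning"
        # Do not persist the read offset: every restart re-reads the whole
        # file. Useful while testing a pipeline, not for production tailing.
        sincedb_path => "/dev/null"
    }
}
```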