
elasticsearch - How to create an index from Logstash and map it into ES

Tags: elasticsearch, logstash, logstash-configuration

I have been following this tutorial to import data from a DB into Logstash, create an index, and map it into Elasticsearch.

This is the output I get when running my config file:

[2017-10-12T11:50:45,807][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"fb_apache", :directory=>"C:/Users/Bruno/Downloads/logstash-5.6.2/logstash-5.6.2/modules/fb_apache/configuration"}
[2017-10-12T11:50:45,812][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"netflow", :directory=>"C:/Users/Bruno/Downloads/logstash-5.6.2/logstash-5.6.2/modules/netflow/configuration"}
[2017-10-12T11:50:46,518][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2017-10-12T11:50:46,521][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://localhost:9200/, :path=>"/"}
[2017-10-12T11:50:46,652][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2017-10-12T11:50:46,654][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2017-10-12T11:50:46,716][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>50001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"_all"=>{"enabled"=>true, "norms"=>false}, "dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date", "include_in_all"=>false}, "@version"=>{"type"=>"keyword", "include_in_all"=>false}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2017-10-12T11:50:46,734][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost:9200"]}
[2017-10-12T11:50:46,749][INFO ][logstash.pipeline        ] Starting pipeline {"id"=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>500}
[2017-10-12T11:50:47,053][INFO ][logstash.pipeline        ] Pipeline main started
[2017-10-12T11:50:47,196][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2017-10-12T11:50:47,817][INFO ][logstash.inputs.jdbc     ] (0.130000s) SELECT * from EP_RDA_STRING
[2017-10-12T11:50:53,095][WARN ][logstash.agent           ] stopping pipeline {:id=>"main"}
Everything seems fine, at least as far as I can tell. Except that when I query the ES server for its indices and mappings, both come back empty:

http://localhost:9200/_all/_mapping

{}

http://localhost:9200/_cat/indices?v

health status index uuid pri rep docs.count docs.deleted store.size pri.store.size
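
For reference, the same checks can be run from the command line (just a sketch, assuming ES is listening on localhost:9200 as in the logs above):

curl -s "http://localhost:9200/_cat/indices?v"
curl -s "http://localhost:9200/_all/_mapping?pretty"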
This is my config file:

input {
    jdbc {
        # SQL Server JDBC connection string to our database, RDA
        jdbc_connection_string => "jdbc:sqlserver://localhost:1433;databaseName=RDA; integratedSecurity=true;"
        # The user we wish to execute our statement as
        jdbc_user => ""
        # The path to our downloaded jdbc driver
        jdbc_driver_library => "C:\mypath\sqljdbc_6.2\enu\mssql-jdbc-6.2.1.jre8.jar"
        # The name of the driver class for SQL Server
        jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
        # our query
        statement => "SELECT * from EP_RDA_STRING"
    }
}
output {
    elasticsearch {

        index => "RDA"
        document_type => "RDA_string_view"
        document_id => "%{ndb_no}"
        hosts => "localhost:9200"
    }
}

Which version of Logstash are you using? What command are you using to start Logstash? Make sure your input and output blocks look similar to the ones given below:

input {
    beats {
        port => "29600"
        type => "weblogic-server"
    }
}
filter {
}

output {
    elasticsearch { 
      hosts => ["127.0.0.1:9200"] 
      index => "logstash-%{+YYYY.MM.dd}"
    }
    stdout { codec => rubydebug }
}
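
As for the start command, a minimal way to run Logstash against a single pipeline file looks like the following (a sketch only; the file name db.conf is just an example):

bin/logstash -f db.conf          (Linux/macOS)
bin\logstash.bat -f db.conf      (Windows, matching the paths shown in your logs)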


One thing is that ES index names must be all lowercase (i.e. rda instead of RDA), so I suspect there is an error in your ES logs telling you exactly that.
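
If the uppercase index name is indeed the problem, the minimal fix is to lowercase it in the output block; this sketch simply reuses the settings from the question's config:

output {
    elasticsearch {
        index => "rda"
        document_type => "RDA_string_view"
        document_id => "%{ndb_no}"
        hosts => "localhost:9200"
    }
}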