
Logstash not creating an Elasticsearch index on Windows 10


I have started Logstash, Kibana, and Elasticsearch from the zip distributions. I am ingesting a CSV file into Elasticsearch through Logstash:

           input {
            file {
                path => "D:\tls202_part01\tls202_part01.csv"
                start_position => "beginning"
            }
        }
        filter {
            csv {
                separator => ","
                columns => ["appln_id", "appln_title_lg", "appln_title"]
            }
            mutate {
                convert => ["appln_id", "integer"]
                convert => ["appln_title_lg", "string"]
                convert => ["appln_title", "string"]
            }
        }
        output {
            elasticsearch {
                hosts => "localhost"
                index => "title"
            }
            stdout {
                codec => rubydebug
            }
        }
This is my configuration file. When I search for the index `title`, it is not there, and the Logstash logs are as follows:

   Sending Logstash logs to D:/logstash-6.5.4/logs which is now configured via log4j2.properties
[2018-12-26T10:22:35,672][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2018-12-26T10:22:35,699][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"6.5.4"}
[2018-12-26T10:22:41,588][INFO ][logstash.pipeline        ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>8, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50}
[2018-12-26T10:22:42,051][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2018-12-26T10:22:42,297][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2018-12-26T10:22:42,370][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>6}
[2018-12-26T10:22:42,376][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>6}
[2018-12-26T10:22:42,417][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost"]}
[2018-12-26T10:22:42,439][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2018-12-26T10:22:42,473][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2018-12-26T10:22:43,330][INFO ][logstash.inputs.file     ] No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"D:/logstash-6.5.4/data/plugins/inputs/file/.sincedb_bb5ff7ebd070422c5b611ac87e9e7087", :path=>["D:\\tls202_part01\\tls202_part01.csv"]}
[2018-12-26T10:22:43,390][INFO ][logstash.pipeline        ] Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x389cc614 run>"}
[2018-12-26T10:22:43,499][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2018-12-26T10:22:43,532][INFO ][filewatch.observingtail  ] START, creating Discoverer, Watch with file and sincedb collections
[2018-12-26T10:22:43,842][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
The CSV file is large, about 2 GB of data.
Also, Kibana shows no Elasticsearch data when I try to create an index pattern.

It seems Logstash is not finding your file. Change the backslashes in your path to forward slashes and see if that works:

path => "D:/tls202_part01/tls202_part01.csv"

Try commenting out small sections of the Logstash config file (mainly the filter) to see whether the rest works. You can also add sincedb_path => "NUL" to the file input in the same config file; since a sincedb has already been generated for this file (see the log above), Logstash thinks it has already read it and won't re-read it. It worked after changing the backslashes to forward slashes.
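Putting both suggestions together, the file input could look like this (a sketch based on the asker's config; `NUL` is the Windows null device, so the sincedb state is discarded and the file is re-read from the beginning on every run):

    input {
        file {
            path => "D:/tls202_part01/tls202_part01.csv"  # forward slashes, even on Windows
            start_position => "beginning"
            sincedb_path => "NUL"  # on Linux/macOS use "/dev/null" instead
        }
    }

Note that start_position => "beginning" only applies to files Logstash has not seen before; pointing sincedb_path at the null device ensures no read position is remembered between runs.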