
Logstash not creating index in Elasticsearch


It is not showing any error on the console. Below is the command I am running (on Windows 10) -
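
logstash --verbose -f logstash-sample.conf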

This is my logstash-sample.conf file -

input {
  file {
    path => "C:\Users\17739\Documents\IIT\CSP586\tutorial\project\ChicagoSocialHub\backend-build-divvy-status\divvy_stations_status.csv"
    start_position => "beginning"
  }
}

filter {
  csv {
    separator => ","
    columns => ["altitude", "availableBikes", "availableDocks", "city", "id", "is_renting", "kioskType", "landMark", "lastCommunicationTime", "latitude", "location", "longitude", "postalCode", "stAddress1", "stAddress2", "stationName", "status", "statusKey", "statusValue", "testStation", "totalDocks"]
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "divvy_stations_status"
    document_type => "status"
  }
  stdout {
    codec => rubydebug
  }
}

This is the output on the Logstash console -

C:\Users\17739\Documents\IIT\CSP586\logstash-6.6.2\bin>logstash --verbose -f logstash-sample.conf
Sending Logstash logs to C:/Users/17739/Documents/IIT/CSP586/logstash-6.6.2/logs which is now configured via log4j2.properties
[2019-03-17T12:56:36,728][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2019-03-17T12:56:36,745][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"6.6.2"}
[2019-03-17T12:56:41,603][WARN ][logstash.outputs.elasticsearch] You are using a deprecated config setting "document_type" set in elasticsearch. Deprecated settings will continue to work, but are scheduled for removal from logstash in the future. Document types are being deprecated in Elasticsearch 6.0, and removed entirely in 7.0. You should avoid this feature If you have any questions about this, please visit the #logstash channel on freenode irc. {:name=>"document_type", :plugin=><LogStash::Outputs::ElasticSearch index=>"divvy_stations_status", id=>"f84c43181aab6f7bf9e89c0412ada5b5ead116534f6661194800152751a28e87", hosts=>[//localhost:9200], document_type=>"status", enable_metric=>true, codec=><LogStash::Codecs::Plain id=>"plain_1264be19-323c-4896-8214-929f15a74251", enable_metric=>true, charset=>"UTF-8">, workers=>1, manage_template=>true, template_name=>"logstash", template_overwrite=>false, doc_as_upsert=>false, script_type=>"inline", script_lang=>"painless", script_var_name=>"event", scripted_upsert=>false, retry_initial_interval=>2, retry_max_interval=>64, retry_on_conflict=>1, ilm_enabled=>false, ilm_rollover_alias=>"logstash", ilm_pattern=>"{now/d}-000001", ilm_policy=>"logstash-policy", action=>"index", ssl_certificate_verification=>true, sniffing=>false, sniffing_delay=>5, timeout=>60, pool_max=>1000, pool_max_per_route=>100, resurrect_delay=>5, validate_after_inactivity=>10000, http_compression=>false>}
[2019-03-17T12:56:43,234][INFO ][logstash.pipeline        ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>8, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50}
[2019-03-17T12:56:43,548][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2019-03-17T12:56:43,695][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2019-03-17T12:56:43,735][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>6}
[2019-03-17T12:56:43,739][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>6}
[2019-03-17T12:56:43,768][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost:9200"]}
[2019-03-17T12:56:43,782][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2019-03-17T12:56:43,801][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2019-03-17T12:56:44,323][INFO ][logstash.inputs.file     ] No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"C:/Users/17739/Documents/IIT/CSP586/logstash-6.6.2/data/plugins/inputs/file/.sincedb_6f34c293ff88e0ad3c31e4a0f32e43d9", :path=>["C:\\Users\\17739\\Documents\\IIT\\CSP586\\tutorial\\project\\ChicagoSocialHub\\backend-build-divvy-status\\divvy_stations_status.csv"]}
[2019-03-17T12:56:44,369][INFO ][logstash.pipeline        ] Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x2e991954 run>"}
[2019-03-17T12:56:44,440][INFO ][filewatch.observingtail  ] START, creating Discoverer, Watch with file and sincedb collections
[2019-03-17T12:56:44,441][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2019-03-17T12:56:44,786][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}

After running the logstash --verbose -f logstash-sample.conf command, my expectation was that this new index, divvy_stations_status, would be visible among the Elasticsearch indices (I also checked with Kibana, and it does not show up there either). Is that a valid expectation?


The solution was to use forward slashes (/) instead of backslashes (\) in the file path. This is odd, since on Windows we always use the backslash as the path separator. What is even stranger is that it never threw any error like "file not found".

So, in the logstash-sample.conf file, use the following -

path => "C:/Users/17739/Documents/IIT/CSP586/tutorial/project/ChicagoSocialHub/backend-build-divvy-status/divvy_stations_status.csv" 
@萨皮

When you try to run Logstash on the Windows platform, you must change the file path to

C:/Users/17739/Documents/IIT/CSP586/tutorial/project/ChicagoSocialHub/backend-build-divvy-status/divvy_stations_status.csv

That is, replace the backslashes with forward slashes.

Secondly, you need to specify


path => "C:/Users/17739/Documents/IIT/CSP586/tutorial/project/ChicagoSocialHub/backend-build-divvy-status/divvy_stations_status.csv"