
"Translation missing" error in Logstash logs when loading a CSV into Elasticsearch


I am trying to load data from a CSV file into Elasticsearch using Logstash. My Logstash config file looks like this:

input {
    file {
        path => "C:\Users\shreya\Data\RetailData.csv"
        start_position => "beginning" 
        #sincedb_path => "C:\Users\shreya\null"

    }
}
filter {
    csv {
        separator => ","
        id => "Store_ID"
        columns => ["Store","Date","Temperature","Fuel_Price", "MarkDown1", "MarkDown2", "MarkDown3", "MarkDown4", "CPI", "Unemployment", "IsHoliday"]
    }
    mutate {convert => ["Store", "integer"]}
    mutate {convert => ["Date", "date"]}
    mutate {convert => ["Temperature", "float"]}
    mutate {convert => ["Fuel_Price", "float"]}
    mutate {convert => ["CPI", "float"]}
    mutate {convert => ["Unemployment", "float"]}


}
output {
    elasticsearch {
        action => "index"
        hosts => "localhost:9200" 
        index => "store" 
        document_type => "store_retail"     
    }
    stdout {} 
    #stdout {
  #       codec => rubydebug
  #}
}
But I am getting an error and cannot find a way around it. I am new to this. My error log looks like this:

[2017-12-02T15:56:38,150][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"fb_apache", :directory=>"C:/Users/shreya/logstash-6.0.0/modules/fb_apache/configuration"}
[2017-12-02T15:56:38,165][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"netflow", :directory=>"C:/Users/shreya/logstash-6.0.0/modules/netflow/configuration"}
[2017-12-02T15:56:38,243][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2017-12-02T15:56:39,117][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2017-12-02T15:56:42,965][WARN ][logstash.outputs.elasticsearch] You are using a deprecated config setting "document_type" set in elasticsearch. Deprecated settings will continue to work, but are scheduled for removal from logstash in the future. Document types are being deprecated in Elasticsearch 6.0, and removed entirely in 7.0. You should avoid this feature If you have any questions about this, please visit the #logstash channel on freenode irc. {:name=>"document_type", :plugin=><LogStash::Outputs::ElasticSearch action=>"index", hosts=>["localhost:9200"], index=>"store", document_type=>"store_retail", id=>"91a4406a13e9377abb312acf5f6be8e609a685f9c84a5906af957e956119798c">}
[2017-12-02T15:56:43,604][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2017-12-02T15:56:43,604][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://localhost:9200/, :path=>"/"}
[2017-12-02T15:56:43,854][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2017-12-02T15:56:43,932][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2017-12-02T15:56:43,933][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2017-12-02T15:56:43,964][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost:9200"]}
[2017-12-02T15:56:44,011][ERROR][logstash.pipeline        ] Error registering plugin {:pipeline_id=>"main", :plugin=>"#<LogStash::FilterDelegator:0x3e4985f1 @metric_events_out=org.jruby.proxy.org.logstash.instrument.metrics.counter.LongCounter$Proxy2 - namespace: [stats, pipelines, main, plugins, filters, e3501f879986420bd95a59d8a1c006d9bc4351a481c96bd5366e7edb54bc6fbb, events] key: out value:0, @metric_events_in=org.jruby.proxy.org.logstash.instrument.metrics.counter.LongCounter$Proxy2 - namespace: [stats, pipelines, main, plugins, filters, e3501f879986420bd95a59d8a1c006d9bc4351a481c96bd5366e7edb54bc6fbb, events] key: in value:0, @logger=#<LogStash::Logging::Logger:0x48eebcf8 @logger=#<Java::OrgApacheLoggingLog4jCore::Logger:0x113b0d16>>, @metric_events_time=org.jruby.proxy.org.logstash.instrument.metrics.counter.LongCounter$Proxy2 - namespace: [stats, pipelines, main, plugins, filters, e3501f879986420bd95a59d8a1c006d9bc4351a481c96bd5366e7edb54bc6fbb, events] key: duration_in_millis value:0, @id=\"e3501f879986420bd95a59d8a1c006d9bc4351a481c96bd5366e7edb54bc6fbb\", @klass=LogStash::Filters::Mutate, @metric_events=#<LogStash::Instrument::NamespacedMetric:0x7c8acc8 @metric=#<LogStash::Instrument::Metric:0x3afcd9b5 @collector=#<LogStash::Instrument::Collector:0x73e63041 @agent=nil, @metric_store=#<LogStash::Instrument::MetricStore:0x60e51f03 @store=#<Concurrent::Map:0x00000000000fb0 entries=3 default_proc=nil>, @structured_lookup_mutex=#<Mutex:0x2209413b>, @fast_lookup=#<Concurrent::Map:0x00000000000fb4 entries=86 default_proc=nil>>>>, @namespace_name=[:stats, :pipelines, :main, :plugins, :filters, :e3501f879986420bd95a59d8a1c006d9bc4351a481c96bd5366e7edb54bc6fbb, :events]>, @filter=<LogStash::Filters::Mutate convert=>{\"Date\"=>\"date\"}, id=>\"e3501f879986420bd95a59d8a1c006d9bc4351a481c96bd5366e7edb54bc6fbb\", enable_metric=>true, periodic_flush=>false>>", :error=>"translation missing: en.logstash.agent.configuration.invalid_plugin_register", 
:thread=>"#<Thread:0x3cc2461b@C:/Users/shreya/logstash-6.0.0/logstash-core/lib/logstash/pipeline.rb:290 run>"}
[2017-12-02T15:56:44,042][ERROR][logstash.pipeline        ] Pipeline aborted due to error {:pipeline_id=>"main", :exception=>#<LogStash::ConfigurationError: translation missing: en.logstash.agent.configuration.invalid_plugin_register>, :backtrace=>["C:/Users/shreya/logstash-6.0.0/vendor/bundle/jruby/2.3.0/gems/logstash-filter-mutate-3.1.6/lib/logstash/filters/mutate.rb:186:in `block in register'", "org/jruby/RubyHash.java:1343:in `each'", "C:/Users/shreya/logstash-6.0.0/vendor/bundle/jruby/2.3.0/gems/logstash-filter-mutate-3.1.6/lib/logstash/filters/mutate.rb:184:in `register'", "C:/Users/shreya/logstash-6.0.0/logstash-core/lib/logstash/pipeline.rb:388:in `register_plugin'", "C:/Users/shreya/logstash-6.0.0/logstash-core/lib/logstash/pipeline.rb:399:in `block in register_plugins'", "org/jruby/RubyArray.java:1734:in `each'", "C:/Users/shreya/logstash-6.0.0/logstash-core/lib/logstash/pipeline.rb:399:in `register_plugins'", "C:/Users/shreya/logstash-6.0.0/logstash-core/lib/logstash/pipeline.rb:801:in `maybe_setup_out_plugins'", "C:/Users/shreya/logstash-6.0.0/logstash-core/lib/logstash/pipeline.rb:409:in `start_workers'", "C:/Users/shreya/logstash-6.0.0/logstash-core/lib/logstash/pipeline.rb:333:in `run'", "C:/Users/shreya/logstash-6.0.0/logstash-core/lib/logstash/pipeline.rb:293:in `block in start'"], :thread=>"#<Thread:0x3cc2461b@C:/Users/shreya/logstash-6.0.0/logstash-core/lib/logstash/pipeline.rb:290 run>"}
[2017-12-02T15:56:44,058][ERROR][logstash.agent           ] Failed to execute action {:id=>:main, :action_type=>LogStash::ConvergeResult::FailedAction, :message=>"Could not execute action: LogStash::PipelineAction::Create/pipeline_id:main, action_result: false", :backtrace=>nil}

The problem comes from the convert target in one of your mutate filters. From the mutate filter documentation:

Valid conversion targets are: integer, float, string, and boolean.

Since "date" is not one of them, this is the part causing the crash:

mutate {convert => ["Date", "date"]}

If you want to transform a string into a date, you have to use the date filter instead.
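For example, you could replace the failing mutate line with a date filter block like the sketch below. The dd-MM-yyyy pattern is only an assumption here; adjust the match pattern to whatever format the Date column in your CSV actually uses:

```
filter {
    date {
        # Parse the "Date" string produced by the csv filter.
        # The pattern below is an assumed example -- replace it
        # with the actual format of your data.
        match  => ["Date", "dd-MM-yyyy"]
        # Store the result back into "Date" instead of the
        # default @timestamp field.
        target => "Date"
    }
}
```

With target set, the original string field is overwritten with a proper date value; without it, the parsed date would go to @timestamp.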

You can validate your configuration file with the command below, which prints the error details without starting the pipeline:

./logstash -f /etc/logstash/conf.d/your_config_file.conf --config.test_and_exit
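Since your logs show a Windows install under C:/Users/shreya/logstash-6.0.0, the equivalent invocation there would be something like the following (the config file path is an assumption; point it at your actual file):

```
bin\logstash.bat -f C:\Users\shreya\logstash.conf --config.test_and_exit
```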