
elasticsearch - How do I send data from an HTTP input to Elasticsearch using Logstash and the jdbc_streaming filter?


I want to use Logstash to send data from an HTTP input to Elasticsearch, and I want to enrich my data with the jdbc_streaming filter plugin. Here is my Logstash configuration:

input {
  http {
    id => "sensor_data_http_input"
    user => "sensor_data"
    password => "sensor_data"
  }
}

filter {
  jdbc_streaming {
    jdbc_driver_library => "E:\ElasticStack\mysql-connector-java-8.0.18\mysql-connector-java-8.0.18.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost:3306/sensor_metadata"
    jdbc_user => "elastic"
    jdbc_password => "hide"
    statement => "select st.sensor_type as sensorType, l.customer as customer, l.department as department, l.building_name as buildingName, l.room as room, l.floor as floor, l.location_on_floor as locationOnFloor, l.latitude, l.longitude from sensors s inner join sensor_type st on s.sensor_type_id=st.sensor_type_id inner join location l on s.location_id=l.location_id where s.sensor_id= :sensor_identifier"
    parameters => { "sensor_identifier" => "sensor_id"}
    target => "lookupResult"
  }
  mutate {
    rename => {"[lookupResult][0][sensorType]" => "sensorType"}
    rename => {"[lookupResult][0][customer]" => "customer"}
    rename => {"[lookupResult][0][department]" => "department"}
    rename => {"[lookupResult][0][buildingName]" => "buildingName"}
    rename => {"[lookupResult][0][room]" => "room"}
    rename => {"[lookupResult][0][floor]" => "floor"}
    rename => {"[lookupResult][0][locationOnFloor]" => "locationOnFloor"}

    add_field => {
        "location" => "%{[lookupResult][0][latitude]},%{[lookupResult][0][longitude]}"
    }

    remove_field => ["lookupResult", "headers", "host"]
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "sensor_data-%{+YYYY.MM.dd}"
    user => "elastic"
    password => "hide"
  }
}
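In plain terms, the filter block above promotes the columns of the first lookup row to top-level event fields and builds a `location` string from the coordinates. A minimal Python sketch of that transformation (the `enrich` function is hypothetical, for illustration only; it is not part of Logstash):

```python
# Rough sketch of what the mutate filter does to one event after
# jdbc_streaming has stored the query result (a list of row dicts)
# under "lookupResult". Field names mirror the config above.
def enrich(event):
    rows = event.pop("lookupResult", [])
    if rows:
        first = rows[0]
        # rename: promote columns of the first row to top-level fields
        for key in ("sensorType", "customer", "department", "buildingName",
                    "room", "floor", "locationOnFloor"):
            if key in first:
                event[key] = first[key]
        # add_field: combine the coordinates into a "lat,lon" string
        event["location"] = f"{first['latitude']},{first['longitude']}"
    # remove_field: drop transport metadata
    for key in ("headers", "host"):
        event.pop(key, None)
    return event
```

Running an event through this function mirrors what a successfully registered pipeline would emit to the `sensor_data-*` index.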
But when I start Logstash, I see the following error:

[2020-01-09T22:57:16,260]
[ERROR][logstash.javapipeline]
[main] Pipeline aborted due to error {
    :pipeline_id=>"main",
    :exception=>#<TypeError: failed to coerce jdk.internal.loader.ClassLoaders$AppClassLoader to java.net.URLClassLoader>,
    :backtrace=>[
        "org/jruby/java/addons/KernelJavaAddons.java:29:in `to_java'",
        "E:/ElasticStack/Logstash/logstash-7.4.1/vendor/bundle/jruby/2.5.0/gems/logstash-filter-jdbc_streaming-1.0.7/lib/logstash/plugin_mixins/jdbc_streaming.rb:48:in `prepare_jdbc_connection'",
        "E:/ElasticStack/Logstash/logstash-7.4.1/vendor/bundle/jruby/2.5.0/gems/logstash-filter-jdbc_streaming-1.0.7/lib/logstash/filters/jdbc_streaming.rb:200:in `prepare_connected_jdbc_cache'",
        "E:/ElasticStack/Logstash/logstash-7.4.1/vendor/bundle/jruby/2.5.0/gems/logstash-filter-jdbc_streaming-1.0.7/lib/logstash/filters/jdbc_streaming.rb:116:in `register'",
        "org/logstash/config/ir/compiler/AbstractFilterDelegatorExt.java:56:in `register'",
        "E:/ElasticStack/Logstash/logstash-7.4.1/logstash-core/lib/logstash/java_pipeline.rb:195:in `block in register_plugins'",
        "org/jruby/RubyArray.java:1800:in `each'",
        "E:/ElasticStack/Logstash/logstash-7.4.1/logstash-core/lib/logstash/java_pipeline.rb:194:in `register_plugins'",
        "E:/ElasticStack/Logstash/logstash-7.4.1/logstash-core/lib/logstash/java_pipeline.rb:468:in `maybe_setup_out_plugins'",
        "E:/ElasticStack/Logstash/logstash-7.4.1/logstash-core/lib/logstash/java_pipeline.rb:207:in `start_workers'",
        "E:/ElasticStack/Logstash/logstash-7.4.1/logstash-core/lib/logstash/java_pipeline.rb:149:in `run'",
        "E:/ElasticStack/Logstash/logstash-7.4.1/logstash-core/lib/logstash/java_pipeline.rb:108:in `block in start'"],
    :thread=>"#<Thread:0x17fa8113 run>"
}
[2020-01-09T22:57:16,598]
[ERROR][logstash.agent] Failed to execute action {
    :id=>:main,
    :action_type=>LogStash::ConvergeResult::FailedAction,
    :message=>"Could not execute action: PipelineAction::Create<main>, action_result: false",
    :backtrace=>nil
}

I am enriching my HTTP input with some data from a MySQL database, but Logstash will not even start.

I see two potential problems, but you will need to check which one is actually at play here:

  • Class-loader problems can occur when the latest JDBC driver is combined with newer JDK versions; there are several issues on GitHub around this. Put the driver under
    /vendor/jar/jdbc/
    inside the Logstash folder (you need to create this folder first). If that does not work, move the driver into
    /logstash-core/lib/jars
    and do not provide any driver path in the config file:
    jdbc_driver_library => ""

  • Remove the jdbc_driver_library option from the config file entirely and, as above, set jdbc_driver_class to com.mysql.cj.jdbc.Driver
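Assuming the driver jar has been moved under logstash-core/lib/jars, the two affected options in the filter would then look like this (a sketch of just those settings, not the full config):

```
jdbc_streaming {
  # driver is picked up from logstash-core/lib/jars, so no path here
  jdbc_driver_library => ""
  # Connector/J 8.x driver class
  jdbc_driver_class => "com.mysql.cj.jdbc.Driver"
  # ... remaining options unchanged ...
}
```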

    Do you have an output defined? Which version of the jdbc_streaming plugin are you using? I have seen some chatter about versions earlier than 1.0.7 having problems on the Java side. If you are on one of those, try upgrading to the latest version.

    @IanGabes I think this is the latest version of jdbc_streaming, since it is installed in Logstash by default and my Logstash is the latest version.

    Glad to hear the answer I accepted earlier helped you solve this ;)

    @Ali - it looks like your answer does not add anything that was not already stated. Did you mean to post a new answer and accept it?