Moving data from MySQL to Elastic using Logstash


I am trying to use Logstash to move a MySQL table into Elastic. The table contains about one million records and has two columns of type DATE, with values such as 2002-09-17, 2014-07-21, and 2004-11-02.

Here is the logstash.config file:

input {
  jdbc {
    jdbc_connection_string => "jdbc:mysql://***/MyDB?zeroDateTimeBehavior=convertToNull"
    jdbc_user => "myuser"
    jdbc_password => "mypass"
    jdbc_driver_library => "C:\Program Files (x86)\MySQL\Connector J 5.1\mysql-connector-java-5.1.49.jar"
    jdbc_driver_class => "com.mysql.cj.jdbc.Driver"
    statement => "select * from mydata"
  }
}
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "properties"
    document_type => "data"
    document_id => "%{mydataId}"
  }
  stdout { codec => json_lines }
}
When I run this file, Logstash starts and keeps loading records into Elastic, but it stops at certain records and shows this exception:

Exception when executing JDBC query 
{:exception=>"Java::OrgJodaTime::IllegalInstantException: 
Illegal instant due to time zone offset transition (daylight savings time 'gap'): 
2001-04-27T00:00:00.000 (Africa/Cairo)"}
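The message itself points at a daylight-saving gap: Egypt switched to DST at midnight on 2001-04-27, so the local time 2001-04-27T00:00:00 never existed in Africa/Cairo, and Joda-Time refuses to build that instant when the jdbc input converts the DATE value to a timestamp in the JVM's local zone. One commonly suggested workaround (a sketch, not guaranteed for every driver version) is the jdbc input's `jdbc_default_timezone` option, which makes the plugin interpret the values in a zone with no such gap:

```
input {
  jdbc {
    # ... connection settings as above ...
    # Interpret DATE/TIMESTAMP values as UTC instead of the JVM's
    # local zone (Africa/Cairo), which has DST gaps at local midnight.
    jdbc_default_timezone => "UTC"
  }
}
```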
I tried googling it but could not find a solution. I don't know why it shows this error; I don't need the time at all, only the date.

Should I add anything to the Logstash configuration to solve this? Or, if I need to skip the rows that trigger the error, is there a way to continue and skip the error (or log it somewhere) instead of stopping the whole push process?
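Since only the date portion matters here, another way to sidestep the timezone conversion entirely is to have MySQL return the dates as plain strings, so the jdbc input never builds a Joda instant for them. A sketch, where `date_col1` and `date_col2` are placeholders for the real DATE column names:

```
statement => "SELECT t.*,
                     DATE_FORMAT(t.date_col1, '%Y-%m-%d') AS date_col1_str,
                     DATE_FORMAT(t.date_col2, '%Y-%m-%d') AS date_col2_str
              FROM mydata t"
```

The `_str` fields then arrive in Elasticsearch as strings like `2001-04-27`, which can still be mapped as `date` on the index side without any local-time conversion on the Logstash side.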