
Logstash configuration for storing XML logs in Elasticsearch


I'm new to Elasticsearch and Logstash. I want to create a Logstash configuration file that loads data from an XML file into Elasticsearch, so that I can search it with Kibana. How do I create this configuration?

The XML file structure is:

<?xml version="1.0" encoding="ISO-8859-15"?>

<ORDERS>
<ORDER>
<COMPANY_CODE>CHU</COMPANY_CODE>
<ETABLISSEMENET_CODE>CHU</ETABLISSEMENET_CODE>
<FOURNISSEUR>BI</FOURNISSEUR>
<DESTINATAIRE>CHUSUDRUN2</DESTINATAIRE>
<NUM_COMMANDE_MYTOWER>342</NUM_COMMANDE_MYTOWER>
<NUM_COMMANDE_CHU>CMD12345</NUM_COMMANDE_CHU>
<INSTRUCTIONS>COLIS</INSTRUCTIONS>
<ETAT>4</ETAT>
<DATE_DE_COMMANDE>01-01-2018</DATE_DE_COMMANDE>
<DATE_DE_DISPONIBILITE>01-01-2018</DATE_DE_DISPONIBILITE>
<MONTANT_HT>3695.0</MONTANT_HT>
<DATE_DE_CREATION></DATE_DE_CREATION>
<POIDS_BRUT>20.0</POIDS_BRUT>
<NOMBRE_COLIS>3</NOMBRE_COLIS>


Below is an example Logstash configuration for XML input:

input {
  file {
    path => "/home/Test_xml.xml"
    start_position => "beginning"
    sincedb_path => "/dev/null"
    # Join all lines after the XML declaration into a single event,
    # so the whole document can be parsed at once.
    codec => multiline {
      pattern => "^<\?xml"
      negate => true
      what => "previous"
    }
  }
}

filter {
  xml {
    source => "message"
    target => "parsed"
  }
  # One event per <book> element.
  split {
    field => "[parsed][book]"
    add_field => {
      "bookAuthor"   => "%{[parsed][book][author]}"
      "title"        => "%{[parsed][book][title]}"
      "genre"        => "%{[parsed][book][genre]}"
      "price"        => "%{[parsed][book][price]}"
      "publish_date" => "%{[parsed][book][publish_date]}"
      "description"  => "%{[parsed][book][description]}"
    }
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "xml_test"
  }
  stdout {
    codec => rubydebug
  }
}
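The example above targets a `book` XML document rather than your `ORDERS` structure. A minimal sketch adapted to the sample in the question might look like the following. The element names are taken from your sample; the date pattern `dd-MM-yyyy` and the index name `orders_xml` are assumptions, and the `[0]` indices account for the xml filter wrapping element values in arrays by default (`force_array`):

```
input {
  file {
    path => "/home/Test_xml.xml"
    start_position => "beginning"
    sincedb_path => "/dev/null"
    # Collect the whole file into one event so the full
    # <ORDERS> document is available to the xml filter.
    codec => multiline {
      pattern => "^<\?xml"
      negate => true
      what => "previous"
    }
  }
}

filter {
  xml {
    source => "message"
    target => "parsed"
  }
  # One event per <ORDER> element.
  split {
    field => "[parsed][ORDER]"
  }
  mutate {
    add_field => {
      "company_code"  => "%{[parsed][ORDER][COMPANY_CODE][0]}"
      "fournisseur"   => "%{[parsed][ORDER][FOURNISSEUR][0]}"
      "num_commande"  => "%{[parsed][ORDER][NUM_COMMANDE_CHU][0]}"
      "montant_ht"    => "%{[parsed][ORDER][MONTANT_HT][0]}"
      "date_commande" => "%{[parsed][ORDER][DATE_DE_COMMANDE][0]}"
    }
  }
  # The sample dates (01-01-2018) appear to be dd-MM-yyyy -- an assumption.
  date {
    match => ["date_commande", "dd-MM-yyyy"]
    target => "date_commande"
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "orders_xml"
  }
  stdout { codec => rubydebug }
}
```

Verify the field mapping with `stdout { codec => rubydebug }` first, then check the `orders_xml` index in Kibana.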

I inserted data into Elasticsearch with Logstash this way a long time back. Hope this works for you.