How do I get the values inside [] into a .csv file using Logstash?

I want to send the log values inside [] to a .csv file, with no "" around the text:

input {
  file {
    path => "D:\logstash-5.1.1\logstash-5.1.1\bin\slowlog.log"
    start_position => "beginning"
  }
}

filter {
  grok {
    match => ["message", "\[(?<TIMESTAMP>[^\]]*)\][^\[\]]*\[(?<LEVEL>[^\]]*)\][^\[\]]*\[(?<QUERY>[^\]]*)\][^\[\]]*\[(?<QUERY1>[^\]]*)\][^\[\]]*\[(?<INDEX-NAME>[^\]]*)\][^\[\]]*\[(?<SHARD>[^\]]*)\][^\[\]]*\[(?<TOOK>[^\]]*)\][^\[\]]*\[(?<TOOKM>[^\]]*)\][^\[\]]*\[(?<types>[^\]]*)\][^\[\]]*\[(?<stats>[^\]]*)\][^\[\]]*\[(?<search_type>[^\]]*)\][^\[\]]*\[(?<total_shards>[^\]]*)\][^\[\]]*\[(?<source_query>[^\]]*)\][^\[\]]*\[(?<extra_source>[^\]]*)\][^\[\]]*,"]
  }
}

output {
  csv {
    fields => ["TIMESTAMP","LEVEL","QUERY","QUERY1","INDEX-NAME","SHARD","TOOK","TOOKM","types","stats","search_type","total_shards","source_query","extra_source"]
    path => "D:\logstash-5.1.1\logstash-5.1.1\bin\output.csv"
    spreadsheet_safe => false
  }
}
But what I want is:

2017-01-13 12:58:09,843    --column 1
WARN                       --column 2
index.search.slowlog.query --column 3
Spectra                    --column 4
testindex-stats            --column 5
2                          --column 6
15.3ms                     --column 7
and so on...
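The grok pattern above boils down to capturing everything between each `[ ... ]` pair. As a quick sanity check of that idea outside Logstash, here is a minimal Python sketch; the sample line is reconstructed from the desired columns listed above and may not match the real Elasticsearch slowlog layout exactly:

```python
import re

# Hypothetical slowlog line modeled on the desired columns in the question.
line = ('[2017-01-13 12:58:09,843][WARN][index.search.slowlog.query]'
        '[Spectra][testindex-stats][2][15.3ms]')

# Same idea as the grok pattern: capture the text between each [ ... ] pair.
values = re.findall(r'\[([^\]]*)\]', line)
for i, v in enumerate(values, start=1):
    print(f'{v}    --column {i}')
```

Each captured group maps to one CSV column, which is exactly what the `fields` list in the csv output expects.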

I think this is an issue in Logstash 5.1.1.

You can use a grok pattern like this:

filter {
  grok {
    match => ["message", "\[(?<field1>[^\]]*)\][^\[\]]*\[(?<field2>[^\]]*)\][^\[\]]*\[(?<field3>[^\]]*)\][^\[\]]*\[(?<field4>[^\]]*)\][^\[\]]*\[(?<field5>[^\]]*)\][^\[\]]*\[(?<field6>[^\]]*)\][^\[\]]*\[(?<field7>[^\]]*)\][^\[\]]*\[(?<field8>[^\]]*)\][^\[\]]*\[(?<field9>[^\]]*)\][^\[\]]*\[(?<field10>[^\]]*)\][^\[\]]*\[(?<field11>[^\]]*)\][^\[\]]*\[(?<field12>[^\]]*)\][^\[\]]*\[(?<field13>[^\]]*)\][^\[\]]*\[(?<field14>[^\]]*)\][^\[\]]*"]
  }
}
output {
  csv {
    fields => ["field1","field2","field3","field4","field5","field6","field7","field8","field9","field10","field11","field12","field13","field14"]
    path => "D:/logstash-5.1.1/logstash-5.1.1/bin/output.csv"
  }
}

Comments:

- The problem isn't the delimiter; see this question to understand why.
- Thanks for the code, but this is a Logstash 5.1.1 issue; I updated my question.
- If I change the file name to unix style and run it on a mac there is no syntax error, so I'm not sure what problem you're hitting. Also, when you update the fields on the csv output to a specific list of names, you have to update the pattern at the top so the grok pattern uses those same names.
- I updated the question with the changed grok pattern names, but the output is the same, no change.
- Change \ to / in the file name. You want to write it like "D:/logstash-5.1.1/logstash-5.1.1/bin/output.csv".
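To illustrate the commenters' forward-slash suggestion: converting the Windows-style path from the question is a simple separator swap, and forward slashes avoid any chance of the backslashes being interpreted as escape sequences inside a double-quoted Logstash config string. A minimal sketch (the path is the one from the question):

```python
# Windows-style path as written in the question (backslashes doubled
# here only because this is a Python string literal).
windows_style = "D:\\logstash-5.1.1\\logstash-5.1.1\\bin\\output.csv"

# The unix-style form suggested in the comments.
unix_style = "D:/logstash-5.1.1/logstash-5.1.1/bin/output.csv"

# Converting is a simple separator replacement.
print(windows_style.replace("\\", "/") == unix_style)
```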