Logstash: how to automatically generate Grok patterns using LogMine


I am trying to automatically generate Grok patterns using LogMine.

Log sample:

Error   IGXL    error [Slot 2, Chan 16, Site 0] HSDMPI:0217 : TSC3 Fifo Edge EG0-7 Underflow.  Please check the timing programming.  Edge events should be fired in the sequence and the time between two edges should be more than 2 MOSC ticks.    
Error   IGXL    error [Slot 2, Chan 18, Site 0] HSDMPI:0217 : TSC3 Fifo Edge EG0-7 Underflow.  Please check the timing programming.  Edge events should be fired in the sequence and the time between two edges should be more than 2 MOSC ticks.    
For the logs above, I get the following pattern:

re.compile('^(?P<Event>.*?)\\s+(?P<Tester>.*?)\\s+(?P<State>.*?)\\s+(?P<Slot>.*?)\\s+(?P<Instrument>.*?)\\s+(?P<Content1>.*?):\\s+(?P<Content>.*?)$') 
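Since LogMine emits an ordinary Python regex with named groups, each `(?P<Name>.*?)` group maps directly onto Grok's `%{DATA:Name}`. As a starting point, here is a minimal, hypothetical converter (not part of LogMine) that rewrites such a pattern into a Grok expression:

```python
import re

# Hypothetical helper (not part of LogMine): turn a LogMine-generated
# regex with named groups into a Grok expression by mapping each
# (?P<Name>.*?) capture group to %{DATA:Name}.  The \s+ separators are
# kept as-is, because Grok accepts raw regex between pattern references.
def regex_to_grok(pattern: str) -> str:
    body = pattern.lstrip('^').rstrip('$')  # drop anchors; Grok matches whole lines
    return re.sub(r'\(\?P<(\w+)>\.\*\?\)', r'%{DATA:\1}', body)

logmine_pattern = (
    r'^(?P<Event>.*?)\s+(?P<Tester>.*?)\s+(?P<State>.*?)\s+(?P<Slot>.*?)\s+'
    r'(?P<Instrument>.*?)\s+(?P<Content1>.*?):\s+(?P<Content>.*?)$'
)

print(regex_to_grok(logmine_pattern))
# %{DATA:Event}\s+%{DATA:Tester}\s+%{DATA:State}\s+%{DATA:Slot}\s+%{DATA:Instrument}\s+%{DATA:Content1}:\s+%{DATA:Content}
```

This only covers the `.*?` groups LogMine produces at level 1; field names would still need to be cleaned up by hand where the tokenizer splits values like `[Slot 2,` across groups.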
Code (LogMine is imported from the following link):

This code only produces a parsed CSV file; I want it to generate a Grok pattern that I can later use in Logstash to parse the logs.

Awesome idea! But I don't know how to do it. I would really like to find a way to replicate this functionality in the Elastic Stack or Grafana Loki.
%{LOGLEVEL:level}    *%{DATA:Instrument} %{LOGLEVEL:State} \[%{DATA:slot} %{DATA:slot} %{DATA:channel} %{DATA:channel} %{DATA:Site}] %{DATA:Tester} : %{DATA:Content}    
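For reference, a pattern like the one above could be dropped into a Logstash pipeline roughly as follows. This is a minimal sketch, not a tested configuration: the `message` field name is the Logstash default, and the typed fields (`INT` for slot, channel, and site) are assumptions based on the sample logs.

```
filter {
  grok {
    match => {
      "message" => "%{LOGLEVEL:level}\s+%{DATA:Instrument}\s+%{LOGLEVEL:State}\s+\[Slot %{INT:slot}, Chan %{INT:channel}, Site %{INT:site}\] %{DATA:Tester} : %{GREEDYDATA:Content}"
    }
  }
}
```

Matching the literal tokens `Slot`, `Chan`, and `Site` avoids the duplicated-field problem in the auto-generated pattern, where the words and their values land in separate `%{DATA:...}` captures.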
import sys    
import os    
sys.path.append('../')    
import LogMine    

input_dir  = r'E:\LogMine\LogMine'  # The input directory of the log file (raw string avoids invalid escape sequences)
output_dir = r'E:\LogMine\LogMine/output/'  # The output directory for parsing results
log_file   = r'E:\LogMine\LogMine/log_teradyne.txt'  # The input log file name
log_format = '<Event> <Tester> <State> <Slot> <Instrument><content> <contents> <context> <desc> <junk> '  # Log format: field names for the tokens in each line
levels     =1 # The levels of hierarchy of patterns     
max_dist   =0.001 # The maximum distance between any log message in a cluster and the cluster representative     
k          =1 # The message distance weight (default: 1)     
regex      =[]  # Regular expression list for optional preprocessing (default: [])     

print(os.getcwd())    
parser = LogMine.LogParser(input_dir, output_dir, log_format, rex=regex, levels=levels, max_dist=max_dist, k=k)     
parser.parse(log_file)
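To sanity-check what the generated pattern actually captures, you can apply it to one of the sample lines directly in Python. This quick sketch shows why the field names need manual review: the all-`.*?` groups split `[Slot 2,` across the `Slot` and `Instrument` fields.

```python
import re

# The pattern LogMine produced for the sample logs above.
pattern = re.compile(
    r'^(?P<Event>.*?)\s+(?P<Tester>.*?)\s+(?P<State>.*?)\s+(?P<Slot>.*?)\s+'
    r'(?P<Instrument>.*?)\s+(?P<Content1>.*?):\s+(?P<Content>.*?)$'
)

# One of the sample log lines (shortened).
line = ('Error   IGXL    error [Slot 2, Chan 16, Site 0] HSDMPI:0217 : '
        'TSC3 Fifo Edge EG0-7 Underflow.')

m = pattern.match(line)
for field, value in m.groupdict().items():
    print(f'{field:10} = {value!r}')
```

Note that `Slot` comes out as the literal token `'[Slot'` and `Instrument` as `'2,'`, because LogMine names the positional tokens rather than the semantic values.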