
Logstash with Filebeat error: Could not execute action


Hi, I am trying to set up log analysis with Filebeat and Logstash. Below are the changes I made:

filebeat.inputs:
- type: log
  enabled: true
  paths:
    - D:\elasticsearch-5.4.3\elasticsearch-5.4.3\logs\elasticsearch.log

output.logstash:
  # The Logstash hosts
  hosts: ["localhost:5044"]
This is my Logstash config file:

input {
  beats {
    port => 5044
  }
}

filter {
  grok {
    match => { "message" => "%{plugins}" }
  }
  date {
    match => [ "timestamp", "yyyy-MM-DD:HH:mm:ss" ]
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
}
When I run this, I see the following error:

[2019-10-22T06:07:32,915][ERROR][logstash.javapipeline    ] Pipeline aborted due to error {:pipeline_id=>"main", :exception=>#<Grok::PatternError: pattern %{plugins} not defined>, :backtrace=>["D:/logstash-7.1.0/logstash-7.1.0/vendor/bundle/jruby/2.5.0/gems/jls-grok-0.11.5/lib/grok-pure.rb:123:in `block in compile'", "org/jruby/RubyKernel.java:1425:in `loop'", "D:/logstash-7.1.0/logstash-7.1.0/vendor/bundle/jruby/2.5.0/gems/jls-grok-0.11.5/lib/grok-pure.rb:93:in `compile'", "D:/logstash-7.1.0/logstash-7.1.0/vendor/bundle/jruby/2.5.0/gems/logstash-filter-grok-4.0.4/lib/logstash/filters/grok.rb:281:in `block in register'", "org/jruby/RubyArray.java:1792:in `each'", "D:/logstash-7.1.0/logstash-7.1.0/vendor/bundle/jruby/2.5.0/gems/logstash-filter-grok-4.0.4/lib/logstash/filters/grok.rb:275:in `block in register'", "org/jruby/RubyHash.java:1419:in `each'", "D:/logstash-7.1.0/logstash-7.1.0/vendor/bundle/jruby/2.5.0/gems/logstash-filter-grok-4.0.4/lib/logstash/filters/grok.rb:270:in `register'", "org/logstash/config/ir/compiler/AbstractFilterDelegatorExt.java:56:in `register'", "D:/logstash-7.1.0/logstash-7.1.0/logstash-core/lib/logstash/java_pipeline.rb:191:in `block in register_plugins'", "org/jruby/RubyArray.java:1792:in `each'", "D:/logstash-7.1.0/logstash-7.1.0/logstash-core/lib/logstash/java_pipeline.rb:190:in `register_plugins'", "D:/logstash-7.1.0/logstash-7.1.0/logstash-core/lib/logstash/java_pipeline.rb:446:in `maybe_setup_out_plugins'", "D:/logstash-7.1.0/logstash-7.1.0/logstash-core/lib/logstash/java_pipeline.rb:203:in `start_workers'", "D:/logstash-7.1.0/logstash-7.1.0/logstash-core/lib/logstash/java_pipeline.rb:145:in `run'", "D:/logstash-7.1.0/logstash-7.1.0/logstash-core/lib/logstash/java_pipeline.rb:104:in `block in start'"], :thread=>"#<Thread:0x15997940 run>"}
[2019-10-22T06:07:32,970][ERROR][logstash.agent           ] Failed to execute action {:id=>:main, :action_type=>LogStash::ConvergeResult::FailedAction, :message=>"Could not execute action: PipelineAction::Create<main>, action_result: false", :backtrace=>nil}
I am fairly new to this integration and I am not sure where to start looking. Please help me.

The problem seems to be in your

grok {
  match => { "message" => "%{plugins}" }
}

What is %{plugins}? It is not a predefined grok pattern. A list of the predefined grok patterns can be found in the logstash-patterns-core repository.

Also, the syntax of a grok pattern is %{SYNTAX:SEMANTIC}. You could do something like this:

grok {
  match => { "message" => "%{GREEDYDATA:plugins}" }
}
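For illustration, a more specific pattern for an Elasticsearch log line like `[2019-10-22T06:07:32,915][ERROR][logstash.javapipeline] ...` might look as follows; this is only a sketch, and the field names `timestamp`, `level`, and `msg` are illustrative, not required:

```
filter {
  grok {
    # %{SYNTAX:SEMANTIC}: SYNTAX is a predefined pattern name,
    # SEMANTIC is the event field the matched text is stored in.
    match => { "message" => "\[%{TIMESTAMP_ISO8601:timestamp}\]\[%{LOGLEVEL:level}%{SPACE}\]%{GREEDYDATA:msg}" }
  }
}
```

TIMESTAMP_ISO8601 and LOGLEVEL are predefined patterns, so this would capture the bracketed timestamp and severity into their own fields and leave the rest of the line in msg.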


Try giving "%{plugins}" a data type; you can find the supported data types in the grok filter documentation.

If that does not work, try removing the date filter and run it again.
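The data type the answer refers to is appended after the field name inside the pattern; grok supports int and float conversions. A minimal sketch (the field names duration and rest are illustrative):

```
filter {
  grok {
    # ":int" converts the captured text to an integer
    # instead of indexing it as a string
    match => { "message" => "took %{NUMBER:duration:int} ms %{GREEDYDATA:rest}" }
  }
}
```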


Apparently this kind of error can also occur because of a regexp syntax error somewhere in the config file. That is the catch.


Plugins is actually an existing word in the log file; should I still use it as "%{GREEDYDATA:plugins}"? I tried the above, but I cannot see any index created in Elasticsearch. Logstash shows [2019-10-22T12:06:33,120][INFO][logstash.agent] Successfully started Logstash API endpoint {:port=>9600} and nothing changes after that. I run it in cmd as admin: logstash -f myfile.conf

@SandepKanaBar "%{GREEDYDATA:plugins}" means that your whole message will be captured into the plugins field, which is then indexed into ES. Try running Logstash with the --debug option, i.e. logstash --debug -f myfile.conf. Also, since you have not specified an index name in output { elasticsearch { .. } }, by default the documents are indexed into the logstash-yyyy.MM.dd index, where yyyy.MM.dd reflects today's UTC date. Before running Logstash, execute GET _cat/indices, then run the same command again afterwards to find out whether any index was created.
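The default-index behaviour described in the comment can be overridden by naming the index explicitly in the elasticsearch output. A sketch, assuming the index name filebeat-logs is just an illustrative choice:

```
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    # Without "index", Logstash writes to logstash-%{+yyyy.MM.dd} by default.
    # The %{+yyyy.MM.dd} sprintf reference expands to the event's UTC date.
    index => "filebeat-logs-%{+yyyy.MM.dd}"
  }
}
```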