
Nginx Logstash error: agent - Failed to execute action


I have been following a guide on using the ELK stack with nginx logs. I created nginx.conf to configure how the logs are ingested, but when I run:

bin/logstash -f /etc/logstash/conf.d/nginx.conf

I get this error:

[ERROR] 2020-11-13 14:59:15.254 [Converge PipelineAction::Create] agent - Failed to execute action
{:action=>LogStash::PipelineAction::Create/pipeline_id:main,
 :exception=>"LogStash::ConfigurationError",
 :message=>"Expected one of [A-Za-z0-9_-], [ \t\r\n], \"#\", \"=>\" at line 9, column 8 (byte 135) after input {\n\t\nfile {\npath => [\"/var/log/nginx/access.log\" , \"/var/log/nginx/error.log\"]\ntype => \"nginx\"\n}\nfilter {\n\n grok",
 :backtrace=>["/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:32:in
 "org/logstash/execution/AbstractPipelineExt.java:184:in `initialize'",
 "org/logstash/execution/JavaBasePipelineExt.java:69:in `initialize'",
 "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:47:in `initialize'",
 "/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:52:in `execute'",
 "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:365:in `block in converge_state'"]}

Here is my nginx.conf file:

I found similar answers, but none of them helped. Is anyone familiar with Logstash syntax who could help me find my mistake?


Thanks

You are missing a } to close the input section. Insert it just before the filter keyword.


Also, remove the last } in the file.

Thanks, I thought the filter went inside the input.
input{
    
   file{
   path => ["/var/log/nginx/access.log" , "/var/log/nginx/error.log"]
   type => "nginx"
   }
   filter{
   
   grok{
    match => ["message" , "%{COMBINEDAPACHELOG}+%{GREEDYDATA:extra_fields}"]
    overwrite => ["message"]
   }
   mutate{
    convert => ["response","integer"]
    convert => ["bytes","integer"]
    convert => ["responsetime","float"]
   }
   geoip{
    source => "clientip"
    target => "geoip"
    add_tag => ["nginx-geoip"]
   }
   date {
    match => ["timestamp" , "dd/MMM/YYYY:HH:mm:ss Z"]
    remove_field => ["timestamp"]
   }
   useragent {
   source => "agent"
   } 
   }

output{
 elasticsearch {
  hosts => ["localhost:9200"]
  index => "nginx-%{+yyyy.MM.dd}"
  document_type => "nginx_logs"
 }
}

}
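
For reference, here is a sketch of the config with the answer's fix applied: a } added before filter to close the input block, and the stray } at the end of the file removed, keeping the same plugins and fields as above:

input {
  file {
    path => ["/var/log/nginx/access.log", "/var/log/nginx/error.log"]
    type => "nginx"
  }
}

filter {
  grok {
    match => ["message", "%{COMBINEDAPACHELOG}+%{GREEDYDATA:extra_fields}"]
    overwrite => ["message"]
  }
  mutate {
    convert => ["response", "integer"]
    convert => ["bytes", "integer"]
    convert => ["responsetime", "float"]
  }
  geoip {
    source => "clientip"
    target => "geoip"
    add_tag => ["nginx-geoip"]
  }
  date {
    match => ["timestamp", "dd/MMM/YYYY:HH:mm:ss Z"]
    remove_field => ["timestamp"]
  }
  useragent {
    source => "agent"
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "nginx-%{+yyyy.MM.dd}"
    document_type => "nginx_logs"
  }
}

Note that input, filter, and output must each be a top-level block. Also, document_type was deprecated in Elasticsearch 6.x and removed in 7.x, so on a newer cluster that line may need to be dropped.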