
Logstash not reading file input when run via docker-compose.yaml (Elasticsearch, Docker Compose, Logstash, Kibana)


The logstash.conf file:

input {
  file {
    type => "java"
    path => "/elk/spring-boot-elk.log"
    start_position => "beginning"
  }
}


filter {
  # If a log line contains a tab character followed by 'at', tag that entry as a stacktrace
  if [message] =~ "\tat" {
    grok {
      match => ["message", "^(\tat)"]
      add_tag => ["stacktrace"]
    }
  }
}

output { 
  stdout {
    codec => rubydebug
  }

  # Sending properly parsed log events to elasticsearch
  elasticsearch {
    hosts => ["elasticsearch:9200"]
  }
}
The docker-compose.yaml file:

version: "3"
services:
  elasticsearch:
    image: elasticsearch:7.5.2
    ports: 
    - "9200:9200"
    - "9300:9300"
    environment:
    - discovery.type=single-node
  kibana:
    image: kibana:7.5.2
    ports:
    - "5601:5601"
    links:
    - elasticsearch
    depends_on:
    - elasticsearch
  logstash:
    image: logstash:7.5.2
    links:
    - elasticsearch
    volumes:
    - ./:/config-dir
    command: logstash -f /config-dir/logstash.conf
    depends_on:
    - elasticsearch
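Note that the compose file above only mounts `./` to `/config-dir` in the Logstash container, so the path `/elk/spring-boot-elk.log` referenced by the file input may not exist inside that container at all. A sketch of the logstash service with the log file also mounted, assuming (hypothetically) that the log lives at `./spring-boot-elk.log` on the host:

```
  logstash:
    image: logstash:7.5.2
    volumes:
    - ./:/config-dir
    # Mount the log file so the file input can see it at the path
    # configured in logstash.conf (host location is an assumption)
    - ./spring-boot-elk.log:/elk/spring-boot-elk.log
    command: logstash -f /config-dir/logstash.conf
    depends_on:
    - elasticsearch
```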
All the containers are running, but no data shows up in Kibana.

I think the problem is with Logstash:
[2020-04-26T16:37:44,502][WARN ][logstash.outputs.elasticsearch] You are using a deprecated config setting "document_type" set in elasticsearch. Deprecated settings will continue to work, but are scheduled for removal from logstash in the future. Document types are being deprecated in Elasticsearch 6.0, and removed entirely in 7.0. You should avoid this feature If you have any questions about this, please visit the #logstash channel on freenode irc. {:name=>"document_type", :plugin=><LogStash::Outputs::ElasticSearch bulk_path=>"/_monitoring/bulk?system_id=logstash&system_api_version=7&interval=1s", hosts=>[http://elasticsearch:9200], sniffing=>false, manage_template=>false, id=>"7d7dfa0f023f65240aeb31ebb353da5a42dc782979a2bd7e26e28b7cbd509bb3", document_type=>"%{[@metadata][document_type]}", enable_metric=>true, codec=><LogStash::Codecs::Plain id=>"plain_1a08e50c-ae97-4f38-a5b7-7aa70df94f4a", enable_metric=>true, charset=>"UTF-8">, workers=>1, template_name=>"logstash", template_overwrite=>false, doc_as_upsert=>false, script_type=>"inline", script_lang=>"painless", script_var_name=>"event", scripted_upsert=>false, retry_initial_interval=>2, retry_max_interval=>64, retry_on_conflict=>1, ilm_enabled=>"auto", ilm_rollover_alias=>"logstash", ilm_pattern=>"{now/d}-000001", ilm_policy=>"logstash-policy", action=>"index", ssl_certificate_verification=>true, sniffing_delay=>5, timeout=>60, pool_max=>1000, pool_max_per_route=>100, resurrect_delay=>5, validate_after_inactivity=>10000, http_compression=>false>}
[2020-04-26T16:37:44,544][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://elasticsearch:9200/]}}
[2020-04-26T16:37:44,550][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://elasticsearch:9200/"}
[2020-04-26T16:37:44,555][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>7}
[2020-04-26T16:37:44,555][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
[2020-04-26T16:37:44,586][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["http://elasticsearch:9200"]}
[2020-04-26T16:37:44,597][INFO ][logstash.javapipeline    ] Starting pipeline {:pipeline_id=>".monitoring-logstash", "pipeline.workers"=>1, "pipeline.batch.size"=>2, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>2, "pipeline.sources"=>["monitoring pipeline"], :thread=>"#<Thread:0x61a1055 run>"}
[2020-04-26T16:37:44,636][INFO ][logstash.javapipeline    ] Pipeline started {"pipeline.id"=>".monitoring-logstash"}
[2020-04-26T16:37:44,654][INFO ][logstash.agent           ] Pipelines running {:count=>2, :running_pipelines=>[:main, :".monitoring-logstash"], :non_running_pipelines=>[]}
[2020-04-26T16:37:44,899][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}

In its default mode, "tail", the Logstash file input expects the file to be continuously written to. So you may want to test your configuration while appending new lines to the file. Alternatively, switch to "read" mode; in that mode you must also set a few other required settings. See the file input plugin documentation for reference.
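Switching the file input to "read" mode, as suggested above, requires a completion action for finished files; a minimal sketch (the sincedb and completed-log paths here are illustrative, not required values):

```
input {
  file {
    type => "java"
    path => "/elk/spring-boot-elk.log"
    mode => "read"
    # "read" mode requires deciding what happens after the file is consumed;
    # "log" records it instead of deleting it
    file_completed_action => "log"
    file_completed_log_path => "/tmp/completed.log"
    # /dev/null makes Logstash re-read the file on every restart (testing only)
    sincedb_path => "/dev/null"
    start_position => "beginning"
  }
}
```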