Logstash output is not recognized as columns in Kibana


I am trying to get my .csv file into Kibana so I can visualize it. I feel like I am close to getting it to work, but I can't figure out how to get my output right.

In Kibana, the .csv file shows up like this:

message: News,test@email.com,10.10.10.10
It looks like my whole CSV row ends up in a single field called message. I want to get 3 separate fields: Name, Email, and IP. I have tried many csv files and different configs, but no success yet.

The CSV file:

Name,Email,IP
Auto,auto@newsuk,10.0.0.196
News,test@email.com,10.10.10.10
nieuwsbrieven,nieuwsbrieven@nl,10.10.10.10
The Logstash config file:

input {
  file {
    path => "C:\Users\JOEY2\Desktop\Deelproblemen\Applicatie\Output\test.csv"
    start_position => beginning
    sincedb_path => "/dev/null"
  }
}

filter {
  csv {
    separator => ","
    columns => ["Date","Open","High"]
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "csv_index"
  }
  stdout {}
}
filebeat.yml:

filebeat.prospectors:

- input_type: log
  paths:
    - C:\Users\JOEY2\Desktop\Deelproblemen\Applicatie\Output\test.csv

output.elasticsearch:
hosts: ["localhost:9200"]
template.name: "testttt"
template.overwrite: true

output.logstash:
hosts: ["localhost:5044"]
Logstash CMD output:

[2017-10-12T13:53:52,682][INFO ][logstash.pipeline        ] Pipeline main started
[2017-10-12T13:53:52,690][INFO ][org.logstash.beats.Server] Starting server on port: 5044
[2017-10-12T13:53:53,003][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
{
    "@timestamp" => 2017-10-12T11:53:53.659Z,
        "offset" => 15,
      "@version" => "1",
    "input_type" => "log",
          "beat" => {
            "name" => "DESKTOP-VEQHHVT",
        "hostname" => "DESKTOP-VEQHHVT",
         "version" => "5.6.2"
    },
          "host" => "DESKTOP-VEQHHVT",
        "source" => "C:\\Users\\JOEY2\\Desktop\\Deelproblemen\\Applicatie\\Output\\test.csv",
       "message" => "Name,Email,IP",
          "type" => "log",
          "tags" => [
        [0] "beats_input_codec_plain_applied"
    ]
}
{
    "@timestamp" => 2017-10-12T11:53:53.659Z,
        "offset" => 44,
      "@version" => "1",
    "input_type" => "log",
          "beat" => {
            "name" => "DESKTOP-VEQHHVT",
        "hostname" => "DESKTOP-VEQHHVT",
         "version" => "5.6.2"
    },
          "host" => "DESKTOP-VEQHHVT",
        "source" => "C:\\Users\\JOEY2\\Desktop\\Deelproblemen\\Applicatie\\Output\\test.csv",
       "message" => "Auto,auto@newsuk,10.0.0.196",
          "type" => "log",
          "tags" => [
        [0] "beats_input_codec_plain_applied"
    ]
}
{
    "@timestamp" => 2017-10-12T11:53:53.659Z,
        "offset" => 77,
      "@version" => "1",
          "beat" => {
            "name" => "DESKTOP-VEQHHVT",
        "hostname" => "DESKTOP-VEQHHVT",
         "version" => "5.6.2"
    },
    "input_type" => "log",
          "host" => "DESKTOP-VEQHHVT",
        "source" => "C:\\Users\\JOEY2\\Desktop\\Deelproblemen\\Applicatie\\Output\\test.csv",
       "message" => "News,test@email.com,10.10.10.10",
          "type" => "log",
          "tags" => [
        [0] "beats_input_codec_plain_applied"
    ]
}
All of my CSV columns/rows end up in the message field.
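
For reference, a csv filter whose column list matches this file's actual header would be expected to split message into separate fields; a minimal sketch in the documented quoted-string form (the answer below uses bare words instead):

filter {
  csv {
    separator => ","
    columns => ["Name","Email","IP"]
  }
}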

curl command output (curl -s localhost:9200/_cat/indices?v):

Elasticsearch terminal output:

[2017-10-12T13:53:11,763][INFO ][o.e.n.Node               ] [] initializing ...
[2017-10-12T13:53:11,919][INFO ][o.e.e.NodeEnvironment    ] [Zs6ZAuy] using [1] data paths, mounts [[(C:)]], net usable_space [1.9tb], net total_space [1.9tb], spins? [unknown], types [NTFS]
[2017-10-12T13:53:11,920][INFO ][o.e.e.NodeEnvironment    ] [Zs6ZAuy] heap size [1.9gb], compressed ordinary object pointers [true]
[2017-10-12T13:53:12,126][INFO ][o.e.n.Node               ] node name [Zs6ZAuy] derived from node ID [Zs6ZAuyyR2auGVnPoD9gRw]; set [node.name] to override
[2017-10-12T13:53:12,128][INFO ][o.e.n.Node               ] version[5.6.2], pid[3384], build[57e20f3/2017-09-23T13:16:45.703Z], OS[Windows 10/10.0/amd64], JVM[Oracle Corporation/Java HotSpot(TM) 64-Bit Server VM/1.8.0_144/25.144-b01]
[2017-10-12T13:53:12,128][INFO ][o.e.n.Node               ] JVM arguments [-Xms2g, -Xmx2g, -XX:+UseConcMarkSweepGC, -XX:CMSInitiatingOccupancyFraction=75, -XX:+UseCMSInitiatingOccupancyOnly, -XX:+AlwaysPreTouch, -Xss1m, -Djava.awt.headless=true, -Dfile.encoding=UTF-8, -Djna.nosys=true, -Djdk.io.permissionsUseCanonicalPath=true, -Dio.netty.noUnsafe=true, -Dio.netty.noKeySetOptimization=true, -Dio.netty.recycler.maxCapacityPerThread=0, -Dlog4j.shutdownHookEnabled=false, -Dlog4j2.disable.jmx=true, -Dlog4j.skipJansi=true, -XX:+HeapDumpOnOutOfMemoryError, -Delasticsearch, -Des.path.home=C:\ELK-Stack\elasticsearch\elasticsearch-5.6.2]
[2017-10-12T13:53:13,550][INFO ][o.e.p.PluginsService     ] [Zs6ZAuy] loaded module [aggs-matrix-stats]
[2017-10-12T13:53:13,616][INFO ][o.e.p.PluginsService     ] [Zs6ZAuy] loaded module [ingest-common]
[2017-10-12T13:53:13,722][INFO ][o.e.p.PluginsService     ] [Zs6ZAuy] loaded module [lang-expression]
[2017-10-12T13:53:13,798][INFO ][o.e.p.PluginsService     ] [Zs6ZAuy] loaded module [lang-groovy]
[2017-10-12T13:53:13,886][INFO ][o.e.p.PluginsService     ] [Zs6ZAuy] loaded module [lang-mustache]
[2017-10-12T13:53:13,988][INFO ][o.e.p.PluginsService     ] [Zs6ZAuy] loaded module [lang-painless]
[2017-10-12T13:53:14,059][INFO ][o.e.p.PluginsService     ] [Zs6ZAuy] loaded module [parent-join]
[2017-10-12T13:53:14,154][INFO ][o.e.p.PluginsService     ] [Zs6ZAuy] loaded module [percolator]
[2017-10-12T13:53:14,223][INFO ][o.e.p.PluginsService     ] [Zs6ZAuy] loaded module [reindex]
[2017-10-12T13:53:14,289][INFO ][o.e.p.PluginsService     ] [Zs6ZAuy] loaded module [transport-netty3]
[2017-10-12T13:53:14,360][INFO ][o.e.p.PluginsService     ] [Zs6ZAuy] loaded module [transport-netty4]
[2017-10-12T13:53:14,448][INFO ][o.e.p.PluginsService     ] [Zs6ZAuy] no plugins loaded
[2017-10-12T13:53:18,328][INFO ][o.e.d.DiscoveryModule    ] [Zs6ZAuy] using discovery type [zen]
[2017-10-12T13:53:19,204][INFO ][o.e.n.Node               ] initialized
[2017-10-12T13:53:19,204][INFO ][o.e.n.Node               ] [Zs6ZAuy] starting ...
[2017-10-12T13:53:20,071][INFO ][o.e.t.TransportService   ] [Zs6ZAuy] publish_address {127.0.0.1:9300}, bound_addresses {127.0.0.1:9300}, {[::1]:9300}
[2017-10-12T13:53:23,130][INFO ][o.e.c.s.ClusterService   ] [Zs6ZAuy] new_master {Zs6ZAuy}{Zs6ZAuyyR2auGVnPoD9gRw}{jBwTE7rUS4i_Ugh6k6DAMg}{127.0.0.1}{127.0.0.1:9300}, reason: zen-disco-elected-as-master ([0] nodes joined)
[2017-10-12T13:53:23,883][INFO ][o.e.g.GatewayService     ] [Zs6ZAuy] recovered [5] indices into cluster_state
[2017-10-12T13:53:25,962][INFO ][o.e.c.r.a.AllocationService] [Zs6ZAuy] Cluster health status changed from [RED] to [YELLOW] (reason: [shards started [[.kibana][0]] ...]).
[2017-10-12T13:53:25,981][INFO ][o.e.h.n.Netty4HttpServerTransport] [Zs6ZAuy] publish_address {127.0.0.1:9200}, bound_addresses {127.0.0.1:9200}, {[::1]:9200}
[2017-10-12T13:53:25,986][INFO ][o.e.n.Node               ] [Zs6ZAuy] started
[2017-10-12T13:53:59,245][INFO ][o.e.c.m.MetaDataCreateIndexService] [Zs6ZAuy] [filebeat-2017.10.12] creating index, cause [auto(bulk api)], templates [filebeat, testttt], shards [5]/[1], mappings [_default_]
[2017-10-12T13:53:59,721][INFO ][o.e.c.m.MetaDataMappingService] [Zs6ZAuy] [filebeat-2017.10.12/ux6-ByOERj-2XEBojkxhXg] create_mapping [doc]
Filebeat output:

C:\ELK-Stack\filebeat>filebeat -e -c filebeat.yml -d "publish"
2017/10/12 11:53:53.632142 beat.go:297: INFO Home path: [C:\ELK-Stack\filebeat] Config path: [C:\ELK-Stack\filebeat] Data path: [C:\ELK-Stack\filebeat\data] Logs path: [C:\ELK-Stack\filebeat\logs]
2017/10/12 11:53:53.632142 beat.go:192: INFO Setup Beat: filebeat; Version: 5.6.2
2017/10/12 11:53:53.634143 publish.go:228: WARN Support for loading more than one output is deprecated and will not be supported in version 6.0.
2017/10/12 11:53:53.635144 output.go:258: INFO Loading template enabled. Reading template file: C:\ELK-Stack\filebeat\filebeat.template.json
2017/10/12 11:53:53.636144 output.go:269: INFO Loading template enabled for Elasticsearch 2.x. Reading template file: C:\ELK-Stack\filebeat\filebeat.template-es2x.json
2017/10/12 11:53:53.637143 output.go:281: INFO Loading template enabled for Elasticsearch 6.x. Reading template file: C:\ELK-Stack\filebeat\filebeat.template-es6x.json
2017/10/12 11:53:53.638144 client.go:128: INFO Elasticsearch url: http://localhost:9200
2017/10/12 11:53:53.639143 outputs.go:108: INFO Activated elasticsearch as output plugin.
2017/10/12 11:53:53.639143 logstash.go:90: INFO Max Retries set to: 3
2017/10/12 11:53:53.640143 outputs.go:108: INFO Activated logstash as output plugin.
2017/10/12 11:53:53.640143 publish.go:243: DBG  Create output worker
2017/10/12 11:53:53.641143 publish.go:243: DBG  Create output worker
2017/10/12 11:53:53.641143 publish.go:285: DBG  No output is defined to store the topology. The server fields might not be filled.
2017/10/12 11:53:53.642144 publish.go:300: INFO Publisher name: DESKTOP-VEQHHVT
2017/10/12 11:53:53.634143 metrics.go:23: INFO Metrics logging every 30s
2017/10/12 11:53:53.646143 async.go:63: INFO Flush Interval set to: 1s
2017/10/12 11:53:53.647142 async.go:64: INFO Max Bulk Size set to: 50
2017/10/12 11:53:53.647142 async.go:72: DBG  create bulk processing worker (interval=1s, bulk size=50)
2017/10/12 11:53:53.648144 async.go:63: INFO Flush Interval set to: 1s
2017/10/12 11:53:53.648144 async.go:64: INFO Max Bulk Size set to: 2048
2017/10/12 11:53:53.649144 async.go:72: DBG  create bulk processing worker (interval=1s, bulk size=2048)
2017/10/12 11:53:53.649144 beat.go:233: INFO filebeat start running.
2017/10/12 11:53:53.650144 registrar.go:68: INFO No registry file found under: C:\ELK-Stack\filebeat\data\registry. Creating a new registry file.
2017/10/12 11:53:53.652144 registrar.go:106: INFO Loading registrar data from C:\ELK-Stack\filebeat\data\registry
2017/10/12 11:53:53.654145 registrar.go:123: INFO States Loaded from registrar: 0
2017/10/12 11:53:53.655145 crawler.go:38: INFO Loading Prospectors: 1
2017/10/12 11:53:53.655145 prospector_log.go:65: INFO Prospector with previous states loaded: 0
2017/10/12 11:53:53.656144 prospector.go:124: INFO Starting prospector of type: log; id: 11034545279404679229
2017/10/12 11:53:53.656144 crawler.go:58: INFO Loading and starting Prospectors completed. Enabled prospectors: 1
2017/10/12 11:53:53.655145 spooler.go:63: INFO Starting spooler: spool_size: 2048; idle_timeout: 5s
2017/10/12 11:53:53.655145 registrar.go:236: INFO Starting Registrar
2017/10/12 11:53:53.657144 log.go:91: INFO Harvester started for file: C:\Users\JOEY2\Desktop\Deelproblemen\Applicatie\Output\test.csv
2017/10/12 11:53:53.655145 sync.go:41: INFO Start sending events to output
2017/10/12 11:53:58.682432 client.go:214: DBG  Publish: {
  "@timestamp": "2017-10-12T11:53:53.659Z",
  "beat": {
    "hostname": "DESKTOP-VEQHHVT",
    "name": "DESKTOP-VEQHHVT",
    "version": "5.6.2"
  },
  "input_type": "log",
  "message": "Name,Email,IP",
  "offset": 15,
  "source": "C:\\Users\\JOEY2\\Desktop\\Deelproblemen\\Applicatie\\Output\\test.csv",
  "type": "log"
}
2017/10/12 11:53:58.685434 client.go:214: DBG  Publish: {
  "@timestamp": "2017-10-12T11:53:53.659Z",
  "beat": {
    "hostname": "DESKTOP-VEQHHVT",
    "name": "DESKTOP-VEQHHVT",
    "version": "5.6.2"
  },
  "input_type": "log",
  "message": "Auto,auto@newsuk,10.0.0.196",
  "offset": 44,
  "source": "C:\\Users\\JOEY2\\Desktop\\Deelproblemen\\Applicatie\\Output\\test.csv",
  "type": "log"
}
2017/10/12 11:53:58.685434 client.go:214: DBG  Publish: {
  "@timestamp": "2017-10-12T11:53:53.659Z",
  "beat": {
    "hostname": "DESKTOP-VEQHHVT",
    "name": "DESKTOP-VEQHHVT",
    "version": "5.6.2"
  },
  "input_type": "log",
  "message": "News,test@email.com,10.10.10.10",
  "offset": 77,
  "source": "C:\\Users\\JOEY2\\Desktop\\Deelproblemen\\Applicatie\\Output\\test.csv",
  "type": "log"
}
2017/10/12 11:53:58.686434 output.go:109: DBG  output worker: publish 3 events
2017/10/12 11:53:58.686434 output.go:109: DBG  output worker: publish 3 events
2017/10/12 11:53:58.738437 client.go:667: INFO Connected to Elasticsearch version 5.6.2
2017/10/12 11:53:58.748436 output.go:317: INFO Trying to load template for client: http://localhost:9200
2017/10/12 11:53:58.890446 output.go:324: INFO Existing template will be overwritten, as overwrite is enabled.
2017/10/12 11:53:59.154461 client.go:592: INFO Elasticsearch template with name 'testttt' loaded
2017/10/12 11:54:00.020510 sync.go:70: DBG  Events sent: 4
Kibana output:

@timestamp:October 12th 2017, 13:53:53.659 beat.hostname:DESKTOP-VEQHHVT beat.name:DESKTOP-VEQHHVT beat.version:5.6.2 input_type:log message:Auto,auto@newsuk,10.0.0.196 offset:44 source:C:\Users\JOEY2\Desktop\Deelproblemen\Applicatie\Output\test.csv type:log _id:AV8QbyIcTtSiVplm9CwA _type:doc _index:filebeat-2017.10.12 _score:1
@timestamp:October 12th 2017, 13:53:53.659 beat.hostname:DESKTOP-VEQHHVT beat.name:DESKTOP-VEQHHVT beat.version:5.6.2 input_type:log message:News,test@email.com,10.10.10.10 offset:77 source:C:\Users\JOEY2\Desktop\Deelproblemen\Applicatie\Output\test.csv type:log _id:AV8QbyIcTtSiVplm9CwB _type:doc _index:filebeat-2017.10.12 _score:1
@timestamp:October 12th 2017, 13:53:53.659 beat.hostname:DESKTOP-VEQHHVT beat.name:DESKTOP-VEQHHVT beat.version:5.6.2 input_type:log message:Name,Email,IP offset:15 source:C:\Users\JOEY2\Desktop\Deelproblemen\Applicatie\Output\test.csv type:log _id:AV8QbyIcTtSiVplm9Cv_ _type:doc _index:filebeat-2017.10.12 _score:1

You have provided the wrong column names in the csv filter, and the column names should be without double quotes ("). I have tried this and it works for me; check whether it works for you too. My Logstash config file:

input {
    file {
        path => "/home/quality/Desktop/work/csv.csv"
        start_position => "beginning"
        sincedb_path => "/dev/null"
    }

}

filter {
    csv {
        separator => ","
        columns => [Name,Email,IP]
    }
}

output {
    elasticsearch {
        hosts => "localhost"
        index => "csv"
        document_type => "csv"
    }

    stdout { codec => rubydebug}
}
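
To check whether the fields were actually split, you can pull a parsed document back out of Elasticsearch; a quick query, assuming the index name csv from the config above:

curl -s 'localhost:9200/csv/_search?pretty&size=3'

Each hit should then carry Name, Email, and IP as top-level fields next to message.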

If importing the csv into ES is a one-off task, you don't need Filebeat at all.
Thanks for your reply. Sorry, that was a leftover from another example. When I run your config the problem is still there: I still see the wrong output in Kibana. One entry in Discover suggests it did not recognize my columns; the field message still contains all the data. I want all 3 values in separate fields so I can use them for visualizations.
Thanks for your time.
Sorry @Hatim, I was in a meeting. Elastic version: "5.6.2", Logstash: 5.6.2.
Sorry, but what do you mean by the ES document?
ES: the Elasticsearch document. Get the document from the index after running Logstash.
Sorry, I feel stupid, but do you mean the CSV file that is being processed? I am only testing with the 3 (CSV) entries shown in the question for now.