
Logstash not pushing a JSON-format file


I am new to Elasticsearch, Kibana, and Logstash. I am trying to load a JSON file that looks like this:

{"timestamp":"2014-05-19T00:00:00.430Z","memoryUsage":42.0,"totalMemory":85.74,"usedMemory":78.77,"cpuUsage":26.99,"monitoringType":"jvmHealth"}
{"timestamp":"2014-05-19T00:09:10.431Z","memoryUsage":43.0,"totalMemory":85.74,"usedMemory":78.77,"cpuUsage":26.99,"monitoringType":"jvmHealth"}
{"timestamp":"2014-05-19T00:09:10.441Z","transactionTime":1,"nbAddedObjects":0,"nbRemovedObjects":0,"monitoringType":"transactions"}
{"timestamp":"2014-05-19T00:09:10.513Z","transactionTime":6,"nbAddedObjects":4,"nbRemovedObjects":0,"monitoringType":"transactions"}
No index is created, and I only get the following message:

Using milestone 2 input plugin 'file'. This plugin should be stable, but if you see strange behavior, please let us know! For more information on plugin milestones, see {:level=>:warn}

What is going wrong? I could use the bulk API directly, but I have to use Logstash. Do you have any suggestions or code that could help?

Edit to move the configuration from the comments into the question:

input {
    file {
        path => "/home/ndoye/Elasticsearch/great_log.json"
        type => json
        codec => json
    }
}

filter {
    date {
        match => ["timestamp","yyyy-MM-dd HH:mm:ss.SSS"]
    }
}

output {
    stdout{
        #codec => rubydebug
    }
    elasticsearch {
        embedded => true
    }
} 
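
For reference, a minimal sketch of a configuration matching the sample data above, assuming the issue is the date pattern and the file input's tailing behavior. The sample timestamps (e.g. 2014-05-19T00:00:00.430Z) are ISO8601 rather than "yyyy-MM-dd HH:mm:ss.SSS", and by default the file input only tails a file for new lines. The start_position and sincedb_path values here are test-only assumptions, not a confirmed fix:

input {
    file {
        path => "/home/ndoye/Elasticsearch/great_log.json"
        type => "json"
        codec => "json"
        start_position => "beginning"   # assumption: read the existing lines instead of only tailing new ones
        sincedb_path => "/dev/null"     # assumption: for testing, ignore any previously recorded read position
    }
}

filter {
    date {
        match => ["timestamp", "ISO8601"]   # the sample timestamps are ISO8601
    }
}

output {
    stdout {
        codec => rubydebug   # print each parsed event to verify the pipeline is reading the file
    }
    elasticsearch {
        embedded => true
    }
}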

Sounds like you need to do exactly what the message suggests.
Thanks Robert, but what needs to be done is not clearly stated: "if you see strange behavior, please let us know."
Let us know how you are trying to load the JSON file. Have you enabled automatic index creation in Elasticsearch? How have you configured Logstash? Please provide more information.
Thanks Robert and Dig. This is the configuration I use (now shown in the question above).