
Java Logstash TCP input with JSON codec treats each line as a separate event


I am trying to read log4j v2.3 JSON output over a Logstash TCP socket using the Logstash JSON codec, but Logstash indexes each line as a separate event instead of treating each JSON object as one event.

log4j configuration

<Appenders>
    <Console name="console" target="SYSTEM_OUT">
        <PatternLayout pattern="%d %p [%c] - &lt;%m&gt;%n"/>
    </Console>
    ... removed for brevity ...
    <Socket name="logstash" host="localhost" port="4560">
      <JSONLayout />
    </Socket>
</Appenders>
<Loggers>
    <Logger name="org.jasig" level="info" additivity="false">
        <AppenderRef ref="console"/>
        <AppenderRef ref="file"/>
        <AppenderRef ref="logstash"/>
    </Logger>
    ... removed for brevity ...
    <Root level="error">
        <AppenderRef ref="console"/>
        <AppenderRef ref="logstash"/>
    </Root>
</Loggers>
Logstash configuration

input {
  tcp {
      port => 4560
      codec => json
  }
}
output {
  elasticsearch {}
  stdout {}
}
Logstash output

Each line is parsed as a separate JSON event instead of the whole JSON object being treated as one event:

2016-03-22T01:24:27.213Z 127.0.0.1 {
2016-03-22T01:24:27.215Z 127.0.0.1   "timeMillis" : 1458609867060,
2016-03-22T01:24:27.216Z 127.0.0.1   "thread" : "localhost-startStop-1",
2016-03-22T01:24:27.217Z 127.0.0.1   "level" : "INFO",
2016-03-22T01:24:27.218Z 127.0.0.1   "loggerName" : "com.hazelcast.instance.DefaultAddressPicker",
2016-03-22T01:24:27.219Z 127.0.0.1   "message" : "[LOCAL] [dev] [3.5] Resolving domain name 'wozniak.local' to address(es): [192.168.0.16, fe80:0:0:0:6203:8ff:fe89:6d3a%4]\n",
2016-03-22T01:24:27.220Z 127.0.0.1   "endOfBatch" : false,
2016-03-22T01:24:27.221Z 127.0.0.1   "loggerFqcn" : "org.apache.logging.slf4j.Log4jLogger"
2016-03-22T01:24:27.222Z 127.0.0.1 }
2016-03-22T01:24:32.281Z 127.0.0.1 {
2016-03-22T01:24:32.283Z 127.0.0.1   "timeMillis" : 1458609872279,
2016-03-22T01:24:32.286Z 127.0.0.1   "thread" : "localhost-startStop-1",
2016-03-22T01:24:32.287Z 127.0.0.1   "level" : "WARN",
2016-03-22T01:24:32.289Z 127.0.0.1   "loggerName" : "com.hazelcast.instance.DefaultAddressPicker",
2016-03-22T01:24:32.294Z 127.0.0.1   "message" : "[LOCAL] [dev] [3.5] Cannot resolve hostname: 'Jons-MacBook-Pro-2.local'\n",
2016-03-22T01:24:32.299Z 127.0.0.1   "endOfBatch" : false,
2016-03-22T01:24:32.302Z 127.0.0.1   "loggerFqcn" : "org.apache.logging.slf4j.Log4jLogger"
2016-03-22T01:24:32.307Z 127.0.0.1 }

Thanks in advance for any help.

OK, I got it working. It's not the way I wanted to solve it, but it works.

Instead of the json codec, I used the multiline codec on the input together with a json filter.

Logstash configuration

input {
  tcp {
      port => 4560
      codec => multiline {
        pattern => "^\{$"
        negate => true
        what => previous
      }  
  }
}

filter {
  json { source => message }
}

output {
  elasticsearch {}
  stdout {}
}
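One caveat with this approach (an editorial note, not part of the original answer): the multiline codec buffers the current event until a new line arrives that starts the next event, so the most recent log entry can sit unflushed indefinitely. If your logstash-codec-multiline version supports it, the auto_flush_interval option forces the buffered event out after a timeout; a sketch, assuming that option is available:

```
input {
  tcp {
      port => 4560
      codec => multiline {
        pattern => "^\{$"
        negate => true
        what => previous
        # flush a buffered event after 5 seconds of silence
        auto_flush_interval => 5
      }
  }
}
```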
Here is the output, indexed correctly:

2016-03-22T09:42:26.880Z 127.0.0.1 0 expired tickets found to be removed.
2016-03-22T09:43:26.992Z 127.0.0.1 Finished ticket cleanup.
2016-03-22T09:43:47.120Z 127.0.0.1 Setting path for cookies to: /cas/ 
2016-03-22T09:43:47.122Z 127.0.0.1 AcceptUsersAuthenticationHandler successfully authenticated hashbrowns+password
2016-03-22T09:43:47.131Z 127.0.0.1 Authenticated hashbrowns with credentials [hashbrowns+password].
2016-03-22T09:43:47.186Z 127.0.0.1 Audit trail record BEGIN
=============================================================
WHO: hashbrowns+password
WHAT: supplied credentials: [hashbrowns+password]
ACTION: AUTHENTICATION_SUCCESS
APPLICATION: CAS
WHEN: Tue Mar 22 05:43:47 EDT 2016
CLIENT IP ADDRESS: 0:0:0:0:0:0:0:1
SERVER IP ADDRESS: 0:0:0:0:0:0:0:1
=============================================================
This seems a bit brittle, since it relies on how log4j formats the JSON, so I would still love to know how to make the json codec handle multi-line JSON output.
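One commonly suggested alternative (a sketch, not something tested in this question): instead of teaching Logstash to reassemble pretty-printed JSON, have log4j emit one JSON object per line. JSONLayout has a compact attribute, and newer log4j2 releases also offer eventEol, so that compact="true" eventEol="true" produces newline-delimited JSON. That output pairs naturally with Logstash's json_lines codec, which expects exactly one JSON document per line:

```
<Socket name="logstash" host="localhost" port="4560">
  <!-- one JSON object per line; eventEol requires a newer log4j2 than the 2.3 used above -->
  <JSONLayout compact="true" eventEol="true" />
</Socket>
```

```
input {
  tcp {
      port => 4560
      codec => json_lines
  }
}
```

This avoids the multiline pattern matching entirely, at the cost of requiring a log4j2 version whose JSONLayout supports eventEol.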