
Elasticsearch: how to view logs in Kibana


I am new to ELK. I tried out the ELK stack with Spring Boot, using net.logstash.logback.appender.LogstashTcpSocketAppender to send JSON messages to Logstash. Below is my configuration:

logback-spring.xml

<configuration>
    <include resource="org/springframework/boot/logging/logback/defaults.xml" />

    <springProperty scope="context" name="springAppName" source="spring.application.name" />

    <property name="LOG_FILE" value="./${springAppName}" />


    <property name="CONSOLE_LOG_PATTERN"
        value="%clr(%d{yyyy-MM-dd HH:mm:ss.SSS}){faint} %clr(${LOG_LEVEL_PATTERN:-%5p}) %clr(${PID:- }){magenta} %clr(---){faint} %clr([%15.15t]){faint} %clr(%-40.40logger{39}){cyan} %clr(:){faint} %m%n${LOG_EXCEPTION_CONVERSION_WORD:-%wEx}" />


    <appender name="logstash2"
        class="net.logstash.logback.appender.LogstashTcpSocketAppender">
        <destination>localhost:5000</destination>
        <encoder
            class="net.logstash.logback.encoder.LoggingEventCompositeJsonEncoder">
            <providers>
                <timestamp>
                    <timeZone>UTC</timeZone>
                </timestamp>
                <pattern>
                    <pattern>
                        {
                        "severity": "%level",
                        "service": "${springAppName:-}",
                        "trace": "%X{X-B3-TraceId:-}",
                        "span": "%X{X-B3-SpanId:-}",
                        "parent": "%X{X-B3-ParentSpanId:-}",
                        "exportable":
                        "%X{X-Span-Export:-}",
                        "pid": "${PID:-}",
                        "thread": "%thread",
                        "class": "%logger{40}",
                        "rest": "%message"
                        }
                    </pattern>
                </pattern>
            </providers>
        </encoder>
        <keepAliveDuration>5 minutes</keepAliveDuration>
    </appender>
    <root level="INFO">
        <appender-ref ref="logstash2" />
    </root>
</configuration>
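
With this encoder, each log event is shipped over TCP as one JSON object per line. Given the providers above, an event should look roughly like the following (a sketch with made-up values, not output captured from this setup):

{"@timestamp":"2019-03-02T10:15:30.123Z","severity":"INFO","service":"my-app","trace":"","span":"","parent":"","exportable":"","pid":"12345","thread":"main","class":"c.e.d.MyController","rest":"Handling request"}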
But when I open Kibana to view the messages, the entire log event shows up as one undifferentiated message field rather than as separate fields (screenshot omitted).

Can anyone help me get the individual JSON fields (severity, service, trace, and so on) parsed out into their own fields?
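
For context: the Logstash pipeline itself is not shown in the question, but the destination localhost:5000 above implies a tcp input listening on port 5000. When that input has no JSON codec, the whole JSON payload lands in a single message field, which matches the symptom described. A minimal input sketch (an assumption about the setup, not taken from the question; json_lines is the codec the logstash-logback-encoder documentation recommends for LogstashTcpSocketAppender):

input {
    tcp {
        # assumed port, matching the appender's <destination>localhost:5000</destination>
        port => 5000
        # parse each newline-delimited JSON event into separate fields
        codec => json_lines
    }
}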


Your filter block should look like this:

filter {
       # pattern matching logback pattern
       grok {
              match => { "message" => "%{TIMESTAMP_ISO8601:timestamp}\s+%{LOGLEVEL:severity}\s+\[%{DATA:service},%{DATA:trace},%{DATA:span},%{DATA:exportable}\]\s+%{DATA:pid}\s+---\s+\[%{DATA:thread}\]\s+%{DATA:class}\s+:\s+%{GREEDYDATA:rest}" }
       }
       # for events that arrive as raw JSON strings, parse them into fields
       json {
              source => "message"
       }
}
output {
    elasticsearch {
         hosts => ["localhost:9200"]
    }
}
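
For reference, that grok pattern targets the classic Spring Boot / Sleuth console layout, so it applies when events arrive as plain text rather than JSON. A line like the following (made-up values) would match and be split into the timestamp, severity, service, trace, span, exportable, pid, thread, class and rest fields:

2019-03-02 10:15:30.123 INFO [my-app,9f5e8d3c,3c4b2a1d,false] 12345 --- [main] c.e.demo.MyController : Handling GET /hello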
I don't understand why the output block above doesn't name an index; if you have more than one index you will run into problems. Add something like this:

filter {
       # pattern matching logback pattern
       grok {
              match => { "message" => "%{TIMESTAMP_ISO8601:timestamp}\s+%{LOGLEVEL:severity}\s+\[%{DATA:service},%{DATA:trace},%{DATA:span},%{DATA:exportable}\]\s+%{DATA:pid}\s+---\s+\[%{DATA:thread}\]\s+%{DATA:class}\s+:\s+%{GREEDYDATA:rest}" }
       }
       # for events that arrive as raw JSON strings, parse them into fields
       json {
              source => "message"
       }
}
output {
    elasticsearch { 
         hosts => ["localhost:9200"] 
         index => "YOUR_INDEX_NAME-%{+YYYY.MM.dd}"
    }
}
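
Once events are flowing into that index, create a matching index pattern in Kibana (Management → Index Patterns in classic Kibana; called a data view in newer versions) such as YOUR_INDEX_NAME-*, and the parsed fields like severity, service and trace become searchable columns and filters in Discover.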