Convert a log file to a JSON file using Python


I am new to Python. I am trying to convert a log file to a JSON file using a Python script. I created a main file and a del6 file; together they convert the log file and write a new JSON file. On execution, it shows me the following error:

Traceback (most recent call last):
  File "main.py", line 23, in <module>
    main()
  File "main.py", line 14, in main
    print toJson(sys.argv[2])
  File "/home/paulsteven/BEAT/apache/del6.py", line 46, in toJson
    entries = readfile(file)
  File "/home/paulsteven/BEAT/apache/del6.py", line 21, in readfile
    filecontent[index] = line2dict(line)
  File "/home/paulsteven/BEAT/apache/del6.py", line 39, in line2dict
    res = m.groupdict()
AttributeError: 'NoneType' object has no attribute 'groupdict'
The output should look like this:

{"1": {"timestamp": "February 14 2019, 15:38:47", "monitorip": "172.217.160.132 ", "monitorhost": "www.google.com", "monitorstatus": "up", "monitorid": "tcp-tcp@ www.google.com", "resolveip": "172.217.160.132"}, "2": {"timestamp": "February 14 2019, 15:38:47", "monitorip": "104.28.4.86", "monitorhost": "www.smackcoders.com", "monitorstatus": "up", "monitorid": "tcp-tcp@ www.smackcoders.com", "resolveip": "104.28.4.86"}
Here is the main Python code:

import sys
from del6 import *

def main():
    if len(sys.argv) < 3:
        print "Incorrect Syntax. Usage: python main.py -f <filename>"
        sys.exit(2)
    elif sys.argv[1] != "-f":
        print "Invalid switch '"+sys.argv[1]+"'"
        sys.exit(2)
    elif os.path.isfile(sys.argv[2]) == False:
        print "File does not exist"
        sys.exit(2)
    print toJson(sys.argv[2])
    text_file = open("tcp.json", "a+")
    text_file.write(toJson(sys.argv[2]))
    text_file.write("\n")
    text_file.close()



if __name__ == "__main__":
    main()
Here is my del6 code:

import fileinput
import re
import os
try: import simplejson as json
except ImportError: import json

#read input file and return entries' Dict Object
def readfile(file):
    filecontent = {}
    index = 0
    #check necessary file size checking
    statinfo = os.stat(file)

    #just a guestimate. I believe a single entry contains atleast 150 chars
    if statinfo.st_size < 150:
        print "Not a valid access_log file. It does not have enough data"
    else:
        for line in fileinput.input(file):
            index = index+1
            if line != "\n": #don't read newlines
                filecontent[index] = line2dict(line)

        fileinput.close()
    return filecontent

#gets a line of string from Log and convert it into Dict Object
def line2dict(line):
    #Snippet, thanks to http://www.seehuhn.de/blog/52
    parts = [
    r'(?P<timestamp>\S+)',                  
    r'(?P<monitorip>\S+)',               
    r'(?P<monitorhost>\S+)',                
    r'(?P<monitorstatus>\S+)',              
    r'"(?P<monitorid>\S+)"',              
    r'(?P<resolveip>\S+)',             
]
    pattern = re.compile(r'\s+'.join(parts)+r'\s*\Z')
    m = pattern.match(line)
    res = m.groupdict()
    return res

#to get jSon of entire Log
#returns JSON object
def toJson(file):
    #get dict object for each entry
    entries = readfile(file)
    return json.JSONEncoder().encode(entries)
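
For reference, the AttributeError above means that pattern.match(line) returned None: the regex in line2dict expects whitespace-separated fields with the monitorid wrapped in double quotes, while the actual log lines appear to be double-tab separated, contain spaces inside the timestamp, and have no quotes, so nothing matches. A minimal, self-contained sketch reproducing the cause, using a hypothetical field value:

import re

pattern = re.compile(r'"(?P<monitorid>\S+)"')   # requires the field to be wrapped in quotes
m = pattern.match('tcp-tcp@ www.google.com')    # hypothetical unquoted field: no match
print(m)                                        # prints None
# calling m.groupdict() here would raise:
# AttributeError: 'NoneType' object has no attribute 'groupdict'
if m is not None:
    print(m.groupdict())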

I see that the columns are separated by double tabs. Based on that:

i = 1
result = {}
with open('log.txt') as f:
    lines = f.readlines()
    for line in lines:
        r = line.split('\t\t')
        result[i] = {'timestamp': r[0], 'monitorip': r[1], 'monitorhost': r[2], 'monitorstatus': r[3], 'monitorid': r[4], 'resolveip': r[5]}
        i += 1
Output:

{1: {'timestamp': 'February 14 2019, 15:38:47', 'monitorip': '172.217.160.132', 'monitorhost': 'www.google.com', 'monitorstatus': 'up', 'monitorid': 'tcp-tcp@ www.google.com', 'resolveip': '172.217.160.132\n'}, 2: {'timestamp': 'February 14 2019, 15:38:47', 'monitorip': '104.28.4.86', 'monitorhost': 'www.smackcoders.com', 'monitorstatus': 'up', 'monitorid': 'tcp-tcp@ www.smackcoders.com', 'resolveip': '104.28.4.86'}}
Or, if you want a more natural list of dicts:

result = []
with open('log.txt') as f:
    lines = f.readlines()
    for line in lines:
        r = line.split('\t\t')
        result.append({'timestamp': r[0], 'monitorip': r[1], 'monitorhost': r[2], 'monitorstatus': r[3], 'monitorid': r[4], 'resolveip': r[5]})
Output:

[{'timestamp': 'February 14 2019, 15:38:47', 'monitorip': '172.217.160.132', 'monitorhost': 'www.google.com', 'monitorstatus': 'up', 'monitorid': 'tcp-tcp@ www.google.com', 'resolveip': '172.217.160.132\n'}, {'timestamp': 'February 14 2019, 15:38:47', 'monitorip': '104.28.4.86', 'monitorhost': 'www.smackcoders.com', 'monitorstatus': 'up', 'monitorid': 'tcp-tcp@ www.smackcoders.com', 'resolveip': '104.28.4.86'}]
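
Note that the last field keeps its trailing newline (visible as '\n' in resolveip above). A small variation of the same idea, assuming the same double-tab format and the same hypothetical log.txt, that strips it:

result = []
with open('log.txt') as f:
    for line in f:
        # drop the trailing newline so the last field is clean, then split on double tabs
        r = line.rstrip('\n').split('\t\t')
        result.append({'timestamp': r[0], 'monitorip': r[1], 'monitorhost': r[2],
                       'monitorstatus': r[3], 'monitorid': r[4], 'resolveip': r[5]})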

Thank you for your answer. To save it in a JSON file, do the following:

import json


i = 1
result = {}
with open('tcp.log') as f:
    lines = f.readlines()
    for line in lines:
        r = line.split('\t\t')
        result[i] = {'timestamp': r[0], 'monitorip': r[1], 'monitorhost': r[2], 'monitorstatus': r[3], 'monitorid': r[4], 'resolveip': r[5]}
        i += 1 
print(result) 
with open('data.json', 'w') as fp:
    json.dump(result, fp)
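
To check the result, the file can be read back with json.load; note that json.dump turns the integer keys into strings (this assumes data.json was written by the snippet above):

import json

with open('data.json') as fp:
    data = json.load(fp)
# keys are strings after the JSON round trip, so use '1' rather than 1
print(data['1'])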

Here is a general way to solve this problem. The function log_lines_to_json will handle any text file whose fields are separated by field_delimiter, with the field names given by field_names:

FIELD_NAMES = ['timestamp', 'monitorip', 'monitorhost', 'monitorstatus', 'monitorid', 'resolveip']
FIELD_DELIMITER = '\t\t'


def log_lines_to_json(log_file, field_names, field_delimiter):
    result = []
    with open(log_file) as f:
        lines = f.readlines()
        for line in lines:
            fields = line.split(field_delimiter)
            result.append({field_name: fields[idx] for idx, field_name in enumerate(field_names)})
    return result


entries = log_lines_to_json('log.txt', FIELD_NAMES, FIELD_DELIMITER)
for entry in entries:
    print(entry)
Output:

{'monitorid': 'tcp-tcp@ www.google.com', 'monitorstatus': 'up', 'timestamp': 'February 14 2019, 15:38:47', 'monitorhost': 'www.google.com', 'monitorip': '172.217.160.132', 'resolveip': '172.217.160.132\n'}
{'monitorid': 'tcp-tcp@ www.smackcoders.com', 'monitorstatus': 'up', 'timestamp': 'February 14 2019, 15:38:47', 'monitorhost': 'www.smackcoders.com', 'monitorip': '104.28.4.86', 'resolveip': '104.28.4.86'}
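
If the goal is still a JSON file, the list returned by log_lines_to_json can be written out directly. A minimal sketch using the function and constants defined above, with a hypothetical output file name log.json:

import json

entries = log_lines_to_json('log.txt', FIELD_NAMES, FIELD_DELIMITER)
with open('log.json', 'w') as fp:
    # the list of dicts serializes to a JSON array of objects
    json.dump(entries, fp, indent=2)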


Show an example of your desired output format.