
Convert a text file to JSON using PowerShell


I work with AWS S3 storage, where we have buckets and files are added to those buckets. The bucket activity is logged in text format to another bucket.

I want to convert the log information stored in the text files to JSON. However, the files contain no key/value information.

The content of a log file looks like this:

fd89d80d676948bd913040b667965ef6a50a9c80a12f38c504f497953aedc341 s3Samplebucket [10/Mar/2021:03:27:29 +0000] 171.60.235.108 fd89d80d676948bd913040b667965ef6a50a9c80a12f38c504f497953aedc341 MX1XP335Q5YFS06H REST.HEAD.BUCKET - "HEAD /s3Samplebucket HTTP/1.1" 200 - - - 13 13 "-" "S3Console/0.4, aws-internal/3 aws-sdk-java/1.11.964 Linux/4.9.230-0.1.ac.224.84.332.metal1.x86_64 OpenJDK_64-Bit_Server_VM/25.282-b08 java/1.8.0_282 vendor/Oracle_Corporation" - AMNo4/b/T+5JdEVQpLkqz0SV8VDXyd3odEFmK+5LvanuzgIXW2Lv87OBl5r5tbSZ/yjW5zfFQsA= SigV4 ECDHE-RSA-AES128-GCM-SHA256 AuthHeader s3-us-west-2.amazonaws.com TLSv1.2

The individual values of the log file are as follows:
Log fields

Bucket Owner: fd89d80d676948bd913040b667965ef6a50a9c80a12f38c504f497953aedc341
Bucket: S3SampleBucket
Time: [11/Mar/2021:06:52:33 +0000]
Remote IP: 183.87.60.172
Requester: arn:aws:iam::486031527132:user/jdoe
Request ID: 9YQ1MWABKNRPX3MP
Operation: REST.GET.LOCATION
Key: - (blank)
Request-URI: "GET /?location HTTP/1.1"
HTTP status: 200
Error Code: - (blank)
Bytes Sent: 137
Object Size: - (blank)
Total Time: 17
Turn-Around Time: - (blank)
Referer: "-" (blank)
User-Agent: "AWSPowerShell/4.1.9.0 .NET_Runtime/4.0 .NET_Framework/4.0 OS/Microsoft_Windows_NT_10.0.18363.0 WindowsPowerShell/5.0 ClientSync"
Version Id: - (blank)
Host Id: Q5WBXJNRWSPFMTOG+d2YN0xAtvbq1sdqm9vh6AflXdMCmny5VC3bZmyTBZavKGpO3J/uz+IfK0=
Signature Version: SigV4
Cipher Suite: ECDHE-RSA-AES128-GCM-SHA256
Authentication Type: AuthHeader
Host Header: S3SampleBucket.s3.us-west-2.amazonaws.com
TLS version: TLSv1.2

Adding the values to a config file is all I could think of. I would like to do this in PowerShell or Python.


Any help would be greatly appreciated.
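Since the question allows Python, here is a minimal sketch of one way to do it: a regex tokenizer that keeps bracketed timestamps and quoted strings together as single values, so the tokens line up with the field list above. The field names are taken from that list ("User-Agent" spelled out for the header); the sample line is the one from the question, with the user-agent string abbreviated.

```python
import json
import re

# Field names, in order, from the log-field list above.
FIELDS = [
    "Bucket Owner", "Bucket", "Time", "Remote IP", "Requester", "Request ID",
    "Operation", "Key", "Request-URI", "HTTP status", "Error Code",
    "Bytes Sent", "Object Size", "Total Time", "Turn-Around Time", "Referer",
    "User-Agent", "Version Id", "Host Id", "Signature Version", "Cipher Suite",
    "Authentication Type", "Host Header", "TLS version",
]

# A token is a [...] group, a "..." group, or a run of non-space characters.
TOKEN = re.compile(r'\[[^\]]*\]|"[^"]*"|\S+')

def parse_s3_log_line(line):
    """Split one S3 access-log line into a dict keyed by the field names."""
    tokens = [t.strip('"') for t in TOKEN.findall(line)]
    return dict(zip(FIELDS, tokens))

# Sample line from the question (user-agent string shortened for readability).
sample = (
    'fd89d80d676948bd913040b667965ef6a50a9c80a12f38c504f497953aedc341 '
    's3Samplebucket [10/Mar/2021:03:27:29 +0000] 171.60.235.108 '
    'fd89d80d676948bd913040b667965ef6a50a9c80a12f38c504f497953aedc341 '
    'MX1XP335Q5YFS06H REST.HEAD.BUCKET - "HEAD /s3Samplebucket HTTP/1.1" '
    '200 - - - 13 13 "-" "S3Console/0.4, aws-internal/3 aws-sdk-java/1.11.964" '
    '- AMNo4/b/T+5JdEVQpLkqz0SV8VDXyd3odEFmK+5LvanuzgIXW2Lv87OBl5r5tbSZ/yjW5zfFQsA= '
    'SigV4 ECDHE-RSA-AES128-GCM-SHA256 AuthHeader s3-us-west-2.amazonaws.com TLSv1.2'
)

print(json.dumps(parse_s3_log_line(sample), indent=2))
```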

The log format can be interpreted as CSV (with a space delimiter), so you can parse it with Import-Csv / ConvertFrom-Csv:

$columns = 'Bucket Owner', 'Bucket', 'Time', 'Remote IP', 'Requester', 'Request ID', 'Operation', 'Key', 'Request-URI', 'HTTP status', 'Error Code', 'Bytes Sent', 'Object Size', 'Total Time', 'Turn-Around Time', 'Referer', 'User-Agent', 'Version Id', 'Host Id', 'Signature Version', 'Cipher Suite', 'Authentication Type', 'Host Header', 'TLS version'

$data = @'
fd89d80d676948bd913040b667965ef6a50a9c80a12f38c504f497953aedc341 s3Samplebucket [10/Mar/2021:03:27:29 +0000] 171.60.235.108 fd89d80d676948bd913040b667965ef6a50a9c80a12f38c504f497953aedc341 MX1XP335Q5YFS06H REST.HEAD.BUCKET - "HEAD /s3Samplebucket HTTP/1.1" 200 - - - 13 13 "-" "S3Console/0.4, aws-internal/3 aws-sdk-java/1.11.964 Linux/4.9.230-0.1.ac.224.84.332.metal1.x86_64 OpenJDK_64-Bit_Server_VM/25.282-b08 java/1.8.0_282 vendor/Oracle_Corporation" - AMNo4/b/T+5JdEVQpLkqz0SV8VDXyd3odEFmK+5LvanuzgIXW2Lv87OBl5r5tbSZ/yjW5zfFQsA= SigV4 ECDHE-RSA-AES128-GCM-SHA256 AuthHeader s3-us-west-2.amazonaws.com TLSv1.2
'@

$parsedLog = $data | ConvertFrom-Csv -Delimiter ' ' -Header $columns
# or, reading the log file from disk instead:
$parsedLog = Import-Csv -Path .\path\to\s3requests.log -Delimiter ' ' -Header $columns
Now the resulting objects are easy to convert to JSON:

PS ~> $parsedLog |ConvertTo-Json
{
    "Bucket Owner":  "fd89d80d676948bd913040b667965ef6a50a9c80a12f38c504f497953aedc341",
    "Bucket":  "s3Samplebucket",
    "Time":  "[10/Mar/2021:03:27:29",
    "Remote IP":  "+0000]",
    "Requester":  "171.60.235.108",
    "Request ID":  "fd89d80d676948bd913040b667965ef6a50a9c80a12f38c504f497953aedc341",
    "Operation":  "MX1XP335Q5YFS06H",
    "Key":  "REST.HEAD.BUCKET",
    "Request-URI":  "-",
    "HTTP status":  "HEAD /s3Samplebucket HTTP/1.1",
    "Error Code":  "200",
    "Bytes Sent":  "-",
    "Object Size":  "-",
    "Total Time":  "-",
    "Turn-Around Time":  "13",
    "Referer":  "13",
    "User-Agent":  "-",
    "Version Id":  "S3Console/0.4, aws-internal/3 aws-sdk-java/1.11.964 Linux/4.9.230-0.1.ac.224.84.332.metal1.x86_64 OpenJDK_64-Bit_Server_VM/25.282-b08 java/1.8.0_282 vendor/Oracle_Corporation",
    "Host Id":  "-",
    "Signature Version":  "AMNo4/b/T+5JdEVQpLkqz0SV8VDXyd3odEFmK+5LvanuzgIXW2Lv87OBl5r5tbSZ/yjW5zfFQsA=",
    "Cipher Suite":  "SigV4",
    "Authentication Type":  "ECDHE-RSA-AES128-GCM-SHA256",
    "Host Header":  "AuthHeader",
    "TLS version":  "s3-us-west-2.amazonaws.com"
}
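Note the shifted values in the output above: the parser splits on every space, including the one inside the bracketed timestamp, so every later column moves one place. A quick Python illustration of the same effect:

```python
# Naive space-splitting breaks the bracketed S3 timestamp into two tokens,
# shifting every subsequent column by one.
line = 'bucket-owner s3Samplebucket [10/Mar/2021:03:27:29 +0000] 171.60.235.108'
parts = line.split(' ')
print(parts[2])  # '[10/Mar/2021:03:27:29'  -- the "Time" column is truncated
print(parts[3])  # '+0000]'                 -- and "Remote IP" gets the offset
```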

In your case, to read the file from disk, simply replace the $data = … / $data | ConvertFrom-Csv statements with Import-Csv.

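If Python is the preferred tool, the same space-delimited CSV idea can be sketched with the standard csv module. Like the PowerShell version, it honors quoted strings but not the bracketed timestamp, so the same column shift applies. Column names are the ones from the answer above; the log file path is hypothetical.

```python
import csv
import io
import json

# Same column list as in the PowerShell answer above.
COLUMNS = [
    'Bucket Owner', 'Bucket', 'Time', 'Remote IP', 'Requester', 'Request ID',
    'Operation', 'Key', 'Request-URI', 'HTTP status', 'Error Code',
    'Bytes Sent', 'Object Size', 'Total Time', 'Turn-Around Time', 'Referer',
    'User-Agent', 'Version Id', 'Host Id', 'Signature Version', 'Cipher Suite',
    'Authentication Type', 'Host Header', 'TLS version',
]

def log_to_json(fp):
    """Read space-delimited log records from fp and return a JSON string."""
    rows = [dict(zip(COLUMNS, rec)) for rec in csv.reader(fp, delimiter=' ')]
    return json.dumps(rows, indent=2)

# In real use: with open('s3requests.log') as fp: print(log_to_json(fp))
# Shortened demo record; note how the bracketed time still splits in two:
demo = io.StringIO(
    'owner s3Samplebucket [10/Mar/2021:03:27:29 +0000] 171.60.235.108 '
    '"HEAD /s3Samplebucket HTTP/1.1" 200\n'
)
print(log_to_json(demo))
```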


We can't use the space ' ' delimiter, because there are spaces inside data that essentially belongs to the same set... e.g.
[10/Mar/2021:03:27:29 +0000] 171.60.235.108
should be "Time": "[10/Mar/2021:03:27:29 +0000]",
"Remote IP": "171.60.235.108",
but per your code it is "Time": "[10/Mar/2021:03:27:29", "Remote IP": "+0000]", "Requester": "171.60.235.108"

@NottyHead Oh, I see.
(Get-Content file.log) -replace '\[([^\]]+)\]', '"$1"' | ConvertFrom-Csv …
should do the trick.

Another help request @user:712649: In the example above, how do I convert the time stored as [10/Mar/2021:03:27:29 +0000] to MM-dd-yyyyTHH:mm:ssZ format? The reason I ask is that we need this data in a downstream system, where it is stored as 03-10-2021T03:27:29Z.

@NottyHead There are already a hundred posts on this site answering that question.

@712649 $parsedLog = $data -replace '\[([^\]]+)\]', '"$1"' | ConvertFrom-Csv -Delimiter ' ' -Header $columns; $date_format = "yyyy-MM-ddTHH:mm:ssZ"; $parsedLog | Select([datetime]::ParseExact($_.'Time', "dd/MMM/yyyy:HH:mm:ss +0000", $null).ToString($date_format)) Error: Exception calling "ParseExact" with "3" argument(s): "String was not recognized as a valid DateTime."
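For the timestamp question in the comments: once the Time field has been isolated, Python's standard datetime module can reformat it. This is a minimal sketch, assuming the downstream MM-dd-yyyyTHH:mm:ssZ format mentioned in the comments; the helper name is hypothetical.

```python
from datetime import datetime, timezone

def convert_s3_time(raw):
    """Convert '[10/Mar/2021:03:27:29 +0000]' to '03-10-2021T03:27:29Z'."""
    # Strip the surrounding brackets, then parse day/month-name/year and offset.
    dt = datetime.strptime(raw.strip('[]'), '%d/%b/%Y:%H:%M:%S %z')
    # Normalize to UTC before emitting the downstream format.
    return dt.astimezone(timezone.utc).strftime('%m-%d-%YT%H:%M:%SZ')

print(convert_s3_time('[10/Mar/2021:03:27:29 +0000]'))  # 03-10-2021T03:27:29Z
```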