
Scikit-learn: how to change the output data format of an Isolation Forest


I've built an isolation forest to detect anomalies in a CSV file, and I'd like to change the format of the output. Right now the anomalous rows come out as a pandas DataFrame, but I want to write them to a JSON file in the following format:

{seconds: #seconds for that row, size2: size2, pages: #pages for that row}
I've attached my code and a sample of the data below. Thanks very much.

import pandas as pd
from sklearn.ensemble import IsolationForest

df = pd.read_csv('data.csv')  # hypothetical filename for the CSV described above
model = IsolationForest()

# fit_predict fits the forest and labels every row: -1 = anomaly, 1 = normal
df['anomaly'] = model.fit_predict(df[['size2', 'size3', 'size4']])

anomaly = df.loc[df['anomaly'] == -1]  # keep only the rows flagged as anomalies
anomaly_index = list(anomaly.index)
print(anomaly)

The output data currently looks like this:

Unnamed:  seconds  size2  ...  size4  pages  anomaly
1         40       32     ...  654    1      -1
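For reference, a minimal sketch of one way to produce the requested JSON, assuming the DataFrame really has seconds, size2 and pages columns (the output filename is illustrative):

import json

# Keep only the requested fields for each anomalous row;
# to_dict(orient='records') yields one {column: value} dict per row.
records = anomaly[['seconds', 'size2', 'pages']].to_dict(orient='records')
with open('anomalies.json', 'w') as f:  # hypothetical output path
    json.dump(records, f, indent=2)

Each entry then looks like {"seconds": 40, "size2": 32, "pages": 1}, matching the format requested above.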

I've figured out a way to do this: I built several dictionaries, one mapping each row's index to its timestamp and another mapping each index to its label. That way I can follow the indices in the output data and look up all of the information in those dictionaries.
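A minimal sketch of that lookup approach, assuming the original DataFrame has a timestamp column (the name is illustrative) alongside the label column mentioned above:

# Map each row index to its timestamp and label;
# Series.to_dict() returns an {index: value} dictionary.
index_to_timestamp = df['timestamp'].to_dict()  # 'timestamp' is an assumed column name
index_to_label = df['label'].to_dict()

# anomaly_index comes from the detection code above; use it to pull
# the extra fields for every flagged row.
for i in anomaly_index:
    print(i, index_to_timestamp[i], index_to_label[i])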