
Python: Converting a multi-index DataFrame to JSON

Tags: python, pandas, pandas-groupby, multi-index

Consider a DataFrame with a multi-index:
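The original frame is not reproduced here, but judging from the target JSON and the code in the answers below, it appears to have a time index named timestamp and two-level (MultiIndex) columns, one per virtual device. A minimal sketch of such a frame, where all names and values are assumptions for illustration only:

import pandas as pd

# Hypothetical reconstruction: a 'timestamp' index and MultiIndex columns,
# one column per device tag; each row has a value for only one device.
idx = pd.Index(
    ["31/03/2020 02:17:01", "31/03/2020 02:17:12", "31/03/2020 02:17:22",
     "31/03/2020 02:18:37", "31/03/2020 02:18:47", "31/03/2020 02:18:57"],
    name="timestamp",
)
cols = pd.MultiIndex.from_tuples(
    [("virtual_device_135", "tag_5764"), ("virtual_device_136", "tag_5764")]
)
df = pd.DataFrame(
    [[-0.97, None], [-0.97, None], [-0.97, None],
     [None, -0.98], [None, -0.98], [None, -0.98]],
    index=idx, columns=cols,
)
print(df)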

The DataFrame above needs to be converted to JSON like this:

bodyContent": [
        {
          "time": "31/03/2020 02:17:01",
          "tag_5764_virtual_device_135": -0.97
        },
        {
          "time": "31/03/2020 02:17:12",
          "tag_5764_virtual_device_135": -0.97
        },
        {
          "time": "31/03/2020 02:17:22",
          "tag_5764_virtual_device_135": -0.97
        },
        {
          "time": "31/03/2020 02:18:37",
          "tag_5764_virtual_device_136": -0.98
        },
        {
          "time": "31/03/2020 02:18:47",
          "tag_5764_virtual_device_136": -0.98
        },
        {
          "time": "31/03/2020 02:18:57",
          "tag_5764_virtual_device_136": -0.98
        }
]
Currently I am splitting the DataFrame, renaming the columns, merging it back, and then converting it to JSON.
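For context, a rough and purely hypothetical sketch of that split / rename / merge workflow, assuming the frame shape sketched above (none of this code comes from the question):

import json
import pandas as pd

# Hypothetical: split the frame per device, flatten each part's column names,
# stack the parts back together, then serialise to records.
parts = []
for device in df.columns.get_level_values(0).unique():
    part = df[[c for c in df.columns if c[0] == device]].dropna(how="all").copy()
    part.columns = ["_".join(reversed(c)) for c in part.columns]  # e.g. tag_5764_virtual_device_135
    parts.append(part)

merged = pd.concat(parts).reset_index().rename(columns={"timestamp": "time"})
records = json.loads(merged.to_json(orient="records"))

Even then, null entries for the other device still have to be stripped before the records match the target shape, which is what makes this approach feel clumsy.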

Is there a better way to do this with pandas?


Any help is appreciated.

I found it can be done as follows.

If the DataFrame is df:

import json

df.columns = ['.'.join(col) for col in df.columns]    # flatten the MultiIndex columns into single strings
df.reset_index(inplace=True)                          # turn the time index into a regular column
df_list = json.loads(df.to_json(orient='records'))    # one dict per row

bodycontentlist = []                                  # accumulator for the output records
for each in df_list:
    bodycontentlist.append(each)

Hope this is useful to someone.
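For completeness, a hedged sketch of the final wrapping step, which the answer above leaves implicit; the outer "bodyContent" key and the json.dumps call are assumptions:

payload = json.dumps({"bodyContent": bodycontentlist}, indent=2)
print(payload)

Note that to_json(orient='records') keeps every column in every row, so values for the other device come through as null unless they are filtered out; the approach below handles that filtering directly.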

Alternatively, the MultiIndex columns can be flattened in reverse level order and the null entries dropped while the records are built:

import pandas as pd

df.columns = ['.'.join(col[::-1]) for col in df.columns]      # flatten the MultiIndex columns, reversing the level order
df = df.reset_index().rename(columns={'timestamp': 'time'})   # expose the time index as a 'time' column
jsonbody = list({k: {k1: v1 for k1, v1 in v.items() if pd.notnull(v1)}   # drop null entries per row
                 for k, v in df.to_dict(orient='index').items()}.values())

jsonbody then holds exactly the records that go under "bodyContent":

"bodyContent": [
        {
          "time": "31/03/2020 02:17:01",
          "tag_5764_virtual_device_135": -0.97
        },
        {
          "time": "31/03/2020 02:17:12",
          "tag_5764_virtual_device_135": -0.97
        },
        {
          "time": "31/03/2020 02:17:22",
          "tag_5764_virtual_device_135": -0.97
        },
        {
          "time": "31/03/2020 02:18:37",
          "tag_5764_virtual_device_136": -0.98
        },
        {
          "time": "31/03/2020 02:18:47",
          "tag_5764_virtual_device_136": -0.98
        },
        {
          "time": "31/03/2020 02:18:57",
          "tag_5764_virtual_device_136": -0.98
        }
]
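Again, the outer wrapper is not shown in the answer itself; a minimal usage sketch, assuming the records should sit under a "bodyContent" key in the final payload:

import json

payload = json.dumps({"bodyContent": jsonbody}, indent=2)
print(payload)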