Pandas groupby month and convert it to JSON


I have a dataframe object that looks like this:

   index                Date     Poly_1     Poly_2  Poly_2_WLS     Poly_3  
0      0 2017-01-04 08:45:00  70.195597  83.613845   83.613845  99.041125   
1      1 2017-01-04 08:53:00  70.195597  83.613845   83.613845  99.041125   
2      2 2017-01-04 09:00:00  70.195597  83.613845   83.613845  99.041125   
3      3 2017-12-13 08:45:00  70.195597  83.613845   83.613845  99.041125   
4      4 2017-12-13 08:53:00  70.195597  83.613845   83.613845  99.041125  
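For anyone trying to reproduce this, here is a small sketch that rebuilds a comparable frame (the values are copied from the sample above; the construction itself is an assumption):

import pandas as pd

# Rebuild a comparable frame; 'Date' must be a real datetime column for the
# month-based grouping below to work. (The redundant 'index' column is omitted.)
df = pd.DataFrame({
    'Date': pd.to_datetime([
        '2017-01-04 08:45:00', '2017-01-04 08:53:00', '2017-01-04 09:00:00',
        '2017-12-13 08:45:00', '2017-12-13 08:53:00',
    ]),
    'Poly_1': [70.195597] * 5,
    'Poly_2': [83.613845] * 5,
    'Poly_2_WLS': [83.613845] * 5,
    'Poly_3': [99.041125] * 5,
})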
I group the above data by month using the code below:

dfgrp=df.groupby(pd.Grouper(key='Date',freq="M"),as_index=False)
Later, I want to convert the grouped data to JSON using the code below:

dfgrp.to_json(date_format='iso',orient='records')
However, for some reason I get the error below:

AttributeError: Cannot access callable attribute 'to_json' of 'DataFrameGroupBy' objects, try using the 'apply' method

Please let me know how to convert the above dataframe to JSON.
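For context, the error appears because to_json exists on DataFrame but not on the grouped DataFrameGroupBy object, so something has to aggregate the groups back into a DataFrame first. A minimal sketch using the df from above, assuming sum() is the aggregation you want:

dfgrp = df.groupby(pd.Grouper(key='Date', freq="M"))
# Aggregate first (sum is only an assumption about what you want), which gives
# back a plain DataFrame, then call to_json on that result.
monthly = dfgrp[['Poly_1', 'Poly_2', 'Poly_2_WLS', 'Poly_3']].sum()
monthly_json = monthly.to_json(date_format='iso', orient='records')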

Edit:

I used the line of code from the answer below and was able to generate the JSON:

df.groupby([df.Date.dt.month])[['Poly_1','Poly_2','Poly_2_WLS','Poly_3']].sum().reset_index().to_json()
However, the JSON is generated in this format:

`{"Date":{"0":1,"1":2,"2":3,"3":4,"4":5,"5":9,"6":10,"7":11,"8":12},"Poly_1":{"0":46187.2636499188,"1":56636.9594359758,"2":53218.6089763865,"3":41100.9574106447,"4":49317.907305443,"5":2670.6255284702,"6":34887.4415455112,"7":45857.8601621408,"8":21635.3343188418},"Poly_2":{"0":46193.719351124,"1":56193.0159455145,"2":52890.1916931438,"3":41119.1740551722,"4":49648.1531559606,"5":2767.3530477022,"6":34704.8815525262,"7":45918.9353954344,"8":22077.5341367508},"Poly_2_WLS":{"0":46193.719351124,"1":56193.0159455145,"2":52890.1916931438,"3":41119.1740551722,"4":49648.1531559606,"5":2767.3530477022,"6":34704.8815525262,"7":45918.9353954344,"8":22077.5341367508},"Poly_3":{"0":46037.6280724075,"1":56111.2211081627,"2":53059.8469394733,"3":41282.9093221716,"4":49670.016727901,"5":2660.8721082338,"6":34724.1756869611,"7":45721.7694774285,"8":22244.5188905397}}`
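That shape is simply what DataFrame.to_json produces with its default orient='columns': every column becomes an object keyed by the row labels, and after reset_index() those labels are just 0-8 while the month numbers sit in the Date column. A sketch of the same call, for comparison (using the df from the question):

monthly = df.groupby(df.Date.dt.month)[['Poly_1', 'Poly_2', 'Poly_2_WLS', 'Poly_3']].sum().reset_index()
# The default orient='columns' serializes each column keyed by the row labels,
# so this yields {"Date": {"0": 1, ...}, "Poly_1": {"0": ..., ...}, ...}
monthly.to_json()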
However, I would like the JSON in the format below.

Expected JSON format example:

{
    "Poly_1": {
        "Jan": 46187.2636499188,
        "Feb": 56636.9594359758,
        "Mar": 53218.6089763865,
        "Apr": 41100.9574106447,
        "May": 49317.907305443,
        "Jun": 2670.6255284702,
        "July": 34887.4415455112,
        "Aug": 45857.8601621408,
        "Sept": 21635.3343188418
    }
}
Please suggest how I can get the expected format above.
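For what it is worth, one way to get close to that shape (month abbreviations as keys, one object per Poly column) is to sum per month, relabel the integer month index with month names, and then serialize column-wise. A sketch using the df from the question, assuming Python's standard three-letter abbreviations are acceptable ('Jul'/'Sep' rather than 'July'/'Sept'):

import calendar

monthly = df.groupby(df.Date.dt.month)[['Poly_1', 'Poly_2', 'Poly_2_WLS', 'Poly_3']].sum()
# Replace the integer month index (1..12) with 'Jan', 'Feb', ... labels.
monthly.index = monthly.index.map(lambda m: calendar.month_abbr[m])
# The default orient='columns' then yields {"Poly_1": {"Jan": ..., ...}, "Poly_2": {...}, ...}
expected_json = monthly.to_json()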


Thanks

In my answer I assume you want to sum over all the Poly columns (otherwise, just change the code below).

I would group like this, which turns July into (7):
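Presumably the grouping line meant here is the one already quoted in the question's edit, i.e.:

df.groupby([df.Date.dt.month])[['Poly_1', 'Poly_2', 'Poly_2_WLS', 'Poly_3']].sum().reset_index().to_json()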

Note: I added sum and reset_index, and only then to_json.

Your grouped df looks like this:

    Date    Poly_1  Poly_2      Poly_2_WLS  Poly_3
0   7   350.977985  418.069225  418.069225  495.205625
Your json will be (if you do not reset the index):

'{"Poly_1":{"7":350.977985},"Poly_2":{"7":418.069225},"Poly_2_WLS":{"7":418.069225},"Poly_3":{"7":495.205625}}'

Another pointer: if you use Grouper, you will get 2019-07-31 and your json will then look like this (1564531200000):

df.groupby([pd.Grouper(key='Date',freq="M")])['Poly_1','Poly_2','Poly_2_WLS','Poly_3'].sum().to_json()

'{"Poly_1":{"1564531200000":350.977985},"Poly_2":{"1564531200000":418.069225},"Poly_2_WLS":{"1564531200000":418.069225},"Poly_3":{"1564531200000":495.205625}}'

So, adjust to your needs.
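If the epoch-millisecond keys are the only issue with the Grouper route, one option (an addition here, not part of the original answer) is to reformat the index before serializing:

monthly = df.groupby(pd.Grouper(key='Date', freq="M"))[['Poly_1', 'Poly_2', 'Poly_2_WLS', 'Poly_3']].sum()
# Replace the month-end timestamps with 'Jan', 'Feb', ... labels before calling
# to_json, instead of the epoch-millisecond keys shown above.
# Caveat: if the data spans more than one year, bare month names repeat and
# to_json will reject the duplicate index labels.
monthly.index = monthly.index.strftime('%b')
monthly.to_json()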

What do you want your json file to look like? Do you want df.groupby(df.Date.dt.month).sum().to_json(date_format='iso',orient='records')? If not, how do you intend to aggregate the groups? You need to call an aggregation function such as sum or mean after your groupby.

Your answer helps, but I am getting a different JSON format: {"Date":{"0":1,"1":2,...},"Poly_1":{"0":46187.2636499188,...},...}. I have updated the question with sample data and the format, please take a look.

@niran, give me more input: (1) which Python version are you using, (2) did you use reset_index, (3) did you use Grouper?

(1) I am using Python 2.7. (2) Yes, I used reset_index, exactly your code. (3) No, I did not use Grouper, just the groupby mentioned in my edited question.

I am not sure whether 2.7 is a hard requirement for you. If not, I strongly suggest moving to 3.6; besides the huge benefits, 2.7 will not be supported much longer. Either way, my answer is based on 3.6, and that may be why you are getting different values.

I cannot change the Python version :) Even using a loop would be fine, instead of the json method!
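Since the last comment says a plain loop instead of the json method would also be fine, here is a rough sketch of that route, again using the df from the question and the three-letter month abbreviations as an assumption:

import calendar
import json

monthly = df.groupby(df.Date.dt.month)[['Poly_1', 'Poly_2', 'Poly_2_WLS', 'Poly_3']].sum()

# Build the nested {column: {month name: value}} structure by hand and dump it.
result = {}
for col in monthly.columns:
    result[col] = {calendar.month_abbr[m]: float(monthly.at[m, col]) for m in monthly.index}

print(json.dumps(result, indent=4))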