Python: how to convert a DataFrame result into a user-defined JSON format


I want to convert the result of a DataFrame (df) into the user-defined JSON format I describe below. My code is:

import pandas
import numpy as np

data_df = pandas.read_csv('details.csv')
data_df = data_df.replace('Null', np.nan)   # treat the string 'Null' as missing
df = data_df.groupby(['country', 'branch']).count()
df = df.drop('sales', axis=1)
df = df.reset_index()
print(df)
After printing df I get a result of the following form, and I want to convert it to JSON:

country     branch      no_of_employee     total_salary    count_DOB   count_email
  x            a            30                 2500000        20            25
  x            b            20                 350000         15            20
  y            c            30                 4500000        30            30
  z            d            40                 5500000        40            40
  z            e            10                 1000000        10            10
  z            f            15                 1500000        15            15
Please note that I do not want every field from the DataFrame in the JSON (for example, count_DOB should be left out).
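As a minimal sketch of that filtering step (the column list below is illustrative and assumes the df built by the code above), you can keep only the wanted columns before converting:

# keep only the columns that should end up in the JSON; count_DOB is dropped
wanted_cols = ['country', 'branch', 'no_of_employee', 'total_salary', 'count_email']
slim_df = df[wanted_cols]
# equivalent, in the same style as the code above:
# slim_df = df.drop('count_DOB', axis=1)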

You can use groupby with apply and to_dict(orient='index'), and finally to_json; the code and its output are shown at the end of this post.


I tried printing g.to_dict(), but the JSON is not valid (please check).
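That is expected: printing a Python dict shows its repr (single quotes), which JSON validators reject. A minimal sketch of getting valid JSON instead, assuming g is the grouped Series produced by the code further down in the answer:

import json

# let pandas serialize the nested structure
print(g.to_json())

# or use the stdlib; numpy scalars are not JSON serializable by default,
# so convert them on the fly
print(json.dumps(g.to_dict(), default=int))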

When I run this code I get TypeError: to_dict() got an unexpected keyword argument 'orient'.

Does it fail with this sample as well as with your real data? Please use this df -

df = pd.DataFrame({'count_email': {0: 25, 1: 20, 2: 30, 3: 40, 4: 10, 5: 15},
                   'country': {0: 'x', 1: 'x', 2: 'y', 3: 'z', 4: 'z', 5: 'z'},
                   'count_DOB': {0: 20, 1: 15, 2: 30, 3: 40, 4: 10, 5: 15},
                   'branch': {0: 'a', 1: 'b', 2: 'c', 3: 'd', 4: 'e', 5: 'f'},
                   'total_salary': {0: 2500000, 1: 350000, 2: 4500000, 3: 5500000, 4: 1000000, 5: 1500000},
                   'no_of_employee': {0: 30, 1: 20, 2: 30, 3: 40, 4: 10, 5: 15}})

Still the same error? If so, which version of pandas are you on - check with print pd.__version__.

Now everything is fine. What I did was uninstall pandas and install another version, and now it all works. Thank you very much. My current pandas version is 0.17.1. I tried JSON Lint and the JSON is valid.
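If upgrading pandas were not an option, a version-independent fallback (not from the original thread, shown here only as a sketch) would be to build the same nested dict by hand from the sample df above:

import json

# build {country: {branch: {column: value, ...}}} with plain Python
nested = {}
for _, row in df.iterrows():
    nested.setdefault(row['country'], {})[row['branch']] = {
        'no_of_employee': int(row['no_of_employee']),
        'total_salary': int(row['total_salary']),
        'count_email': int(row['count_email']),
    }
print(json.dumps(nested))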
For reference, this is the user-defined JSON format asked for above:

x
  {
    a
      {
        no.of employees: 30
        total salary: 2500000
        count_email: 25
      }
    b
      {
        no.of employees: 20
        total salary: 350000
        count_email: 20
      }
  }
y
  {
    c
      {
        no.of employees: 30
        total salary: 4500000
        count_email: 30
      }
  }
z
  {
    d
      {
        no.of employees: 40
        total salary: 5500000
        count_email: 40
      }
    e
      {
        no.of employees: 10
        total salary: 1000000
        count_email: 10
      }
    f
      {
        no.of employees: 15
        total salary: 1500000
        count_email: 15
      }
  }
Given this DataFrame:

  country branch  no_of_employee  total_salary  count_DOB  count_email
0       x      a              30       2500000         20           25
1       x      b              20        350000         15           20
2       y      c              30       4500000         30           30
3       z      d              40       5500000         40           40
4       z      e              10       1000000         10           10
5       z      f              15       1500000         15           15

# build one dict per country: {branch: {column: value}}, then serialize
g = (df.groupby('country')[["branch", "no_of_employee", "total_salary", "count_email"]]
       .apply(lambda x: x.set_index('branch').to_dict(orient='index')))
print(g.to_json())
{
    "x": {
        "a": {
            "total_salary": 2500000,
            "no_of_employee": 30,
            "count_email": 25
        },
        "b": {
            "total_salary": 350000,
            "no_of_employee": 20,
            "count_email": 20
        }
    },
    "y": {
        "c": {
            "total_salary": 4500000,
            "no_of_employee": 30,
            "count_email": 30
        }
    },
    "z": {
        "e": {
            "total_salary": 1000000,
            "no_of_employee": 10,
            "count_email": 10
        },
        "d": {
            "total_salary": 5500000,
            "no_of_employee": 40,
            "count_email": 40
        },
        "f": {
            "total_salary": 1500000,
            "no_of_employee": 15,
            "count_email": 15
        }
    }
}
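If the JSON needs to be indented or written to a file, a short follow-up sketch (not part of the original answer; the file name is only illustrative):

import json

parsed = json.loads(g.to_json())              # back to a plain Python dict
print(json.dumps(parsed, indent=4))           # pretty-printed, as shown above
with open('employees_by_country.json', 'w') as fh:
    json.dump(parsed, fh, indent=4)           # hypothetical output file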