
Python: group by and count of other column values


I have a pandas DataFrame:

age   gender   criticality    acknowledged       
 10    Male       High            Yes
 10    Male       High            Yes
 10    Male       High            Yes
 10    Male       Low             Yes
 11    Female     Medium          No
I want to group by age and gender, then take the values of 'criticality' and 'acknowledged' as new columns and get their counts.

For example, as output I would like:

                 criticality          acknowledged
age  gender    High   Medium   Low     Yes    No
 10    Male    3       0       1        4     0
 11    Female  0       1       0        0     1 
I thought of using

df.groupby(['age','gender'])['criticality','acknowledged'].stack()

but it doesn't work.


Is there a better way to get the result in this format?

Since you are counting the two columns separately, concat would be a simple solution:

In [13]: pd.concat([df.pivot_table(index=['age','gender'], columns=col, aggfunc=len)
    ...:            for col in ['criticality', 'acknowledged']], axis=1).fillna(0)
Out[13]:
            acknowledged            criticality     
criticality         High  Low Medium          No  Yes
age gender                                           
10  Male             3.0  1.0    0.0         0.0  4.0
11  Female           0.0  0.0    1.0         1.0  0.0
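
(Side note: to run the snippets in this answer, the sample frame can be rebuilt as below; a minimal sketch of the data from the question.)

import pandas as pd

# Rebuild the sample data from the question
df = pd.DataFrame({
    'age':          [10, 10, 10, 10, 11],
    'gender':       ['Male', 'Male', 'Male', 'Male', 'Female'],
    'criticality':  ['High', 'High', 'High', 'Low', 'Medium'],
    'acknowledged': ['Yes', 'Yes', 'Yes', 'Yes', 'No'],
})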
Another way is to use get_dummies() together with groupby() and sum(), finally splitting the columns with expand=True to get MultiIndex columns:

# One-hot encode both columns, then sum the dummies per (age, gender) group
l = ['criticality', 'acknowledged']
final = df[['age', 'gender']].assign(**pd.get_dummies(df[l])).groupby(['age', 'gender']).sum()
# Split 'criticality_High' etc. on '_' into two-level MultiIndex columns
final.columns = final.columns.str.split('_', expand=True)
print(final)

                     criticality       acknowledged    
                   High Low Medium           No Yes
age gender                                        
10  Male             3   1      0            0   4
11  Female           0   0      1            1   0
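
If the exact column order from the question matters (criticality High/Medium/Low first, then acknowledged Yes/No), the MultiIndex columns can be reordered afterwards. A minimal sketch, assuming `final` from the snippet above:

# Reorder the MultiIndex columns to match the layout asked for in the question;
# assumes `final` from the get_dummies snippet above.
order = pd.MultiIndex.from_tuples([
    ('criticality', 'High'), ('criticality', 'Medium'), ('criticality', 'Low'),
    ('acknowledged', 'Yes'), ('acknowledged', 'No'),
])
print(final.reindex(columns=order))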