Pandas ValueError: Grouper for 'C' not 1-dimensional

Using this dataframe:
df = pd.DataFrame({"A": ["foo", "foo", "foo", "foo", "foo",
"bar", "bar", "bar", "bar"],
"B": ["one", "one", "one", "two", "two",
"one", "one", "two", "two"],
"C": ["small", "large", "large", "small",
"small", "large", "small", "small",
"large"],
"D": [1, 2, 2, 3, 3, 4, 5, 6, 7],
"E": [2, 4, 5, 5, 6, 6, 8, 9, 9]})
'''
A B C D E
0 foo one small 1 2
1 foo one large 2 4
2 foo one large 2 5
3 foo two small 3 5
4 foo two small 3 6
5 bar one large 4 6
6 bar one small 5 8
7 bar two small 6 9
8 bar two large 7 9
'''
When I run
print(pd.pivot_table(df, values='C', index=['A', 'B'],
columns=['C'], aggfunc='count'))
to count the number of small/large entries per combination of columns A and B (e.g., for (A, B) = (foo, one) there is 1 small and 2 large in column C), it gives me the error
ValueError: Grouper for 'C' not 1-dimensional
What is the problem, and how can I fix it?

It sounds like what you actually want is a groupby:
df.groupby(['A', 'B', 'C']).size()
A B C
bar one large 1
small 1
two large 1
small 1
foo one large 2
small 1
two small 2
dtype: int64
If you want 'C' back as columns, you can unstack:
df.groupby(['A', 'B', 'C']).size().unstack().fillna(0)
C large small
A B
bar one 1.0 1.0
two 1.0 1.0
foo one 2.0 1.0
two 0.0 2.0
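If you prefer to stay with pivot_table, the same count table can be produced by counting any other column instead of C itself — a sketch, here using D purely as the column being counted (its numeric values are irrelevant to the result):

```python
import pandas as pd

df = pd.DataFrame({"A": ["foo", "foo", "foo", "foo", "foo",
                         "bar", "bar", "bar", "bar"],
                   "B": ["one", "one", "one", "two", "two",
                         "one", "one", "two", "two"],
                   "C": ["small", "large", "large", "small",
                         "small", "large", "small", "small",
                         "large"],
                   "D": [1, 2, 2, 3, 3, 4, 5, 6, 7],
                   "E": [2, 4, 5, 5, 6, 6, 8, 9, 9]})

# Count rows per (A, B, C) combination: 'D' is only the column
# being counted, so aggfunc='count' yields occurrence counts.
counts = pd.pivot_table(df, values='D', index=['A', 'B'],
                        columns=['C'], aggfunc='count', fill_value=0)
print(counts)
```

fill_value=0 replaces the NaN that would otherwise appear where a combination (e.g. (foo, two, large)) has no rows.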
You cannot use column C as both values and columns at the same time.
You probably want this instead:
print(pd.pivot_table(df, index=['A', 'B'], columns=['C'], aggfunc='count'))
The result is:
D E
C large small large small
A B
bar one 1.0 1.0 1.0 1.0
two 1.0 1.0 1.0 1.0
foo one 2.0 1.0 2.0 1.0
two NaN 2.0 NaN 2.0
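For pure counting there is also pd.crosstab, sketched here as an alternative: it builds the same contingency table directly and fills missing combinations with 0:

```python
import pandas as pd

df = pd.DataFrame({"A": ["foo", "foo", "foo", "foo", "foo",
                         "bar", "bar", "bar", "bar"],
                   "B": ["one", "one", "one", "two", "two",
                         "one", "one", "two", "two"],
                   "C": ["small", "large", "large", "small",
                         "small", "large", "small", "small",
                         "large"],
                   "D": [1, 2, 2, 3, 3, 4, 5, 6, 7],
                   "E": [2, 4, 5, 5, 6, 6, 8, 9, 9]})

# Cross-tabulate (A, B) pairs against C; each cell is a row count,
# and absent combinations are 0 rather than NaN.
table = pd.crosstab([df['A'], df['B']], df['C'])
print(table)
```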