Python: How to find the sum of the rows above each row in grouped data?

Tags: python, pandas, numpy

I have a df:

    AccountID    PurchaseDate           Price
    113          2018-09-01 22:56:30    13
    114          2018-09-03 22:57:30    23
    113          2018-09-02 22:56:30    19
    114          2018-09-01 22:56:30    20
    114          2018-09-03 22:56:30    25
My AccountID column is already in a groupby(). How can I create a new column TotalPurchase that holds the sum of Price, but only over rows with the same AccountID whose PurchaseDate is before that row's date?

The expected result:

    AccountID    PurchaseDate           Price    TotalPurchase
    113          2018-09-01 22:56:30    13       0
    113          2018-09-02 22:56:30    19       13
    114          2018-09-01 22:56:30    20       0
    114          2018-09-03 22:56:30    25       20
    114          2018-09-03 22:57:30    23       45

Using shift() and cumsum(), try the following:
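
A minimal sketch, assuming a df rebuilt from the question's data: sort each account's purchases chronologically, take the per-account cumulative sum of Price, and shift it down one row so each row only counts the purchases strictly before it. It reproduces the output shown below.

    import pandas as pd

    df = pd.DataFrame({
        'AccountID': [113, 114, 113, 114, 114],
        'PurchaseDate': pd.to_datetime([
            '2018-09-01 22:56:30', '2018-09-03 22:57:30',
            '2018-09-02 22:56:30', '2018-09-01 22:56:30',
            '2018-09-03 22:56:30',
        ]),
        'Price': [13, 23, 19, 20, 25],
    })

    # Sort so purchases within each account run in chronological order.
    df = df.sort_values(['AccountID', 'PurchaseDate'])

    # Running total of Price per account, shifted down one row so each
    # row only sees earlier purchases; the first row of each account
    # has no prior purchases, hence fillna(0).
    df['TotalPurchase'] = (df.groupby('AccountID')['Price']
                             .transform(lambda s: s.cumsum().shift())
                             .fillna(0))

    print(df)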

Output:

  AccountID PurchaseDate        Price   TotalPurchase
0   113     2018-09-01 22:56:30 13     0.0
2   113     2018-09-02 22:56:30 19     13.0
3   114     2018-09-01 22:56:30 20     0.0
4   114     2018-09-03 22:56:30 25     20.0
1   114     2018-09-03 22:57:30 23     45.0
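
Equivalently, since the shifted cumulative sum is just the running total minus the current row's Price, the same column can be computed without shift(), assuming df is already sorted by AccountID and PurchaseDate as above:

    # Sum of all earlier purchases in the same account group.
    df['TotalPurchase'] = df.groupby('AccountID')['Price'].cumsum() - df['Price']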