
Python: get the element of a column just before a flagged value into a new column

Tags: python, pandas, dataframe

I'm trying to collect, from column 'data', the item that sits immediately before each value I collected into column 'min', and create a new column from it. See below.

Here is the data (imported with pd.read_csv):

My code is:

import pandas as pd
import numpy as np
from scipy import signal
from scipy.signal import argrelextrema
import datetime

diff = pd.DataFrame()

df = pd.read_csv('saw_data2.csv')
df['time'] = pd.to_datetime(df['time'])

print(df.head())

n = 2  # number of points to be checked before and after

# Find local minima: argrelextrema returns the indices where 'data' is
# less than or equal to its n neighbours on each side
df['min'] = df.iloc[argrelextrema(df.data.values, np.less_equal, order=n)[0]]['data']
If you plot the data, you'll see it resembles a sawtooth. The element of 'data' immediately preceding each value I got in 'min' is what I want to put in the new column df['new_col'].
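To make the minima-detection step above concrete, here is a minimal sketch on synthetic sawtooth values (made up for illustration, not the real saw_data2.csv), showing what the argrelextrema call flags:

```python
import numpy as np
import pandas as pd
from scipy.signal import argrelextrema

# Synthetic sawtooth: rises, then drops, repeatedly
df = pd.DataFrame({'data': [1.0, 4.0, 7.0, 2.0, 5.0, 8.0, 3.0, 6.0]})

n = 2  # compare each point against n neighbours on each side
# Indices where 'data' is <= its neighbours within the window
idx = argrelextrema(df['data'].values, np.less_equal, order=n)[0]
df['min'] = df.iloc[idx]['data']

print(idx)   # positions of the local minima
print(df)    # 'min' is NaN everywhere except at those positions
```

With np.less_equal the boundary points can also qualify (at the edges the window is clipped), which is why row 0 shows up as a minimum in this toy example.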

I've tried many things, for example:

df['new_col'] = df.index.get_loc(df['min'].df['data'])

and

df['new_col'] = df['min'].shift()  # obviously wrong

IIUC, you can shift first, and then select the rows that hold a minimum:

df['new_col'] = df.shift().loc[df['min'].notna(), 'data']
print (df)
                 time           data            min        new_col
0   12/15/18 01:10 AM  130352.146181  130352.146181            NaN
1   12/16/18 01:45 AM  130355.219097            NaN            NaN
2   12/17/18 01:47 AM  130358.223264            NaN            NaN
3   12/18/18 02:15 AM  130361.281701            NaN            NaN
4   12/19/18 03:15 AM  130364.406597            NaN            NaN
5   12/20/18 03:25 AM  130352.427431  130352.427431  130364.406597
6   12/21/18 03:27 AM  130355.431597            NaN            NaN
7   12/22/18 05:18 AM  130358.663542            NaN            NaN
8   12/23/18 06:44 AM  130361.842431            NaN            NaN
9   12/24/18 07:19 AM  130364.915243            NaN            NaN
10  12/25/18 07:33 AM  130352.944410  130352.944410  130364.915243
11  12/26/18 07:50 AM  130355.979826            NaN            NaN
12  12/27/18 09:13 AM  130359.153472            NaN            NaN
13  12/28/18 11:53 AM  130362.487187            NaN            NaN
14  12/29/18 01:23 PM  130365.673264            NaN            NaN
15  12/30/18 02:17 PM  130353.785764  130353.785764  130365.673264
16  12/31/18 02:23 PM  130356.798264            NaN            NaN
17  01/01/19 04:41 PM  130360.085764            NaN            NaN
18  01/02/19 05:01 PM  130363.128125            NaN            NaN
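The idea can be seen on a tiny, self-contained example (hypothetical numbers, not the real saw_data2.csv): df.shift() moves every 'data' value down one row, and .loc keeps the shifted value only on the rows where a minimum was recorded.

```python
import pandas as pd

df = pd.DataFrame({'data': [10.0, 13.0, 9.0, 12.0, 8.0]})

# Stand-in for the argrelextrema step: mark rows 2 and 4 as minima
df['min'] = df['data'][df['data'] < 9.5]

# shift() pushes 'data' down one row, so row i sees the value from row i-1;
# .loc restricts the assignment to the rows where 'min' is not NaN
df['new_col'] = df.shift().loc[df['min'].notna(), 'data']

print(df)
```

Row 2 gets 13.0 (the value just before 9.0) and row 4 gets 12.0 (the value just before 8.0); all other rows stay NaN. An equivalent spelling is df['data'].shift()[df['min'].notna()].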

I'm not sure I follow what the third column should be. Could you provide another dataset as an example?

Hi Tyler. Every time I collect an element of 'data' into 'min', I want to take the element of 'data' just before it and put it into the new column 'new_col'. I can try to draw something and update the post. I've added an image to help explain.