
Pandas: how to add 1ns to all duplicated DatetimeIndex values in a DataFrame?


How can I add 1ns to all duplicated DatetimeIndex values in a Pandas DataFrame?

For example, to turn this:

2016-11-13 20:00:10.617989120
2016-11-13 20:00:10.617989120
2016-11-13 20:00:10.617989120
2016-11-13 20:00:10.123945353
2016-11-13 20:00:14.565989314
2016-11-13 20:00:18.565989315
2016-11-13 20:00:18.565989315
2016-11-13 20:00:18.565989315
into this:

2016-11-13 20:00:10.617989120
2016-11-13 20:00:10.617989121
2016-11-13 20:00:10.617989122
2016-11-13 20:00:10.123945353
2016-11-13 20:00:14.565989314
2016-11-13 20:00:18.565989315
2016-11-13 20:00:18.565989316
2016-11-13 20:00:18.565989317
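
For reference, a minimal sketch that reproduces this index; the 'value' column and its numbers are made up for illustration:

import pandas as pd

idx = pd.DatetimeIndex(['2016-11-13 20:00:10.617989120',
                        '2016-11-13 20:00:10.617989120',
                        '2016-11-13 20:00:10.617989120',
                        '2016-11-13 20:00:10.123945353',
                        '2016-11-13 20:00:14.565989314',
                        '2016-11-13 20:00:18.565989315',
                        '2016-11-13 20:00:18.565989315',
                        '2016-11-13 20:00:18.565989315'])
# 'value' is a placeholder column so the index has data to carry
df = pd.DataFrame({'value': range(8)}, index=idx)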
(Comment: "Hmm, I tried to get the same result with cumsum, but of course it didn't work. I didn't know that, thank you!!")
You can use cumcount together with to_timedelta:
# cumcount numbers the rows within each group of identical timestamps: 0, 1, 2, ...
print (df.groupby(level=0).cumcount())
2016-11-13 20:00:10.617989120    0
2016-11-13 20:00:10.617989120    1
2016-11-13 20:00:10.617989120    2
2016-11-13 20:00:10.123945353    0
2016-11-13 20:00:14.565989314    0
2016-11-13 20:00:18.565989315    0
2016-11-13 20:00:18.565989315    1
2016-11-13 20:00:18.565989315    2
dtype: int64
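
This counter is exactly the nanosecond offset each duplicate needs. It also shows why the cumsum attempt from the comment above cannot work: within a group, cumsum accumulates the column values, while cumcount numbers the rows regardless of their values. A quick contrast, assuming the placeholder 'value' column from the sketch above:

print (df.groupby(level=0).cumsum())   # running sum of 'value' per timestamp, not 0, 1, 2, ...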

# Convert the counter to nanosecond timedeltas and shift each duplicate by its within-group position
df.index = df.index + pd.to_timedelta(df.groupby(level=0).cumcount(), unit='ns')
print (df.index)
DatetimeIndex(['2016-11-13 20:00:10.617989120',
               '2016-11-13 20:00:10.617989121',
               '2016-11-13 20:00:10.617989122',
               '2016-11-13 20:00:10.123945353',
               '2016-11-13 20:00:14.565989314',
               '2016-11-13 20:00:18.565989315',
               '2016-11-13 20:00:18.565989316',
               '2016-11-13 20:00:18.565989317'],
              dtype='datetime64[ns]', freq=None)
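
As a quick sanity check (not shown in the original answer), the shifted index is now unique for this data. Note that the rows keep their original order, so call df.sort_index() afterwards if you need a monotonic index:

print (df.index.is_unique)
True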