Faster way to merge multiple dask.DataFrames?

Suppose you have 10 different ddfs that all share the same index. Is there a faster option than the following to merge them all together?

# Pseudo code
import dask.dataframe as dd

ddfs_to_be_merged = []
ddfs_to_be_merged.append(ddf_with_same_index1)
ddfs_to_be_merged.append(ddf_with_same_index2)
ddfs_to_be_merged.append(ddf_with_same_index3)

# Start from some dask DataFrame that shares the same index,
# then merge the others into it one by one.
target = base_ddf
for ddf_to_be_merged in ddfs_to_be_merged:
    target = target.merge(ddf_to_be_merged, how='left',
                          left_index=True, right_index=True)
print(target.head())
My understanding is that this kind of code runs serially (one merge after another), whereas there should be a faster way to merge them all at once, since they share the same index.
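
For reference, one way to do the alignment in a single step is dask.dataframe.concat along axis=1, which joins frames on the index in one pass when they have matching, known divisions. Below is a minimal sketch; the toy frames built with from_pandas, the column names, and base_ddf-style setup are illustrative, not from the original question:

# Minimal sketch: several ddfs sharing the same index, joined in one
# pass with dd.concat instead of a chain of pairwise merges.
# (The toy data and names here are illustrative assumptions.)
import pandas as pd
import dask.dataframe as dd

index = pd.RangeIndex(0, 1000, name="id")
ddfs = [
    dd.from_pandas(
        pd.DataFrame({f"col{i}": range(1000)}, index=index),
        npartitions=4,
    )
    for i in range(10)
]

# With identical known divisions, concat along axis=1 aligns all
# frames on the index at once rather than building 10 sequential
# merge steps into the task graph.
target = dd.concat(ddfs, axis=1)
print(target.head())

Whether this is actually faster depends on the divisions being known and aligned; if they are not, dask may still need to shuffle or repartition the data first.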