Python: How to apply groupby and transpose in PySpark?


I have a dataframe that looks like the one below:

import pandas as pd

df = pd.DataFrame({
    'subject_id': [1,1,1,1,2,2,2,2,3,3,4,4,4,4,4],
    'readings': ['READ_1','READ_2','READ_1','READ_3','READ_1','READ_5','READ_6','READ_8','READ_10','READ_12','READ_11','READ_14','READ_09','READ_08','READ_07'],
    'val': [5,6,7,11,5,7,16,12,13,56,32,13,45,43,46],
})
That is what my input dataframe looks like.

Although the code below works fine in Python pandas (thanks to jezrael), it runs for a very long time when I apply it to my real data (more than 4 million records). So I am trying pyspark. Note that I have already tried Dask, modin and pandarallel, the pandas equivalents for large-scale processing, but they did not help either. What the code below does is generate summary statistics per subject for each reading. You can look at the expected output below to get the idea:

df_op = (df.groupby(['subject_id','readings'])['val']
        .describe()                                   # count, mean, std, min, 25%, 50%, 75%, max per group
        .unstack()
        .swaplevel(0,1,axis=1)                        # put the reading name first in the column MultiIndex
        .reindex(df['readings'].unique(), axis=1, level=0))
df_op.columns = df_op.columns.map('_'.join)           # flatten to names like READ_1_count
df_op = df_op.reset_index()
Can you help me do the same in pyspark? When I tried the approach below, it threw an error:

df.groupby(['subject_id','readings'])['val'] 
For example, subject_id = 1 has 4 readings but only 3 unique readings, so we get 3 * 8 = 24 columns for subject_id = 1. Why 8? Because describe produces MIN, MAX, COUNT, STD, MEAN, 25%, 50% and 75%. Hope this helps.

When I ran this in pyspark, it returned the following error:

TypeError: 'GroupedData' object is not subscriptable
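(For context: the error comes from PySpark's groupBy returning a GroupedData object, which does not support pandas-style ['val'] indexing; per-column aggregations have to go through agg instead. A minimal sketch of the working pattern, assuming df has already been converted to a Spark DataFrame:)

import pyspark.sql.functions as F

# GroupedData has no __getitem__, so df.groupby(...)['val'] fails;
# express the aggregation on the column through .agg() instead:
df.groupby('subject_id', 'readings').agg(F.mean('val').alias('mean_val')).show()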

I expect my output to look like the below.

df = pd.DataFrame({
    'subject_id': [1,1,1,1,2,2,2,2,3,3,4,4,4,4,4],
    'readings': ['READ_1','READ_2','READ_1','READ_3','READ_1','READ_5','READ_6','READ_8','READ_10','READ_12','READ_11','READ_14','READ_09','READ_08','READ_07'],
    'val': [5,6,7,11,5,7,16,12,13,56,32,13,45,43,46],
})
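The aggregation code below operates on a Spark DataFrame, so the pandas frame has to be converted first. A minimal sketch, assuming an active SparkSession named spark:

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# turn the pandas frame built above into a Spark DataFrame
df = spark.createDataFrame(df)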

You need to first group by and compute the statistics for each reading, and then pivot to get the expected result:

import pyspark.sql.functions as F

agg_df = df.groupby("subject_id", "readings").agg(F.mean(F.col("val")), F.min(F.col("val")), F.max(F.col("val")),
                                                    F.count(F.col("val")),
                                                    F.expr('percentile_approx(val, 0.25)').alias("quantile_25"),
                                                    F.expr('percentile_approx(val, 0.75)').alias("quantile_75"))
This gives you the following output:

+----------+--------+--------+--------+--------+----------+-----------+-----------+
|subject_id|readings|avg(val)|min(val)|max(val)|count(val)|quantile_25|quantile_75|
+----------+--------+--------+--------+--------+----------+-----------+-----------+
|         2|  READ_1|     5.0|       5|       5|         1|          5|          5|
|         2|  READ_5|     7.0|       7|       7|         1|          7|          7|
|         2|  READ_8|    12.0|      12|      12|         1|         12|         12|
|         4| READ_08|    43.0|      43|      43|         1|         43|         43|
|         1|  READ_2|     6.0|       6|       6|         1|          6|          6|
|         1|  READ_1|     6.0|       5|       7|         2|          5|          7|
|         2|  READ_6|    16.0|      16|      16|         1|         16|         16|
|         1|  READ_3|    11.0|      11|      11|         1|         11|         11|
|         4| READ_11|    32.0|      32|      32|         1|         32|         32|
|         3| READ_10|    13.0|      13|      13|         1|         13|         13|
|         3| READ_12|    56.0|      56|      56|         1|         56|         56|
|         4| READ_14|    13.0|      13|      13|         1|         13|         13|
|         4| READ_07|    46.0|      46|      46|         1|         46|         46|
|         4| READ_09|    45.0|      45|      45|         1|         45|         45|
+----------+--------+--------+--------+--------+----------+-----------+-----------+
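The question asked for eight statistics (MIN, MAX, COUNT, STD, MEAN, 25%, 50%, 75%), while the aggregation above covers six of them. The two missing ones can be added in the same style; a sketch under that assumption (the aliases are my own):

agg_df = df.groupby('subject_id', 'readings').agg(
    F.mean('val').alias('mean'),
    F.min('val').alias('min'),
    F.max('val').alias('max'),
    F.count('val').alias('count'),
    F.stddev('val').alias('std'),                                 # sample standard deviation
    F.expr('percentile_approx(val, 0.25)').alias('quantile_25'),
    F.expr('percentile_approx(val, 0.5)').alias('quantile_50'),   # approximate median
    F.expr('percentile_approx(val, 0.75)').alias('quantile_75'))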
If you group by subject_id and pivot on readings, you get the expected output:

agg_df2 = df.groupby("subject_id").pivot("readings").agg(F.mean(F.col("val")), F.min(F.col("val")), F.max(F.col("val")),
                                                         F.count(F.col("val")),
                                                         F.expr('percentile_approx(val, 0.25)').alias("quantile_25"),
                                                         F.expr('percentile_approx(val, 0.75)').alias("quantile_75"))

for i in agg_df2.columns:
    agg_df2 = agg_df2.withColumnRenamed(i, i.replace("(val)", ""))
agg_df2.show()

+----------+----------------+----------------+----------------+------------------+-------------------+-------------------+----------------+----------------+----------------+------------------+-------------------+-------------------+----------------+----------------+----------------+------------------+-------------------+-------------------+---------------+---------------+---------------+-----------------+------------------+------------------+----------------+----------------+----------------+------------------+-------------------+-------------------+----------------+----------------+----------------+------------------+-------------------+-------------------+----------------+----------------+----------------+------------------+-------------------+-------------------+----------------+----------------+----------------+------------------+-------------------+-------------------+---------------+---------------+---------------+-----------------+------------------+------------------+---------------+---------------+---------------+-----------------+------------------+------------------+---------------+---------------+---------------+-----------------+------------------+------------------+---------------+---------------+---------------+-----------------+------------------+------------------+---------------+---------------+---------------+-----------------+------------------+------------------+
|subject_id|READ_07_avg(val)|READ_07_min(val)|READ_07_max(val)|READ_07_count(val)|READ_07_quantile_25|READ_07_quantile_75|READ_08_avg(val)|READ_08_min(val)|READ_08_max(val)|READ_08_count(val)|READ_08_quantile_25|READ_08_quantile_75|READ_09_avg(val)|READ_09_min(val)|READ_09_max(val)|READ_09_count(val)|READ_09_quantile_25|READ_09_quantile_75|READ_1_avg(val)|READ_1_min(val)|READ_1_max(val)|READ_1_count(val)|READ_1_quantile_25|READ_1_quantile_75|READ_10_avg(val)|READ_10_min(val)|READ_10_max(val)|READ_10_count(val)|READ_10_quantile_25|READ_10_quantile_75|READ_11_avg(val)|READ_11_min(val)|READ_11_max(val)|READ_11_count(val)|READ_11_quantile_25|READ_11_quantile_75|READ_12_avg(val)|READ_12_min(val)|READ_12_max(val)|READ_12_count(val)|READ_12_quantile_25|READ_12_quantile_75|READ_14_avg(val)|READ_14_min(val)|READ_14_max(val)|READ_14_count(val)|READ_14_quantile_25|READ_14_quantile_75|READ_2_avg(val)|READ_2_min(val)|READ_2_max(val)|READ_2_count(val)|READ_2_quantile_25|READ_2_quantile_75|READ_3_avg(val)|READ_3_min(val)|READ_3_max(val)|READ_3_count(val)|READ_3_quantile_25|READ_3_quantile_75|READ_5_avg(val)|READ_5_min(val)|READ_5_max(val)|READ_5_count(val)|READ_5_quantile_25|READ_5_quantile_75|READ_6_avg(val)|READ_6_min(val)|READ_6_max(val)|READ_6_count(val)|READ_6_quantile_25|READ_6_quantile_75|READ_8_avg(val)|READ_8_min(val)|READ_8_max(val)|READ_8_count(val)|READ_8_quantile_25|READ_8_quantile_75|
+----------+----------------+----------------+----------------+------------------+-------------------+-------------------+----------------+----------------+----------------+------------------+-------------------+-------------------+----------------+----------------+----------------+------------------+-------------------+-------------------+---------------+---------------+---------------+-----------------+------------------+------------------+----------------+----------------+----------------+------------------+-------------------+-------------------+----------------+----------------+----------------+------------------+-------------------+-------------------+----------------+----------------+----------------+------------------+-------------------+-------------------+----------------+----------------+----------------+------------------+-------------------+-------------------+---------------+---------------+---------------+-----------------+------------------+------------------+---------------+---------------+---------------+-----------------+------------------+------------------+---------------+---------------+---------------+-----------------+------------------+------------------+---------------+---------------+---------------+-----------------+------------------+------------------+---------------+---------------+---------------+-----------------+------------------+------------------+
|         1|            null|            null|            null|              null|               null|               null|            null|            null|            null|              null|               null|               null|            null|            null|            null|              null|               null|               null|            6.0|              5|              7|                2|                 5|                 7|            null|            null|            null|              null|               null|               null|            null|            null|            null|              null|               null|               null|            null|            null|            null|              null|               null|               null|            null|            null|            null|              null|               null|               null|            6.0|              6|              6|                1|                 6|                 6|           11.0|             11|             11|                1|                11|                11|           null|           null|           null|             null|              null|              null|           null|           null|           null|             null|              null|              null|           null|           null|           null|             null|              null|              null|
|         3|            null|            null|            null|              null|               null|               null|            null|            null|            null|              null|               null|               null|            null|            null|            null|              null|               null|               null|           null|           null|           null|             null|              null|              null|            13.0|              13|              13|                 1|                 13|                 13|            null|            null|            null|              null|               null|               null|            56.0|              56|              56|                 1|                 56|                 56|            null|            null|            null|              null|               null|               null|           null|           null|           null|             null|              null|              null|           null|           null|           null|             null|              null|              null|           null|           null|           null|             null|              null|              null|           null|           null|           null|             null|              null|              null|           null|           null|           null|             null|              null|              null|
|         2|            null|            null|            null|              null|               null|               null|            null|            null|            null|              null|               null|               null|            null|            null|            null|              null|               null|               null|            5.0|              5|              5|                1|                 5|                 5|            null|            null|            null|              null|               null|               null|            null|            null|            null|              null|               null|               null|            null|            null|            null|              null|               null|               null|            null|            null|            null|              null|               null|               null|           null|           null|           null|             null|              null|              null|           null|           null|           null|             null|              null|              null|            7.0|              7|              7|                1|                 7|                 7|           16.0|             16|             16|                1|                16|                16|           12.0|             12|             12|                1|                12|                12|
|         4|            46.0|              46|              46|                 1|                 46|                 46|            43.0|              43|              43|                 1|                 43|                 43|            45.0|              45|              45|                 1|                 45|                 45|           null|           null|           null|             null|              null|              null|            null|            null|            null|              null|               null|               null|            32.0|              32|              32|                 1|                 32|                 32|            null|            null|            null|              null|               null|               null|            13.0|              13|              13|                 1|                 13|                 13|           null|           null|           null|             null|              null|              null|           null|           null|           null|             null|              null|              null|           null|           null|           null|             null|              null|              null|           null|           null|           null|             null|              null|              null|           null|           null|           null|             null|              null|              null|
+----------+----------------+----------------+----------------+------------------+-------------------+-------------------+----------------+----------------+----------------+------------------+-------------------+-------------------+----------------+----------------+----------------+------------------+-------------------+-------------------+---------------+---------------+---------------+-----------------+------------------+------------------+----------------+----------------+----------------+------------------+-------------------+-------------------+----------------+----------------+----------------+------------------+-------------------+-------------------+----------------+----------------+----------------+------------------+-------------------+-------------------+----------------+----------------+----------------+------------------+-------------------+-------------------+---------------+---------------+---------------+-----------------+------------------+------------------+---------------+---------------+---------------+-----------------+------------------+------------------+---------------+---------------+---------------+-----------------+------------------+------------------+---------------+---------------+---------------+-----------------+------------------+------------------+---------------+---------------+---------------+-----------------+------------------+------------------+
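A side note beyond the original answer: renaming columns one at a time with withColumnRenamed builds up a long chain of projections. The same cleanup can be done in one pass; a sketch, assuming the agg_df2 from above:

# rename every column in a single pass instead of looping withColumnRenamed
agg_df2 = agg_df2.toDF(*[c.replace('(val)', '') for c in agg_df2.columns])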



What exactly are you trying to do? The code is not a very good reference, since it is not working.
@pissall - No, this is working code. Could you try again? What I am trying to do is get the summary statistics for each subject for each reading.
@pissall - Updated the post with an example for subject_id = 1. Let me know if you have any more doubts.
What kind of summary statistics do you want? You will have to write the aggregation code using groupBy. I will solve this for you.
Just use the second step, bro. The first step was to demonstrate how to get the statistics.
Just noticed the dataframe name; I was actually trying to view the output using agg_df2.show(). Since my real data has null values, I could not verify the show output. Is it possible to view this in a nice tabular form, like a dataframe? I tried result_pdf = agg_df2.select("*").toPandas(), but it ran for a long time and finally resulted in an error. I meant show. Could it be because of the large data? How can I view the output in the tabular form shown in the expected-output section?
Yes, toPandas() on large datasets is very expensive, since it brings all of your data onto the driver. Check your driver memory for that same error.
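For inspecting a result this wide without collecting everything, a couple of options (a sketch, assuming the agg_df2 from above; not from the original thread):

# print a few records vertically so very wide rows stay readable
agg_df2.show(n=5, truncate=False, vertical=True)

# or convert only a small sample to pandas instead of the full dataset
preview = agg_df2.limit(10).toPandas()
print(preview)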