Pyspark: add a column with the groupby average

I have a dataframe like this:

test = spark.createDataFrame(
    [
        (1, 0, 100), 
        (2, 0, 200),
        (3, 1, 150), 
        (4, 1, 250),
    ],
    ['id', 'flag', 'col1'] 
)
I want to create another column containing the average from a groupby on flag. Aggregating gives me the averages, but it collapses the result to one row per group:

from pyspark.sql import functions as f

test.groupBy(f.col('flag')).agg(f.avg(f.col('col1'))).show()

+----+---------+
|flag|avg(col1)|
+----+---------+
|   0|    150.0|
|   1|    200.0|
+----+---------+
Desired result:

+---+----+----+---+
| id|flag|col1|avg|
+---+----+----+---+
|  1|   0| 100|150|
|  2|   0| 200|150|
|  3|   1| 150|200|
|  4|   1| 250|200|
+---+----+----+---+

You can use window functions:

from pyspark.sql.window import Window
from pyspark.sql import functions as F

w = Window.partitionBy('flag')
test.withColumn("avg", F.avg("col1").over(w)).show()

+---+----+----+-----+                                                           
| id|flag|col1|  avg|
+---+----+----+-----+
|  1|   0| 100|150.0|
|  2|   0| 200|150.0|
|  3|   1| 150|200.0|
|  4|   1| 250|200.0|
+---+----+----+-----+
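Note that avg returns a double (150.0), while the desired result shows integers. If the types need to match exactly, you can cast the result; a minimal sketch, assuming long is the type you want:

test.withColumn("avg", F.avg("col1").over(w).cast("long")).show()

This yields 150 and 200 instead of 150.0 and 200.0, matching the desired result above.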

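If you would rather avoid a window function, an equivalent approach is to compute the aggregate separately and join it back on flag. A sketch using the same test dataframe (the avg_df name is just for illustration):

# one row per flag with its average
avg_df = test.groupBy('flag').agg(F.avg('col1').alias('avg'))
# attach the per-group average to every original row
test.join(avg_df, on='flag', how='left').show()

Joining on the column name deduplicates flag in the output, though the column order may differ from the window version.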