Python: how do I create a dictionary from two DataFrame columns in PySpark?
I have a DataFrame with two columns, as shown below:
df = spark.createDataFrame([('A', 'Science'),
('A', 'Math'),
('A', 'Physics'),
('B', 'Science'),
('B', 'English'),
('C', 'Math'),
('C', 'English'),
('C', 'Latin')],
['Group', 'Subjects'])
Group Subjects
A Science
A Math
A Physics
B Science
B English
C Math
C English
C Latin
I need to iterate over this data for each unique value in the Group column and do some processing. I'm thinking of creating a dictionary with each group's name as the key and its corresponding list of subjects as the value.
So my expected output looks like this:
{'A': ['Science', 'Math', 'Physics'], 'B': ['Science', 'English'], 'C': ['Math', 'English', 'Latin']}
How can I achieve this in PySpark?
You can do a
groupBy
and use collect_list:
#Input DF
# +-----+-------+
# |group|subject|
# +-----+-------+
# | A| Math|
# | A|Physics|
# | B|Science|
# +-----+-------+
import pyspark.sql.functions as F

df1 = df.groupBy("group").agg(F.collect_list("subject").alias("subject")).orderBy("group")
df1.show(truncate=False)
# +-----+---------------+
# |group|subject |
# +-----+---------------+
# |A |[Math, Physics]|
# |B |[Science] |
# +-----+---------------+
# Use a name other than `dict` so the built-in isn't shadowed
subjects_dict = {row['group']: row['subject'] for row in df1.collect()}
print(subjects_dict)
# {'A': ['Math', 'Physics'], 'B': ['Science']}
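Applied to the full example data from the question, the same pattern gives exactly the expected dictionary. Below is a minimal driver-side sketch; since the interesting part happens after `collect()`, the collected rows are simulated here with plain dicts (a real `pyspark.sql.Row` also supports `row['col']` indexing), so the logic can be checked without a running SparkSession:

```python
# Simulated output of
#   df.groupBy('Group').agg(F.collect_list('Subjects').alias('Subjects')).collect()
# Each element stands in for a pyspark.sql.Row.
collected = [
    {'Group': 'A', 'Subjects': ['Science', 'Math', 'Physics']},
    {'Group': 'B', 'Subjects': ['Science', 'English']},
    {'Group': 'C', 'Subjects': ['Math', 'English', 'Latin']},
]

# Build {group: [subjects]} on the driver
subjects_by_group = {row['Group']: row['Subjects'] for row in collected}
print(subjects_by_group)
# {'A': ['Science', 'Math', 'Physics'], 'B': ['Science', 'English'], 'C': ['Math', 'English', 'Latin']}

# Then iterate per group, as the question describes:
for group, subjects in subjects_by_group.items():
    pass  # ...per-group processing here...
```

Note that `collect()` pulls all groups to the driver, so this is only appropriate when the number of groups is small.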
If you need unique subjects, use collect_set; otherwise use collect_list:
import pyspark.sql.functions as F
df = spark.createDataFrame([('A', 'Science'),
('A', 'Math'),
('A', 'Physics'),
('B', 'Science'),
('B', 'English'),
('C', 'Math'),
('C', 'English'),
('C', 'Latin')],
['Group', 'Subjects'])
df_tst = (df.groupby('Group')
            .agg(F.collect_set("Subjects").alias('Subjects'))
            .withColumn("dict", F.create_map('Group', "Subjects")))
Result:
+-----+------------------------+-------------------------------+
|Group|Subjects |dict |
+-----+------------------------+-------------------------------+
|C |[Math, Latin, English] |[C -> [Math, Latin, English]] |
|B |[Science, English] |[B -> [Science, English]] |
|A |[Math, Physics, Science]|[A -> [Math, Physics, Science]]|
+-----+------------------------+-------------------------------+
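If you still want a single Python dict on the driver from that `dict` column, you can collect the rows and merge the per-row maps. A sketch, with the `collect()` result simulated as plain dicts (in real PySpark a `MapType` value comes back as a Python dict inside each Row):

```python
# Simulated result of df_tst.select('dict').collect(): one single-entry
# map per group, matching the table above.
rows = [
    {'dict': {'C': ['Math', 'Latin', 'English']}},
    {'dict': {'B': ['Science', 'English']}},
    {'dict': {'A': ['Math', 'Physics', 'Science']}},
]

# Merge the one-entry maps into a single {group: [subjects]} dict
merged = {}
for row in rows:
    merged.update(row['dict'])

print(merged)
# {'C': ['Math', 'Latin', 'English'], 'B': ['Science', 'English'], 'A': ['Math', 'Physics', 'Science']}
```

Keep in mind that `collect_set` deduplicates and does not guarantee any ordering of the subjects within each list.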