Python PySpark - How to insert records into DataFrame1 based on column values in DataFrame2


I need to insert records into table1 from another table (say table2) using PySpark's spark.sql(). Currently I can get one record via a join, but I need to insert as many records into table1 as there are matching records in the second table.

Here are sample dataframes:

df1 = sqlContext.createDataFrame([("xxx1", "81A01", "TERR NAME 01"), ("xxx1", "81A01", "TERR NAME 02"), ("xxx1", "81A01", "TERR NAME 03")], ["zip_code", "zone_code", "territory_name"])
df2 = sqlContext.createDataFrame([("xxx1", "81A01", "", "NY")], ["zip_code", "zone_code", "territory_name", "state"])

df1.show()
+--------+---------+--------------+
|zip_code|zone_code|territory_name|
+--------+---------+--------------+
|    xxx1|    81A01|  TERR NAME 01|
|    xxx1|    81A01|  TERR NAME 02|
|    xxx1|    81A01|  TERR NAME 03|
+--------+---------+--------------+

# Print out information about this data
df2.show()
+--------+---------+--------------+-----+
|zip_code|zone_code|territory_name|state|
+--------+---------+--------------+-----+
|    xxx1|    81A01|          null|   NY|
+--------+---------+--------------+-----+
In the example above, I need to join df2 with df1 on zip_code and get as many records as there are territory_name values in df1.

The expected result for df2 is:

+--------+---------+--------------+-----+
|zip_code|zone_code|territory_name|state|
+--------+---------+--------------+-----+
|    xxx1|    81A01|  TERR NAME 01|   NY|
|    xxx1|    81A01|  TERR NAME 02|   NY|
|    xxx1|    81A01|  TERR NAME 03|   NY|
+--------+---------+--------------+-----+
Help needed please; currently I can only get one record via the join.

Spark.sql query sample for getting one record:
    df1.createOrReplaceTempView('df1')
    df2.createOrReplaceTempView('df2')
    spark.sql("""select a.zip_code, a.zone_code, b.territory_name, a.state
                 from df2 a left join df1 b on a.zip_code = b.zip_code
                 where a.territory_name is null""").createOrReplaceTempView('df2')
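As an aside, the fan-out the question asks for is ordinary left-join semantics: every df2 row matches every df1 row with the same zip_code. A pure-Python sketch of that behavior (no Spark required; the row values mirror the sample dataframes above):

```python
# Plain-Python illustration of the join the question needs:
# each df2 row is matched against every df1 row with the same zip_code,
# taking territory_name from df1 and state from df2.

df1_rows = [
    {"zip_code": "xxx1", "zone_code": "81A01", "territory_name": "TERR NAME 01"},
    {"zip_code": "xxx1", "zone_code": "81A01", "territory_name": "TERR NAME 02"},
    {"zip_code": "xxx1", "zone_code": "81A01", "territory_name": "TERR NAME 03"},
]
df2_rows = [
    {"zip_code": "xxx1", "zone_code": "81A01", "territory_name": None, "state": "NY"},
]

def expand(df2_rows, df1_rows):
    """One output row per matching df1 row (the join 'fans out')."""
    out = []
    for a in df2_rows:
        for b in df1_rows:
            if a["zip_code"] == b["zip_code"]:
                out.append({**b, "state": a["state"]})
    return out

result = expand(df2_rows, df1_rows)
# three rows, one per territory_name, each carrying state 'NY'
```

The spark.sql query only needs the same shape: select df1's territory_name and df2's state after joining on zip_code.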

Thank you.

Sharing the code snippet that worked, in case it is useful to someone:

df1 = sqlContext.createDataFrame([("xxx1", "81A01", "TERR NAME 01"), ("xxx1", "81A01", "TERR NAME 02"), ("xxx1", "81A01", "TERR NAME 03")], ["zip_code", "zone_code", "territory_name"])
df2 = sqlContext.createDataFrame([("xxx1", "", "", "NY"), ("xxx1", "", "TERR NAME 99", "NY")], ["zip_code", "zone_code", "territory_name", "state"])

df1.createOrReplaceTempView('df1')
df2.createOrReplaceTempView('df2')

spark.sql("select * from df1").show()
+--------+---------+--------------+ 
|zip_code|zone_code|territory_name| 
+--------+---------+--------------+ 
| xxx1   | 81A01   | TERR NAME 01 | 
| xxx1   | 81A01   | TERR NAME 02 | 
| xxx1   | 81A01   | TERR NAME 03 | 
+--------+---------+--------------+ 

spark.sql("select * from df2").show()
+--------+---------+--------------+-----+ 
|zip_code|zone_code|territory_name|state| 
+--------+---------+--------------+-----+ 
| xxx1   |         |              | NY  | 
| xxx1   |         | TERR NAME 99 | NY  | 
+--------+---------+--------------+-----+

spark.sql("""select a.zip_code, b.zone_code, b.territory_name, a.state from df2 a 
            left join df1 b 
            on a.zip_code = b.zip_code 
            where a.territory_name = ''
            UNION
            select a.zip_code, b.zone_code, a.territory_name, a.state from df2 a 
            left join df1 b 
            on a.zip_code = b.zip_code 
            where a.territory_name != ''
            """).createOrReplaceTempView('df3')


spark.sql("select * from df3").show()
+--------+---------+--------------+-----+ 
|zip_code|zone_code|territory_name|state| 
+--------+---------+--------------+-----+ 
| xxx1   | 81A01   | TERR NAME 03 | NY  | 
| xxx1   | 81A01   | TERR NAME 99 | NY  |  
| xxx1   | 81A01   | TERR NAME 01 | NY  | 
| xxx1   | 81A01   | TERR NAME 02 | NY  | 
+--------+---------+--------------+-----+
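The UNION query above has two branches: df2 rows with an empty territory_name are fanned out from df1, while rows that already carry a territory_name keep it and only pick up zone_code from df1; UNION (as opposed to UNION ALL) then drops duplicate rows. A pure-Python sketch of that logic, using the same sample rows (no Spark needed):

```python
# Pure-Python sketch of the two-branch UNION logic above.

df1_rows = [
    {"zip_code": "xxx1", "zone_code": "81A01", "territory_name": "TERR NAME 01"},
    {"zip_code": "xxx1", "zone_code": "81A01", "territory_name": "TERR NAME 02"},
    {"zip_code": "xxx1", "zone_code": "81A01", "territory_name": "TERR NAME 03"},
]
df2_rows = [
    {"zip_code": "xxx1", "zone_code": "", "territory_name": "", "state": "NY"},
    {"zip_code": "xxx1", "zone_code": "", "territory_name": "TERR NAME 99", "state": "NY"},
]

def merge(df2_rows, df1_rows):
    out = []
    for a in df2_rows:
        for b in df1_rows:
            if a["zip_code"] != b["zip_code"]:
                continue
            # branch 1: empty territory_name -> take it from df1
            # branch 2: non-empty -> keep df2's own territory_name
            name = b["territory_name"] if a["territory_name"] == "" else a["territory_name"]
            out.append({"zip_code": a["zip_code"], "zone_code": b["zone_code"],
                        "territory_name": name, "state": a["state"]})
    # UNION (not UNION ALL) removes duplicate rows
    seen, dedup = set(), []
    for row in out:
        key = tuple(row.items())
        if key not in seen:
            seen.add(key)
            dedup.append(row)
    return dedup

merged = merge(df2_rows, df1_rows)
# four distinct rows: TERR NAME 01/02/03 plus TERR NAME 99
```

The "TERR NAME 99" row joins to all three df1 rows but produces three identical output rows, which the de-duplication step collapses to one, matching the four-row df3 output shown above.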

Thanks to those who helped.

You could try a full outer join, for example:
df2.join(df1, ['zone_code'], how='full').show()
Thanks; due to some design decisions on this project I need to stick with spark.sql - any help? Could you share the code you are currently using for the join? Hi, I added a sample spark.sql for the dataframes above, since I can't post the original snippet here. Thanks. Thanks for the code snippet. Does this get you what you want:
spark.sql("select a.zip_code, a.zone_code, a.territory_name, b.state from df1 a left join df2 b on a.zip_code = b.zip_code where b.territory_name = ''").createOrReplaceTempView('df3')
then
spark.table("df3").show()