Pyspark: converting a SQL CASE statement to Spark
How can I convert this SQL CASE statement to Spark SQL?
replace_old_engagements_sql = """ UPDATE """ + my_table_name + """
SET Engagement = CASE Engagement
WHEN '800000026680' THEN '800000032764'
WHEN '807000000041' THEN '808000000000'
WHEN '870000012569' THEN '807000000412'
WHEN '807000000279' THEN '808000000223'
WHEN '807000000282' THEN '808000000223'
WHEN '870000000403' THEN '808000000223'
END
WHERE LinkedAccountId in ('123456789101','109876543212') AND Engagement IN ('800000026680', '807000000041', '870000012569', '807000000279', '807000000282', '870000000403'); """
I think your Spark SQL should look something like this:
spark.sql("""
INSERT OVERWRITE TABLE db.my_table_name
SELECT
CASE
WHEN LinkedAccountId in ('123456789101','109876543212') THEN
CASE
WHEN Engagement = '800000026680' THEN '800000032764'
WHEN Engagement = '807000000041' THEN '808000000000'
WHEN Engagement = '870000012569' THEN '807000000412'
WHEN Engagement = '807000000279' THEN '808000000223'
WHEN Engagement = '807000000282' THEN '808000000223'
WHEN Engagement = '870000000403' THEN '808000000223'
ELSE Engagement
END
ELSE Engagement
END as Engagement
from db.my_table_name
""")
Thank you. What does the `val` at the beginning of the command mean? Is that Scala?
Yes, you're right, it's Scala. `val` defines an immutable value.