Apache Spark: Spark SQL throws AssertionError: assertion failed: Found duplicate rewrite attributes (Spark 3.0.2)


Executing the following query in Spark 3.0.2 produces: Exception in thread "main" java.lang.AssertionError: assertion failed: Found duplicate rewrite attributes. The same query works in Spark 2.4.3.

SELECT 
    COALESCE(view_1_alias.name, view_2.name) AS name, 
    COALESCE(view_1_alias.id, view_2.id) AS id, 
    COALESCE(view_1_alias.second_id, view_2.second_id) AS second_id, 
    COALESCE(view_1_alias.local_timestamp, view_2.local_timestamp) AS local_timestamp, 
    COALESCE(view_1_alias.utc_timestamp, view_2.utc_timestamp) AS utc_timestamp, 
    view_1_alias.alias_1_column_1, 
    view_1_alias.alias_1_column_2, 
    view_1_alias.alias_1_column_3, 
    view_1_alias.alias_1_column_4, 
    view_1_alias.alias_1_column_5, 
    view_1_alias.alias_1_column_6, 
    view_1_alias.alias_1_column_7, 
    view_2.alias_2_coumn_1 
FROM 
    view_1 view_1_alias FULL OUTER JOIN view_2 
ON 
    view_1_alias.name = view_2.name AND 
    view_1_alias.id = view_2.id AND 
    view_1_alias.second_id = view_2.second_id AND
    view_1_alias.local_timestamp = view_2.local_timestamp;
Header of view_1

| name | id | second_id | local_timestamp | utc_timestamp | alias_1_column_1 | alias_1_column_2 | alias_1_column_3 | alias_1_column_4 | alias_1_column_5 | alias_1_column_6 | alias_1_column_7 |

Header of view_2

| name | id | second_id | local_timestamp | utc_timestamp | alias_2_coumn_1 |
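
For anyone trying to reproduce this, here is a minimal sketch of how the two views can be stubbed out with literal rows. The column types are assumed, and the real definitions of view_1 and view_2 (not shown) are probably what triggers the attribute rewrite, so this stub alone may not hit the error:

-- Stand-in for view_1 with the columns listed above (types are a guess)
CREATE OR REPLACE TEMPORARY VIEW view_1 AS
SELECT * FROM VALUES
    ('a', 1, 1, TIMESTAMP '2021-01-01 00:00:00', TIMESTAMP '2021-01-01 00:00:00',
     'c1', 'c2', 'c3', 'c4', 'c5', 'c6', 'c7')
AS t(name, id, second_id, local_timestamp, utc_timestamp,
     alias_1_column_1, alias_1_column_2, alias_1_column_3, alias_1_column_4,
     alias_1_column_5, alias_1_column_6, alias_1_column_7);

-- Stand-in for view_2 with the columns listed above (types are a guess)
CREATE OR REPLACE TEMPORARY VIEW view_2 AS
SELECT * FROM VALUES
    ('a', 1, 1, TIMESTAMP '2021-01-01 00:00:00', TIMESTAMP '2021-01-01 00:00:00', 'c1')
AS t(name, id, second_id, local_timestamp, utc_timestamp, alias_2_coumn_1);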

Interestingly, we have several queries like this, but only this particular one triggers the problem. I came across this issue, but I'm not sure how to translate it into Spark SQL.
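
One shape such a workaround sometimes takes in plain Spark SQL (an assumption on my part, not a verified fix) is to materialize one side of the join so the two inputs stop sharing attribute lineage:

-- Hypothetical mitigation sketch; view_1_materialized is a made-up name.
-- Persisting one input as a table means its columns are read with fresh
-- attributes instead of being rewritten from the shared view plan.
CREATE TABLE view_1_materialized AS SELECT * FROM view_1;

-- Then re-run the original FULL OUTER JOIN with view_1_materialized
-- (aliased as view_1_alias) in place of view_1.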