Scala: What's wrong with using the when clause here?

Tags: scala, apache-spark, apache-spark-sql

I have a scenario like the following:

    import org.apache.spark.sql.functions._
    import org.apache.spark.sql.types.StringType

    val df = ??? // dataset with columns like "col", "col_a", "col_b"
    val `type` = "" // dynamically passed by the user ("type" is a Scala keyword, hence the backticks)
    // Based on this, I need to add one more column, "value", derived from the "col" value.

    val valueDs = df
      .withColumn("type", lit(`type`).cast(StringType))
      .withColumn("value",
        when(col("cal").equalTo(lit("A_B")), concat_ws("_", col("col_a"), col("col_b")))
          .when(col("cal").equalTo(lit("A")), concat(col("col_a"))))
I need to select other columns based on the type and populate the value column appropriately.

But when I run it, it fails because the column col_b is not available.

So what is going wrong here? Why is it looking for col_b when it isn't there, and how do I fix the when clause?

You can try the approach below:

    val valueDs = df
      .withColumn("type", lit(`type`).cast(StringType))
      .withColumn("value",
        when(col("cal") === "A_B", concat_ws("_", col("col_a"), col("col_b")))
          .when(col("cal") === "A", concat(col("col_a")))
          .otherwise("null"))
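One editorial caveat about the snippet above, not part of the original answer: `.otherwise("null")` fills the literal four-character string "null", not a SQL NULL. If an actual NULL is wanted when no branch matches, a typed null literal can be used instead; a minimal sketch:

```scala
import org.apache.spark.sql.functions._
import org.apache.spark.sql.types.StringType

// Assumes df and the column names from the question.
// lit(null) alone is untyped; casting keeps "value" a StringType column.
val valueDs = df
  .withColumn("value",
    when(col("cal") === "A_B", concat_ws("_", col("col_a"), col("col_b")))
      .when(col("cal") === "A", concat(col("col_a")))
      .otherwise(lit(null).cast(StringType)))
```

Note also that leaving off `.otherwise(...)` entirely already yields NULL for unmatched rows; the explicit form just makes the intent visible.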

OK, it's not that Spark goes "looking" for col_b at runtime. A query is first parsed, then analyzed, and only then executed. During the analysis phase, the analyzer checks every column the query references against the DataFrame's schema; your query references a column the DataFrame does not have, so it throws an exception right there: org.apache.spark.sql.AnalysisException: cannot resolve 'col_b' given input columns: [...]. Your when clauses are rewritten to: CASE WHEN cal = 'A_B' THEN concat_ws('_', col_a, col_b) WHEN cal = 'A' THEN concat(col_a) END, and every column appearing anywhere in that CASE expression must resolve, regardless of which branch would actually execute.
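Since every column referenced anywhere in the CASE expression must resolve during analysis, a practical workaround (a sketch under the question's column names, not part of the original answer) is to consult df.columns and only build the branches whose columns actually exist, so the analyzer never sees an unresolvable reference:

```scala
import org.apache.spark.sql.DataFrame
import org.apache.spark.sql.functions._

// Build the "value" expression only from columns present in the schema.
def withValue(df: DataFrame): DataFrame = {
  val cols = df.columns.toSet
  val valueExpr =
    if (cols.contains("col_a") && cols.contains("col_b"))
      when(col("col") === "A_B", concat_ws("_", col("col_a"), col("col_b")))
        .when(col("col") === "A", col("col_a"))
    else if (cols.contains("col_a"))
      when(col("col") === "A", col("col_a"))
    else
      lit(null).cast("string")
  df.withColumn("value", valueExpr)
}
```

Because the branch referencing col_b is never constructed when that column is absent, the plan passes analysis whether or not the optional columns exist.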