
Scala: convert a String to type Column in Spark


Code:

I'm trying to use the function above to create a new, processed DF from the existing raw DF.

Code:

var processedDF = func(rawDF, "col1", "col2")

Error:

<console>:73: error: type mismatch;
found   : String("col1")
required: org.apache.spark.sql.Column
   var processedDF  = func(rawDF,"col1","col2")
                                 ^

Any suggestions on how to pass the function arguments as org.apache.spark.sql.Column instead of String?
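(For reference, the definition of func is not included above. A minimal sketch of a signature that would produce this error, with a hypothetical placeholder body, might look like this:)

import org.apache.spark.sql.{Column, DataFrame}

// Sketch only: the column parameters are typed as Column,
// which is why passing plain String literals fails to compile.
def func(df: DataFrame, c1: Column, c2: Column): DataFrame =
  df.withColumn("combined", c1 + c2) // hypothetical body, for illustration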

You can use the col function from org.apache.spark.sql.functions:

import org.apache.spark.sql.functions.col

func(rawDF, col("col1"), col("col2"))

or apply on the DataFrame itself:

func(rawDF, rawDF("col1"), rawDF("col2"))

or directly via the $ notation (where spark is the SparkSession object):

import spark.implicits.StringToColumn

func(rawDF, $"col1", $"col2")

or a Symbol:

import spark.implicits.symbolToColumn

func(rawDF, 'col1, 'col2)
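If you would rather keep calling the function with plain strings, another option (a sketch, assuming you can edit func's definition, whose body isn't shown here) is to have it accept String names and convert them to Columns internally with col:

import org.apache.spark.sql.DataFrame
import org.apache.spark.sql.functions.col

// Hypothetical variant: takes column names as Strings and converts
// them to Columns inside the function.
def func(df: DataFrame, c1Name: String, c2Name: String): DataFrame = {
  val c1 = col(c1Name)
  val c2 = col(c2Name)
  df.select(c1, c2) // placeholder body; the real logic would use c1 and c2
}

var processedDF = func(rawDF, "col1", "col2") // now compiles as written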