Scala: How do I define a DataFrame using case and match?


I want to create a Spark DataFrame df in one of two possible ways:

val dataSourceType = "option1"

dataSourceType.map{
    case "option1" => {
        val df = gu
          .retrieveFromElastic(spark, source_field)
    }
    case "option1" => {
        val df = gu
          .retrieveFromCSV(spark, source_field)
    }
}

// some operations on "df"

The problem is that df is not visible outside the case statements. What is the right way to handle this?

With just a few small syntax changes, you should be able to do it like this:

val df = dataSourceType match {
    case "option1" => gu.retrieveFromElastic(spark, source_field)
    case "option2" => gu.retrieveFromCSV(spark, source_field)
}
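
A small caveat: a match on a String is not exhaustive, so if dataSourceType ever holds anything other than the expected options you will get a scala.MatchError at runtime. Below is a minimal sketch of a more defensive version, assuming the same gu helpers, spark session, and source_field from the question:

import org.apache.spark.sql.DataFrame

// match is an expression, so its result can be bound directly to df;
// both branches return a DataFrame, so that is the inferred type
val df: DataFrame = dataSourceType match {
    case "option1" => gu.retrieveFromElastic(spark, source_field)
    case "option2" => gu.retrieveFromCSV(spark, source_field)
    case other     => throw new IllegalArgumentException(s"Unknown data source type: $other")
}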

Good luck :-)
