Scala SparkSession does not exist
I am trying to define a UDF in Spark 2.3.0 using Scala 2.11.12. Based on what I have read, it seems I need a SparkSession, but I cannot import the object:
import org.apache.spark.sql.SparkSession
This results in:
Error:(2, 8) object SparkSession is not a member of package org.apache.spark.sql
import org.apache.spark.sql.SparkSession
Here is my build.sbt:
name := "webtrends-processing-scala"
version := "0.1"
scalaVersion := "2.11.12"
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.3.3"
libraryDependencies += "io.lemonlabs" %% "scala-uri" % "1.4.3"
You have to include the spark-sql dependency:
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.3.0",
  "org.apache.spark" %% "spark-sql" % "2.3.0"
)