IntelliJ setup with Spark, Scala and Sbt

I am setting up a Spark + Scala + SBT project in IntelliJ.

    Scala Version: 2.12.8
    SBT Version:   1.4.2
    Java Version:  1.8
build.sbt file:

  name := "Spark_Scala_Sbt"
  version := "0.1"
   scalaVersion := "2.12.8"
    libraryDependencies ++= Seq(
   "org.apache.spark" %% "spark-core" % "2.3.3",
  "org.apache.spark" %% "spark-sql" % "2.3.3"
)
Scala file:

import org.apache.spark.sql.SparkSession

object FirstSparkApplication extends App {
  val spark = SparkSession.builder
    .master("local[*]")
    .appName("Sample App")
    .getOrCreate()
  val data = spark.sparkContext.parallelize(
    Seq("I like Spark", "Spark is awesome", "My first Spark job is working now and is counting down these words")
  )
  val filtered = data.filter(line => line.contains("awesome"))
  filtered.collect().foreach(print)
}
But it shows the following error messages:

1. Cannot resolve symbol apache
2. Cannot resolve symbol SparkSession
3. Cannot resolve symbol sparkContext
4. Cannot resolve symbol filter
5. Cannot resolve symbol collect
6. Cannot resolve symbol contains

What should I change here?

Did you rebuild the project before running it? Are the Spark libraries loaded under the External Libraries drop-down in the project explorer on the left?
Are you using the sbt plugin? It makes running Spark programs easy…
@Coursal Yes, it throws the same errors when I rebuild the module.
@gianluca No, this is my first time using IntelliJ and Sbt, so I'm following some documentation and trying to set up the environment.
Something may have gone wrong when the Sbt file structure was created; could you edit your question and add a screenshot of the project explorer on the left with the External Libraries tab expanded?
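One likely cause, judging only from the versions in the question: Spark 2.3.3 artifacts were published for Scala 2.11 only, so with scalaVersion 2.12.8 the `%%` operator asks sbt for `spark-core_2.12:2.3.3`, which does not exist. The dependency never resolves, IntelliJ has nothing to index, and every Spark symbol in the editor shows as unresolved. Below is a minimal sketch of a build.sbt whose versions are published together, assuming the goal is to stay on Scala 2.12 (the alternative is to drop scalaVersion to 2.11.12 and keep Spark 2.3.3); the choice of 2.4.8 is just an example 2.4.x release, not the only valid one.

  // Sketch only: Spark 2.4.x is the first release line that ships Scala 2.12 artifacts.
  name := "Spark_Scala_Sbt"
  version := "0.1"
  scalaVersion := "2.12.8"
  libraryDependencies ++= Seq(
    // %% appends the Scala binary version, resolving to spark-core_2.12 / spark-sql_2.12
    "org.apache.spark" %% "spark-core" % "2.4.8",
    "org.apache.spark" %% "spark-sql"  % "2.4.8"
  )

After editing build.sbt, reloading the sbt project in IntelliJ (or running `sbt update` from a terminal) should make the jars appear under External Libraries; the "Cannot resolve symbol" markers normally disappear once the dependencies are actually on the classpath.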