Apache Spark: "object ml is not a member of package org.apache.spark"


I am trying to take my first steps with MLlib. My library dependencies are:

libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.2.0"
libraryDependencies += "com.github.fommil.netlib" % "all" % "1.1.2"
But I still get:

Error:(4, 27) object ml is not a member of package org.apache.spark
  import org.apache.spark.ml.regression.GeneralizedLinearRegression

You have included the dependency for Spark Core; you should also add Spark MLlib:

libraryDependencies += "org.apache.spark" % "spark-mllib_2.11" % "2.2.0"
Or, more simply:

libraryDependencies += "org.apache.spark" %% "spark-mllib" % "2.2.0"

FYI: I removed the last sentence, since it was off-topic; better not to risk getting the question closed ;)

This gives me an error:
Error while importing SBT project:
...
[warn] https://repo1.maven.org/maven2/org/apache/spark/spark-mllib_2.12/2.2.0
[warn] :: org.apache.spark#spark-mllib_2.12;2.2.0: not found
@user1761806 Change your Scala version to 2.11; Spark does not support Scala 2.12 yet.

Thanks, I had to change it specifically to 2.11.7 to get it working, but that seems to have fixed the problem. This kind of information should be in the "Dependencies" section of ...
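
With spark-mllib on the classpath and the project pinned to Scala 2.11, the import from the question should compile. A minimal smoke test, as a sketch (the object name and parameter values here are illustrative, not from the question; spark-mllib pulls in spark-sql transitively, which provides SparkSession):

import org.apache.spark.sql.SparkSession
import org.apache.spark.ml.regression.GeneralizedLinearRegression

object MLlibSmokeTest {
  def main(args: Array[String]): Unit = {
    // Local session, just to confirm the classpath is wired up correctly
    val spark = SparkSession.builder()
      .appName("mllib-smoke-test")
      .master("local[*]")
      .getOrCreate()

    // Constructing the estimator proves org.apache.spark.ml resolves
    val glr = new GeneralizedLinearRegression()
      .setFamily("gaussian")
      .setLink("identity")
      .setMaxIter(10)

    println(glr.explainParams())
    spark.stop()
  }
}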