Scala: HashingTF not found when trying to compile the example TF-IDF code for Apache Spark

Tags: scala, apache-spark, apache-spark-mllib

When I try to compile the following code snippet, I get the errors shown further down.

import org.apache.spark.rdd.RDD
import org.apache.spark.SparkContext
import org.apache.spark.mllib.feature.HashingTF
import org.apache.spark.mllib.linalg.Vector

val sc: SparkContext = ...

// Load documents (one per line).
val documents: RDD[Seq[String]] = sc.textFile("...").map(_.split(" ").toSeq)

val hashingTF = new HashingTF()
val tf: RDD[Vector] = hashingTF.transform(documents)
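
For reference, the TF-IDF example this snippet comes from continues with an IDF step, roughly as sketched below (based on the MLlib feature-extraction guide; IDF sits in the same org.apache.spark.mllib.feature package, so it runs into the same error):

import org.apache.spark.mllib.feature.IDF

// Cache the term-frequency vectors, fit an IDF model on them,
// and rescale them into TF-IDF vectors.
tf.cache()
val idf = new IDF().fit(tf)
val tfidf: RDD[Vector] = idf.transform(tf)
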
I have added the Spark dependencies to my build.sbt file (the libraryDependencies block is shown at the end of this post), but the build fails with:

[error] /siva/test/src/main/scala/com/chimpler/sparknaivebayesreuters/Tokenizer.scala:10: object feature is not a member of package org.apache.spark.mllib
[error] import org.apache.spark.mllib.feature.HashingTF
[error]                               ^
[error] /siva/test/src/main/scala/com/chimpler/sparknaivebayesreuters/Tokenizer.scala:36: not found: type HashingTF
[error] val hashingTF = new HashingTF()
[error]                     ^
[error] /siva/test/src/main/scala/com/chimpler/sparknaivebayesreuters/Tokenizer.scala:37: not found: value hasingTF
[error] val tf: RDD[Vector] = hasingTF.transform(documents)
[error]                       ^
[error] three errors found
[error] (compile:compile) Compilation failed
[error] Total time: 14 s, completed 3 Nov, 2014 1:57:31 PM

Any pointers?

I was using the wrong version of MLlib: HashingTF lives in the org.apache.spark.mllib.feature package, which the 1.0.2 artifacts I was compiling against do not contain. Changing the libraryDependencies to spark-mllib 1.1.0 fixed it. These were the original entries in my build.sbt:

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"              % "1.0.2" % "provided",
  "org.apache.spark" %% "spark-mllib"             % "1.0.2" % "provided")
//  "org.apache.spark" %% "spark-streaming"         % "1.0.0" % "provided")