Spark Scala: writing output to a text file
I am running a word-count program in Spark and trying to store the result in a text file. I have a Scala script, SparkWordCount.scala, that counts the words. I am trying to execute the script from the Spark console as follows:
scala> :load /opt/spark-2.0.2-bin-hadoop2.7/bin/SparkWordCount.scala
Loading /opt/spark-2.0.2-bin-hadoop2.7/bin/SparkWordCount.scala...
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark._
defined object SparkWordCount
scala>
After the script loads, I get the message "defined object SparkWordCount", but I cannot see any output result in a text file.
Below is my word-count program:
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark._
object SparkWordCount {
def main(args: Array[String]) {
val sc = new SparkContext("local", "Word Count", "/opt/spark-2.0.2-bin-hadoop2.7", Seq("/opt/spark-2.0.2-bin-hadoop2.7/jars"), Map())
val input = sc.textFile("demo.txt")
val count = input.flatMap(line ⇒ line.split(" ")).map(word ⇒ (word, 1)).reduceByKey(_ + _)
count.saveAsTextFile("outfile")
}
}
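The flatMap / map / reduceByKey chain in the script has the same shape as ordinary Scala collection operations, so the counting logic can be sanity-checked on a plain List without any Spark cluster. A standalone sketch (the `LocalWordCount` object is illustrative, not part of the original script):

```scala
object LocalWordCount {
  // Count words in a sequence of lines using plain collections,
  // mirroring the flatMap -> map -> reduceByKey steps of the Spark version.
  def count(lines: Seq[String]): Map[String, Int] =
    lines
      .flatMap(_.split(" "))
      .map(word => (word, 1))
      .groupBy(_._1)
      .map { case (word, pairs) => (word, pairs.map(_._2).sum) }

  def main(args: Array[String]): Unit = {
    // Print the counts for a small sample line.
    println(count(Seq("to be or not to be")))
  }
}
```

If the logic checks out here, any remaining problem lies in how the script is executed in the shell, not in the transformation itself.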
Can anyone suggest what is wrong? Thanks.

Once the object is defined, you can call its method to execute the code; the Spark shell does not run the main method automatically when you :load a script. In your case, use
SparkWordCount.main(Array())
to execute the word-count program.
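Two further pitfalls are worth noting, based on how spark-shell and saveAsTextFile behave (not stated in the original question): the shell already provides a SparkContext bound to sc, and constructing a second one inside main can fail, since by default only one SparkContext may be active per JVM; and "outfile" is created as a directory of part files, not a single text file. A sketch of the program adapted to reuse the shell's context:

```scala
// Inside spark-shell, reuse the provided `sc` instead of building a new context.
object SparkWordCount {
  def run(sc: org.apache.spark.SparkContext): Unit = {
    val input = sc.textFile("demo.txt")
    val count = input
      .flatMap(line => line.split(" "))
      .map(word => (word, 1))
      .reduceByKey(_ + _)
    // "outfile" is created as a DIRECTORY containing part-00000, part-00001, ...
    // plus a _SUCCESS marker, not as a single text file named "outfile".
    count.saveAsTextFile("outfile")
  }
}
// Then, in the shell: SparkWordCount.run(sc)
```

To see the result, look inside the outfile directory for the part-* files rather than expecting a file called outfile.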