
Scala error: Could not find or load main class com.sundogsoftware.spark.ratingsconter


I don't know what is going wrong here. Whenever I run it in my Scala IDE, I keep getting "Error: Could not find or load main class com.sundogsoftware.spark.ratingsconter".

Here is my Scala code:

package com.sundogsoftware.spark

import org.apache.spark._
import org.apache.spark.SparkContext._
import org.apache.log4j._

/** Count up how many of each star rating exists in the MovieLens 100K data set. */
object RatingsCounter {

  /** Our main function where the action happens */
  def main(args: Array[String]) {

    // Set the log level to only print errors
    Logger.getLogger("org").setLevel(Level.ERROR)

    // Create a SparkContext using every core of the local machine, named RatingsCounter
    val sc = new SparkContext("local[*]", "RatingsCounter")

    // Load up each line of the ratings data into an RDD
    val lines = sc.textFile("../ml-100k/u.data")

    // Convert each line to a string, split it out by tabs, and extract the third field.
    // (The file format is userID, movieID, rating, timestamp)
    val ratings = lines.map(x => x.toString().split("\t")(2))

    // Count up how many times each value (rating) occurs
    val results = ratings.countByValue()

    // Sort the resulting map of (rating, count) tuples
    val sortedResults = results.toSeq.sortBy(_._1)

    // Print each result on its own line.
    sortedResults.foreach(println)
  }
}
This is my project structure:

This is my run configuration:

This is the Scala compiler option I selected:

I have been trying to debug this for a few hours now and nothing seems to work.

Any pointers would be helpful.

Check this out: I had to change the VM argument in my eclipse.ini file. For my JRE option, I had chosen "Use default JRE (currently 'Java SE 8 [1.8.0_172]')" when creating the Scala project. This fixed the error for me.

I am using OS X, so I had to add

-vm
/Library/Java/JavaVirtualMachines/jdk1.8.0_172.jdk/Contents/Home/bin/java

directly above the -vmargs line.
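
For context, a minimal sketch of how the relevant portion of eclipse.ini might look after this change, assuming the JDK path quoted above; whatever options were already listed under -vmargs stay where they are, and the -vm entry and its path must each be on their own line:

-vm
/Library/Java/JavaVirtualMachines/jdk1.8.0_172.jdk/Contents/Home/bin/java
-vmargs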

How do you build and execute your artifact? Is there a particular reason you are using Spark 1.6? — I am trying to learn Spark with Scala and signed up for a Udemy course. This is the first Scala program taught in the class, and I am trying to run it in the Scala IDE. I keep getting "Error: Could not find or load main class com.sundogsoftware.spark.ratingsconter". These are exactly the steps the instructor followed, and it works for him (he uses a Windows machine, I use a Mac).
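
As a hedged sketch of what "build and execute your artifact" could look like outside the IDE, assuming the project is built with sbt against Scala 2.11 and spark-submit is on the PATH (the jar name below is hypothetical and depends on the project's name and version in build.sbt):

sbt package
spark-submit --class com.sundogsoftware.spark.RatingsCounter target/scala-2.11/your-project_2.11-1.0.jar

If the class runs this way but not from the IDE, the problem is likely confined to the IDE's run configuration or JRE setup rather than the code itself.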