
An uncommon exception in Apache Spark/Scala


I am running into the following problem when trying to do a spark-submit:

Exception in thread "main" java.lang.NullPointerException
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:328)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Is this a known issue?

Thanks and regards.

The following program reproduces the error message above:

import org.apache.spark.SparkContext
import org.apache.spark.SparkConf

class anomaly_model(val inputfile: String, val clusterNum: Int, val maxIterations: Int, val epsilon: Double, val scenarioNum: Int, val outputfile: String) {
  val conf = new SparkConf().setAppName("Anomaly Model")
  val sc = new SparkContext(conf)
  val data = sc.textFile(inputfile)

  def main(args: Array[String]) {
    val inputfile = "sqlexpt.txt"
    val clusterNum = 5
    val maxIterations = 1000
    val epsilon = 0.001
    val scenarioNum = 10
    val outputfile = "output.csv"
    val am = new anomaly_model(inputfile, clusterNum, maxIterations, epsilon, scenarioNum, outputfile)
  }
}

As the message says, one of your variables holds a null value.

Defining the main method inside the class was probably my mistake... I should have defined it in a companion object (I come from a Java background!).

If you want to keep the main method in the model class, you probably need to change `class` to `object`. And yes, if you want to model it as a class, you probably need to put the main method in a separate object.
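A plausible way to see why this fails with the NullPointerException in the stack trace above (a sketch, assuming spark-submit invokes `main` reflectively, as the `Method.invoke` frame suggests; the `InClass`/`ReflectDemo` names are illustrative): `main` defined in a `class` compiles to an instance method, and `Method.invoke` throws NPE when an instance method is invoked with a null receiver.

```scala
// Sketch: reflectively invoking an instance `main` with a null receiver,
// mirroring the Method.invoke frame in the stack trace above.
class InClass {
  def main(args: Array[String]): Unit = println("never reached")
}

object ReflectDemo {
  def main(args: Array[String]): Unit = {
    val m = classOf[InClass].getMethod("main", classOf[Array[String]])
    try {
      // null receiver + instance method => NullPointerException
      m.invoke(null, Array.empty[String].asInstanceOf[Object])
    } catch {
      case _: NullPointerException =>
        println("NPE: `main` in a class is not static, so it needs a receiver")
    }
  }
}
```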

In the model class:

import org.apache.spark.SparkContext
import org.apache.spark.SparkConf

class anomaly_model(val inputfile: String, val clusterNum: Int, val maxIterations: Int, val epsilon: Double, val scenarioNum: Int, val outputfile: String){
    val conf = new SparkConf().setAppName("Anomaly Model")
    val sc = new SparkContext(conf)
    val data = sc.textFile(inputfile)
}
In Main.scala:

// anomaly_model lives in the same (default) package, so no import is needed

object Main {
    def main(args: Array[String]): Unit = {
        val inputfile = "sqlexpt.txt"
        val clusterNum = 5 
        val maxIterations = 1000 
        val epsilon = 0.001
        val scenarioNum = 10
        val outputfile = "output.csv"
        val am = new anomaly_model(inputfile, clusterNum, maxIterations, epsilon, scenarioNum, outputfile)
    }
}
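To make the rule concrete, here is a minimal Spark-free sketch of the same pattern (the `AnomalyModel`/`AnomalyMain` names are illustrative, not from the original post): the JVM looks up a static `main`, and only a Scala `object` gets a static forwarder for it.

```scala
// Minimal model class with no Spark dependency (names are illustrative).
class AnomalyModel(val inputfile: String, val clusterNum: Int) {
  def describe: String = s"model over $inputfile with $clusterNum clusters"
}

// Entry points belong in an object: the compiler emits a static `main`
// forwarder for objects, which is what the JVM (and spark-submit) look up.
object AnomalyMain {
  def main(args: Array[String]): Unit = {
    val am = new AnomalyModel("sqlexpt.txt", 5)
    println(am.describe) // prints: model over sqlexpt.txt with 5 clusters
  }
}
```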

I had the same problem and made the same mistake: `anomaly_model` should be an `object` instead of a `class`.

You need to provide more information, ideally a minimal example that demonstrates the problem. It is nearly impossible to guess the cause from the stack trace alone.