Apache Spark: submitting a Spark application in local mode fails with "Cannot load main class from JAR"

Tags: apache-spark, pyspark, apache-spark-sql, spark-streaming, apache-spark-mllib


I am trying to submit a Spark application on my local machine and I am getting the error below:

Exception in thread "main" org.apache.spark.SparkException: Cannot load main class from JAR file:
        at org.apache.spark.deploy.SparkSubmitArguments.error(SparkSubmitArguments.scala:657)
        at org.apache.spark.deploy.SparkSubmitArguments.loadEnvironmentArguments(SparkSubmitArguments.scala:221)
        at org.apache.spark.deploy.SparkSubmitArguments.<init>(SparkSubmitArguments.scala:116)
        at org.apache.spark.deploy.SparkSubmit$$anon$2$$anon$1.<init>(SparkSubmit.scala:907)
        at org.apache.spark.deploy.SparkSubmit$$anon$2.parseArguments(SparkSubmit.scala:907)
        at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:81)
        at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:920)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:929)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

Comment: Have you tried passing the master to the command?
spark-submit --master local word-count.py

Comment: Yes @undefined_variable, I tried spark-submit --master local word-count.py and still got the same error.

Comment: You need to use the --py-files command-line argument to pass .py files. I have never used Spark with Python, though. On another note, did you build a JAR file containing the Python files?

Comment: I tried the --py-files argument and still face the same error, and I did not build a JAR for the .py file.
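spark-submit raises "Cannot load main class from JAR file:" when it cannot identify the application file from its arguments. One common trigger (an assumption here; the thread does not confirm the root cause) is an unquoted script path containing a space, such as "word count.py", which makes spark-submit parse the first token as a JAR path. A minimal sketch of a correctly quoted invocation, assuming the script is named word-count.py and Spark's bin directory is on PATH:

```
# Quote the script path (or rename it without spaces); otherwise
# spark-submit treats "word" as a JAR and fails to find a main class.
spark-submit --master "local[*]" word-count.py
```

If the error persists with a correctly quoted path, verify that the path actually exists relative to the current working directory.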
import sys

from pyspark.sql import SparkSession
from pyspark.sql.functions import explode, split

def main():
    sparkSession = SparkSession.builder.appName("Word Count").getOrCreate()
    sparkSession.sparkContext.setLogLevel("ERROR")

    # 'path' was undefined in the original code; it must point to a
    # directory of text files to monitor. Here it is taken from argv.
    path = sys.argv[1]
    readStream = sparkSession.readStream.format('text').load(path)

    print("-------------------------------------------------")
    print("Streaming source ready: ", readStream.isStreaming)
    readStream.printSchema()

    # Split on a single space (the original split on two spaces, which
    # leaves most lines as one unsplit token).
    words = readStream.select(explode(split(readStream.value, ' ')).alias('word'))
    wordCounts = words.groupBy('word').count().orderBy('count')

    # start() returns a StreamingQuery; awaitTermination() returns None,
    # so keep the query handle before blocking.
    query = wordCounts.writeStream.outputMode('complete').format('console').start()
    query.awaitTermination()

if __name__ == '__main__':
    main()
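The transformation chain in the job above (split each line on spaces, explode into one row per word, then groupBy('word').count()) can be mimicked in plain Python, which is handy for checking the expected console output without a Spark installation. A small sketch using only the standard library; the sample lines are made up:

```python
from collections import Counter

def word_count(lines):
    """Mimic split + explode + groupBy('word').count() from the Spark job."""
    # split(' ') followed by explode: one entry per non-empty token
    words = [w for line in lines for w in line.split(' ') if w]
    # Counter plays the role of groupBy('word').count()
    return Counter(words)

lines = ["to be or not", "to be"]
counts = word_count(lines)
print(sorted(counts.items()))  # [('be', 2), ('not', 1), ('or', 1), ('to', 2)]
```

This also makes the two-space bug in the original split visible: with '  ' as the separator, "to be or not" stays a single token and every distinct line gets a count of 1.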