
Scala: Running Spark interactively inside IntelliJ: `akka.version` not found


I'm trying to run Spark from a Scala worksheet in IntelliJ, but I hit an error saying that no configuration setting was found for the key 'akka.version'.

Worksheet contents:

import org.apache.spark.SparkContext
val sc1 = new SparkContext("local[8]", "sc1")

Full stack trace:

import org.apache.spark.SparkContext
15/01/06 16:30:32 INFO spark.SecurityManager: Changing view acls to: tobber
15/01/06 16:30:32 INFO spark.SecurityManager: Changing modify acls to: tobber
15/01/06 16:30:32 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(tobber); users with modify permissions: Set(tobber)
com.typesafe.config.ConfigException$Missing: No configuration setting found for key 'akka.version'
    at com.typesafe.config.impl.SimpleConfig.findKey(spark.sc0.tmp:111)
    at com.typesafe.config.impl.SimpleConfig.find(spark.sc0.tmp:132)
    at com.typesafe.config.impl.SimpleConfig.find(spark.sc0.tmp:138)
    at com.typesafe.config.impl.SimpleConfig.find(spark.sc0.tmp:146)
    at com.typesafe.config.impl.SimpleConfig.find(spark.sc0.tmp:151)
    at com.typesafe.config.impl.SimpleConfig.getString(spark.sc0.tmp:193)
    at akka.actor.ActorSystem$Settings.<init>(spark.sc0.tmp:132)
    at akka.actor.ActorSystemImpl.<init>(spark.sc0.tmp:466)
    at akka.actor.ActorSystem$.apply(spark.sc0.tmp:107)
    at akka.actor.ActorSystem$.apply(spark.sc0.tmp:100)
    at org.apache.spark.util.AkkaUtils$.org$apache$spark$util$AkkaUtils$$doCreateActorSystem(spark.sc0.tmp:117)
    at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(spark.sc0.tmp:50)
    at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(spark.sc0.tmp:49)
    at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(spark.sc0.tmp:1500)
    at scala.collection.immutable.Range.foreach$mVc$sp(spark.sc0.tmp:137)
    at org.apache.spark.util.Utils$.startServiceOnPort(spark.sc0.tmp:1491)
    at org.apache.spark.util.AkkaUtils$.createActorSystem(spark.sc0.tmp:52)
    at org.apache.spark.SparkEnv$.create(spark.sc0.tmp:149)
    at org.apache.spark.SparkContext.<init>(spark.sc0.tmp:200)
    at org.apache.spark.SparkContext.<init>(spark.sc0.tmp:115)
    at apps.A$A1$A$A1.sc$lzycompute(spark.sc0.tmp:2)
    at apps.A$A1$A$A1.sc(spark.sc0.tmp:2)
    at apps.A$A1$A$A1.get$$instance$$sc(spark.sc0.tmp:2)
    at #worksheet#.#worksheet#(spark.sc0.tmp:9)

The workaround is to use:

In a Spark project, open a Scala file and press Ctrl+Shift+D (Cmd+Shift+D on macOS). Paste the code there and run it with Ctrl+Enter (Cmd+Enter).
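For reference, a minimal sketch of the kind of snippet one might paste into the Scala Console; the parallelize job and the stop() call are illustrative additions beyond the original post, and it assumes spark-core is already on the project classpath.

import org.apache.spark.SparkContext

// Create a local SparkContext, mirroring the worksheet above.
val sc1 = new SparkContext("local[8]", "sc1")

// Run a trivial job to confirm the context actually works (illustrative only).
val doubledSum = sc1.parallelize(1 to 100).map(_ * 2).sum()
println(s"Sum of doubled values: $doubledSum")

// Stop the context so a new one can be created on the next run.
sc1.stop()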