Scala – Running a Spark application in IntelliJ 14.1.3


I am trying to run a Spark application written in Scala from IntelliJ 14.1.3. The Scala SDK is scala-sdk-2.11.6. I get the following error when executing the code:

Exception in thread "main" java.lang.NoSuchMethodError: scala.collection.immutable.HashSet$.empty()Lscala/collection/immutable/HashSet;
at akka.actor.ActorCell$.<init>(ActorCell.scala:336)
at akka.actor.ActorCell$.<clinit>(ActorCell.scala)
at akka.actor.RootActorPath.$div(ActorPath.scala:159)
at akka.actor.LocalActorRefProvider.<init>(ActorRefProvider.scala:464)
at akka.remote.RemoteActorRefProvider.<init>(RemoteActorRefProvider.scala:124)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$2.apply(DynamicAccess.scala:78)
at scala.util.Try$.apply(Try.scala:191)
at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:73)
at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
at scala.util.Success.flatMap(Try.scala:230)
at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:84)
at akka.actor.ActorSystemImpl.liftedTree1$1(ActorSystem.scala:584)
at akka.actor.ActorSystemImpl.<init>(ActorSystem.scala:577)
at akka.actor.ActorSystem$.apply(ActorSystem.scala:141)
at akka.actor.ActorSystem$.apply(ActorSystem.scala:118)
at org.apache.spark.util.AkkaUtils$.org$apache$spark$util$AkkaUtils$$doCreateActorSystem(AkkaUtils.scala:122)
at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:55)
at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:54)
at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1837)
at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:166)
at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1828)
at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:57)
at org.apache.spark.SparkEnv$.create(SparkEnv.scala:223)
at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:163)
at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:269)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:272)
at LRParquetProcess$.main(LRParquetProcess.scala:9)
at LRParquetProcess.main(LRParquetProcess.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:140)
Process finished with exit code 1

My pom.xml is as follows:

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
     xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
     xsi:schemaLocation="http://maven.apache.org/POM/4.0.0   http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>

<groupId>ParquetGeneration</groupId>
<artifactId>ParquetGeneration</artifactId>
<version>1.0-SNAPSHOT</version>
<properties>
<hadoop.version>2.7.0</hadoop.version>
</properties>
<dependencies>
<dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.10</artifactId>
        <version>1.3.1</version>
        <exclusions>
            <exclusion>
                <groupId>org.apache.hadoop</groupId>
                <artifactId>hadoop-client</artifactId>
            </exclusion>
        </exclusions>
    </dependency>
    <dependency>
    <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-hdfs</artifactId>
        <version>${hadoop.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-common</artifactId>
        <version>${hadoop.version}</version>
        <exclusions>
            <exclusion>
                <groupId>org.eclipse.jetty</groupId>
                <artifactId>*</artifactId>
            </exclusion>
        </exclusions>
    </dependency>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-mapreduce-client-app</artifactId>
        <version>${hadoop.version}</version>
    </dependency>
    <dependency>
        <groupId>org.scala-lang</groupId>
        <artifactId>scala-library</artifactId>
        <version>2.10.5</version>
    </dependency>
    <dependency>
        <groupId>org.scala-lang</groupId>
        <artifactId>scala-compiler</artifactId>
        <version>2.10.5</version>
    </dependency>


    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.10</artifactId>
        <version>1.2.1</version>
    </dependency>
    <dependency>
        <groupId>com.typesafe.akka</groupId>
        <artifactId>akka-actor_2.10</artifactId>
        <version>2.3.11</version>
    </dependency>

    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-hive_2.10</artifactId>
        <version>1.3.1</version>
    </dependency>

</dependencies>
</project>
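The `NoSuchMethodError` above is the classic symptom of mixing Scala binary versions: the IntelliJ SDK is 2.11.6, while every artifact in the pom (`spark-core_2.10`, `akka-actor_2.10`, `scala-library` 2.10.5) is built for Scala 2.10. One way to keep the pom internally consistent is to factor the Scala version into properties and reference them everywhere. This is only a sketch; the `scala.version` and `scala.binary.version` property names are a common convention, not something from the original pom:

```xml
<properties>
    <hadoop.version>2.7.0</hadoop.version>
    <!-- Hypothetical properties: declare the Scala version once so the
         artifact suffixes and the scala-library version cannot drift apart. -->
    <scala.version>2.10.5</scala.version>
    <scala.binary.version>2.10</scala.binary.version>
</properties>

<dependencies>
    <dependency>
        <groupId>org.scala-lang</groupId>
        <artifactId>scala-library</artifactId>
        <version>${scala.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_${scala.binary.version}</artifactId>
        <version>1.3.1</version>
    </dependency>
</dependencies>
```

With this layout, upgrading or downgrading Scala is a one-line change, and a mismatch between the SDK and the `_2.10`/`_2.11` artifact suffix becomes much harder to introduce by accident.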


Switch to Scala 2.10; that will work better for now.

I suggest you try 2.10.x.

Install 2.10.x and set the relevant environment variables to use it. Since you already have a project, go to File -> Project Structure -> Global Libraries and remove 2.11.x. Then click "+" -> Scala SDK -> Browse to add 2.10.x, selecting the folder where you installed 2.10.x earlier.
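Before (or after) switching the SDK, it can help to confirm which Scala version Maven actually resolves on the classpath. Assuming Maven is installed, the dependency plugin can filter the dependency tree down to the Scala artifacts (a command sketch, not from the original post):

```
# Show every org.scala-lang artifact this project pulls in;
# all of them should report the same 2.10.x version,
# matching the _2.10 suffix on the Spark and Akka artifacts.
mvn dependency:tree -Dincludes=org.scala-lang
```

If the tree shows a 2.11.x `scala-library` alongside `_2.10` artifacts, that mismatch is exactly what produces the `NoSuchMethodError` at runtime.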


The required Scala version is specified in .

Spark is not compatible with Scala 2.11. Thanks, Thomas. Could you tell me how to move to 2.10? That would be very helpful. In the meantime I got a different error, but that one is workable. I knew my application worked on Scala 2.11, but my environment was 2.13. Thanks, that was a pain in the neck. The error was scala.collection.mutable.Set$.apply.bla