Amazon Web Services: Unable to run Java Spark on EMR

I ran a simple Java Spark program locally, but it fails on Amazon EMR. I tried AMI 3.2.1 and AMI 3.3.1, and both give the same error. The failure occurs at the JavaSparkContext construction below:

public static void main(String[] args) throws Exception {
    if (args.length …

Tags: amazon-web-services, apache-spark, emr

The error I get is:
Exception in thread "main" java.lang.NoSuchMethodError: scala.collection.immutable.HashSet$.empty()Lscala/collection/immutable/HashSet;
at akka.actor.ActorCell$.<init>(ActorCell.scala:305)
at akka.actor.ActorCell$.<clinit>(ActorCell.scala)
at akka.actor.RootActorPath.$div(ActorPath.scala:152)
at akka.actor.LocalActorRefProvider.<init>(ActorRefProvider.scala:465)
at akka.remote.RemoteActorRefProvider.<init>(RemoteActorRefProvider.scala:124)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$2.apply(DynamicAccess.scala:78)
at scala.util.Try$.apply(Try.scala:191)
at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:73)
at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
at scala.util.Success.flatMap(Try.scala:230)
at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:84)
at akka.actor.ActorSystemImpl.<init>(ActorSystem.scala:550)
at akka.actor.ActorSystem$.apply(ActorSystem.scala:111)
at akka.actor.ActorSystem$.apply(ActorSystem.scala:104)
at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:104)
at org.apache.spark.SparkEnv$.create(SparkEnv.scala:152)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:202)
at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:53)
at examples.JavaSparkProfile.main(JavaSparkProfile.java:54)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
Does anyone have any idea what is going on?
Thanks a lot!

Answer: Check your Scala version; choose 2.10. Could you try adding the lines below?

String[] jars = new String[]{ sparkJobJarPath };
sparkConf.setJars(jars);
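The version check above matters because `scala.collection.immutable.HashSet$.empty()` changed its binary signature between Scala 2.10 and 2.11, so a jar compiled against one Scala version throws `NoSuchMethodError` at runtime against the other. As a sketch of what pinning the build to Scala 2.10 might look like in a Maven project (the Spark version shown is an assumption; match whatever Spark build your EMR AMI actually ships):

```xml
<!-- Hypothetical pom.xml fragment: Spark artifacts encode the Scala
     binary version in their artifactId, so the _2.10 suffix must match
     the Scala version the cluster's Spark distribution was built with. -->
<dependencies>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <!-- Assumed version; align with the Spark install on the EMR AMI -->
    <version>1.1.0</version>
    <!-- "provided": the cluster supplies Spark at runtime, so it is not
         bundled into the job jar, avoiding a second copy of Scala -->
    <scope>provided</scope>
  </dependency>
</dependencies>
```

If the mismatch persists, inspecting the job jar for bundled `scala-library` classes (e.g. with `jar tf`) can confirm whether a conflicting Scala runtime is being shipped alongside the application.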