Scala Akka actor problem when initializing a Spark context


When I create a Spark context with Scala, the following trace appears:

    [sparkDriver-akka.actor.default-dispatcher-3] ERROR akka.actor.ActorSystemImpl - Uncaught fatal error from thread [sparkDriver-akka.remote.default-remote-dispatcher-5] shutting down ActorSystem [sparkDriver]

    java.lang.NoSuchMethodError: org.jboss.netty.channel.socket.nio.NioWorkerPool.<init>(Ljava/util/concurrent/Executor;I)V
    at akka.remote.transport.netty.NettyTransport.<init>(NettyTransport.scala:283)
    at akka.remote.transport.netty.NettyTransport.<init>(NettyTransport.scala:240)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
    at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$2.apply(DynamicAccess.scala:78)
    at scala.util.Try$.apply(Try.scala:161)
    at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:73)
    at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
    at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
    at scala.util.Success.flatMap(Try.scala:200)
    at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:84)
    at akka.remote.EndpointManager$$anonfun$9.apply(Remoting.scala:692)
    at akka.remote.EndpointManager$$anonfun$9.apply(Remoting.scala:684)
    at scala.collection.TraversableLike$WithFilter$$anonfun$map$2.apply(TraversableLike.scala:722)
    at scala.collection.Iterator$class.foreach(Iterator.scala:727)
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
    at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
    at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
    at scala.collection.TraversableLike$WithFilter.map(TraversableLike.scala:721)
    at akka.remote.EndpointManager.akka$remote$EndpointManager$$listens(Remoting.scala:684)
    at akka.remote.EndpointManager$$anonfun$receive$2.applyOrElse(Remoting.scala:492)
    at akka.actor.Actor$class.aroundReceive(Actor.scala:465)
    at akka.remote.EndpointManager.aroundReceive(Remoting.scala:395)
    at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516)
    at akka.actor.ActorCell.invoke(ActorCell.scala:487)
    at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:238)
    at akka.dispatch.Mailbox.run(Mailbox.scala:220)
    at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:393)
    at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
    at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
    at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
    at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Sorry I can't give more detail, as I'm completely new to this topic.

Does anyone know what's going on?

Update


I'm just initializing a Spark context with Cassandra support:

    val sparkConf = new SparkConf()
      .setAppName("QueryExample")
      .setMaster("local[*]")
      .set("spark.cassandra.connection.host", seeds)
      .set("spark.cassandra.connection.rpc.port", "9171")
      .set("spark.cassandra.connection.native.port", "9142")
    sc = new SparkContext(sparkConf)

It was a dependency problem: the Avro Tools jar was being pulled into the project, and that caused the error. Thanks, everyone.
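A quick way to track down this kind of version conflict is to scan the local dependency cache for every jar that ships the class named in the `NoSuchMethodError`. This is only a sketch, not part of the original answer: it assumes `find` and `unzip` are available, and that the cache lives under `~/.ivy2` for sbt (or `~/.m2` for Maven).

```shell
# Scan a directory of jars for the class named in the NoSuchMethodError.
# If more than one jar shows up, two versions of the same library are on
# the classpath and the wrong one is probably being loaded first.
find_jars_with_class() {
  local dir="$1" class_path="$2"
  find "$dir" -name '*.jar' 2>/dev/null | while read -r jar; do
    # "unzip -l" lists the archive contents without extracting anything
    if unzip -l "$jar" 2>/dev/null | grep -q "$class_path"; then
      echo "$jar"
    fi
  done
}

# e.g. look for the Netty class from the stack trace in sbt's ivy cache:
find_jars_with_class "$HOME/.ivy2" 'org/jboss/netty/channel/socket/nio/NioWorkerPool.class'
```

Every jar it prints for the same class is a candidate for an exclusion in the build file.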

I had a similar problem, but with Maven instead of sbt. Since I have avro-ipc as one of my dependencies, I needed to exclude Netty, so it looks like this:

<dependency>
    <groupId>org.apache.avro</groupId>
    <artifactId>avro-ipc</artifactId>
    <version>${avro.version}</version>
    <exclusions>
        <exclusion>
            <groupId>io.netty</groupId>
            <artifactId>netty</artifactId>
        </exclusion>
    </exclusions>
</dependency>
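For an sbt build (which the question appears to use), the equivalent exclusion might look like the following. This is only a sketch, and `avroVersion` is a placeholder, not a value from the original post:

```scala
// build.sbt -- hypothetical sbt equivalent of the Maven exclusion above.
// "1.7.7" stands in for whatever Avro version the project actually pins.
val avroVersion = "1.7.7"

libraryDependencies += ("org.apache.avro" % "avro-ipc" % avroVersion)
  .exclude("io.netty", "netty")
```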


Comments: "You could at least tell us what you're trying to do! Share some code that might be causing the error." — "I'm initializing a Spark context with Cassandra support; see the update above." — "You should update your question with your comment!" — "How did you solve it? I'm not sure what you mean by the avro-tools jar being imported into the project... thx"