Java cannot connect to Spark master: InvalidClassException: org.apache.spark.rpc.RpcEndpointRef; local class incompatible

I installed Spark on a Linux machine from spark-1.6.2-bin-hadoop2.6.tgz and then started it with ./sbin/start-all.sh.

I tried to run the JavaWordCount.java example from Eclipse, but it always fails. Can anyone help? (A sketch of the kind of driver involved follows the stack trace below.)

  • The Spark master's banner reads: Welcome to version 1.6.2, using Scala version 2.10.5 (Java HotSpot(TM) Server VM, Java 1.8.0_101). The Spark version on the Eclipse side is:

  • The exception is as follows:

    16/07/25 12:01:20 INFO StandaloneAppClient$ClientEndpoint: Connecting to master spark://hostname:7077...
    16/07/25 12:01:20 WARN StandaloneAppClient$ClientEndpoint: Failed to connect to master hostname:7077
    org.apache.spark.SparkException: Exception thrown in awaitResult
        at org.apache.spark.rpc.RpcTimeout$$anonfun$1.applyOrElse(RpcTimeout.scala:77)
        at org.apache.spark.rpc.RpcTimeout$$anonfun$1.applyOrElse(RpcTimeout.scala:75)
        at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:33)
        at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcTimeout.scala:59)
        at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcTimeout.scala:59)
        at scala.PartialFunction$OrElse.apply(PartialFunction.scala:162)
        at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:83)
        at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:88)
        at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:96)
        at org.apache.spark.deploy.client.StandaloneAppClient$ClientEndpoint$$anonfun$tryRegisterAllMasters$1$$anon$1.run(StandaloneAppClient.scala:109)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)
    Caused by: java.lang.RuntimeException: java.io.InvalidClassException: org.apache.spark.rpc.RpcEndpointRef; local class incompatible: stream classdesc serialVersionUID = -1223633663228316618, local class serialVersionUID = 18257903091306170
        at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:616)
        at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1630)
        at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1521)
        at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1630)
        at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1521)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1781)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1353)
        at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2018)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1942)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1808)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1353)
        at java.io.ObjectInputStream.readObject(ObjectInputStream.java:373)
        at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:76)
        at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:109)
        at org.apache.spark.rpc.netty.NettyRpcEnv$$anonfun$deserialize$1$$anonfun$apply$1.apply(NettyRpcEnv.scala:258)
        at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57)
        at org.apache.spark.rpc.netty.NettyRpcEnv.deserialize(NettyRpcEnv.scala:310)
        at org.apache.spark.rpc.netty.NettyRpcEnv$$anonfun$deserialize$1.apply(NettyRpcEnv.scala:257)
        at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57)
        at org.apache.spark.rpc.netty.NettyRpcEnv.deserialize(NettyRpcEnv.scala:256)
        at org.apache.spark.rpc.netty.NettyRpcHandler.internalReceive(NettyRpcEnv.scala:588)
        at org.apache.spark.rpc.netty.NettyRpcHandler.receive(NettyRpcEnv.scala:570)
        at org.apache.spark.network.server.TransportRequestHandler.processRpcRequest(TransportRequestHandler.java:149)
        at org.apache.spark.network.server.TransportRequestHandler.handle(TransportRequestHandler.java:102)
        at org.apache.spark.network.server.TransportChannelHandler.channelRead0(TransportChannelHandler.java:104)
        at org.apache.spark.network.server.TransportChannelHandler.channelRead0(TransportChannelHandler.java:51)
        at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
        at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
        at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:266)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
        at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
        at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
    
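For context (the post does not include the driver code), a Spark 1.6 word-count driver launched from Eclipse looks roughly like the sketch below; hostname, the input path, and the jar path are placeholders, not values from the post. The important part is that the spark-core jar on the Eclipse classpath is the same 1.6.2 / Scala 2.10 build as the cluster; otherwise the serialized RPC classes disagree, exactly as in the trace above.

    import java.util.Arrays;

    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaPairRDD;
    import org.apache.spark.api.java.JavaRDD;
    import org.apache.spark.api.java.JavaSparkContext;

    import scala.Tuple2;

    // Minimal word-count driver for Spark 1.6.x (Java 8 lambdas).
    // "hostname" and "input.txt" are placeholders.
    public class JavaWordCount {
        public static void main(String[] args) {
            SparkConf conf = new SparkConf()
                .setAppName("JavaWordCount")
                // Must point at the standalone master started by ./sbin/start-all.sh.
                .setMaster("spark://hostname:7077");
            // When launching from an IDE, the application jar usually also has
            // to be shipped to the executors, e.g.:
            // conf.setJars(new String[] { "/path/to/wordcount.jar" });
            JavaSparkContext sc = new JavaSparkContext(conf);

            JavaRDD<String> lines = sc.textFile("input.txt");
            JavaPairRDD<String, Integer> counts = lines
                .flatMap(line -> Arrays.asList(line.split(" ")))  // 1.6 API: flatMap returns an Iterable
                .mapToPair(word -> new Tuple2<>(word, 1))
                .reduceByKey((a, b) -> a + b);

            for (Tuple2<String, Integer> t : counts.collect()) {
                System.out.println(t._1() + ": " + t._2());
            }
            sc.stop();
        }
    }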

This problem was caused by a version mismatch. I installed Hadoop, switched to spark-assembly-1.6.2-hadoop2.6.0.jar, and it now works.
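A quick sanity check for this fix, as a sketch: print the Spark version compiled into the client classpath and compare it with the master's banner (1.6.2 above). If the two differ, the serialVersionUID mismatch on RpcEndpointRef is exactly what Java serialization will report. local[*] keeps the check from having to reach the possibly mismatched master.

    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaSparkContext;

    // Prints the Spark version on the client classpath; it should read 1.6.2
    // to match the cluster banner quoted above.
    public class VersionCheck {
        public static void main(String[] args) {
            JavaSparkContext sc = new JavaSparkContext(
                new SparkConf().setAppName("VersionCheck").setMaster("local[*]"));
            System.out.println("Client-side Spark version: " + sc.version());
            sc.stop();
        }
    }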

"local class incompatible":

  • You are using two different versions of Spark, one on each end. Use the same version. This is arguably a Spark bug, but keeping the versions consistent fixes it.
  • @EJP Can you give me more details? Which version should I use? On the Spark machine, ./spark-shell shows: Welcome to version 1.6.2, using Scala version 2.10.5 (Java HotSpot(TM) Server VM, Java 1.8.0_101).
  • You should use the latest version you can.
  • I downloaded the latest version on both ends, but it still doesn't work.
  • Then you have not deployed it correctly.
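The practical reading of this thread, for a Maven-based Eclipse project: pin the client dependency to the cluster's exact build, i.e. org.apache.spark:spark-core_2.10:1.6.2 for this cluster, rather than taking whatever "latest" resolves to. A newer client jar against a 1.6.2 master reintroduces the incompatibility above.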