
Java job cannot connect to standalone local master

Tags: java, apache-spark

I'm new to Spark and have been trying to run my first Java Spark job against a standalone local master. The master is up and a worker has registered with it, but when I run the Spark program below I get an org.apache.spark.SparkException: Exception thrown in awaitResult. The program runs fine when the master is set to local.

My Spark code:

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public static void main(String[] args) {

    // Set up the configuration
    String appName = "My Very First Spark Job";
    //String sparkMaster = "local[2]";
    String sparkMaster = "spark://10.0.0.116:7077";

    JavaSparkContext spContext = null;

    SparkConf conf = new SparkConf()
            .setAppName(appName)
            .setMaster(sparkMaster);

    // Create the Spark context from the configuration
    spContext = new JavaSparkContext(conf);
}
Logs:

Spark Master:

    Jings-MBP-6:bin jingzhou$ ./spark-class org.apache.spark.deploy.master.Master
    Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
    17/11/28 20:55:11 INFO Master: Started daemon with process name: 12707@Jings-MBP-6.gateway
    17/11/28 20:55:11 INFO SignalUtils: Registered signal handler for TERM
    17/11/28 20:55:11 INFO SignalUtils: Registered signal handler for HUP
    17/11/28 20:55:11 INFO SignalUtils: Registered signal handler for INT
    17/11/28 20:55:11 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
    17/11/28 20:55:11 INFO SecurityManager: Changing view acls to: jingzhou
    17/11/28 20:55:11 INFO SecurityManager: Changing modify acls to: jingzhou
    17/11/28 20:55:11 INFO SecurityManager: Changing view acls groups to: 
    17/11/28 20:55:11 INFO SecurityManager: Changing modify acls groups to: 
    17/11/28 20:55:11 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(jingzhou); groups with view permissions: Set(); users  with modify permissions: Set(jingzhou); groups with modify permissions: Set()
    17/11/28 20:55:12 INFO Utils: Successfully started service 'sparkMaster' on port 7077.
    17/11/28 20:55:12 INFO Master: Starting Spark master at spark://10.0.0.116:7077
    17/11/28 20:55:12 INFO Master: Running Spark version 2.2.0
    17/11/28 20:55:12 INFO Utils: Successfully started service 'MasterUI' on port 8080.
    17/11/28 20:55:12 INFO MasterWebUI: Bound MasterWebUI to 0.0.0.0, and started at http://10.0.0.116:8080
    17/11/28 20:55:12 INFO Utils: Successfully started service on port 6066.
    17/11/28 20:55:12 INFO StandaloneRestServer: Started REST server for submitting applications on port 6066
    17/11/28 20:55:12 INFO Master: I have been elected leader! New state: ALIVE
    17/11/28 20:59:27 INFO Master: Registering worker 10.0.0.116:64461 with 8 cores, 15.0 GB RAM
    17/11/28 21:03:42 ERROR TransportRequestHandler: Error while invoking RpcHandler#receive() on RPC id 4722074090999773956
    java.io.EOFException
        at java.io.DataInputStream.readFully(DataInputStream.java:197)
        at java.io.DataInputStream.readUTF(DataInputStream.java:609)
        at java.io.DataInputStream.readUTF(DataInputStream.java:564)
        at org.apache.spark.rpc.netty.RequestMessage$.readRpcAddress(NettyRpcEnv.scala:582)
        at org.apache.spark.rpc.netty.RequestMessage$.apply(NettyRpcEnv.scala:592)
        at org.apache.spark.rpc.netty.NettyRpcHandler.internalReceive(NettyRpcEnv.scala:651)
        at org.apache.spark.rpc.netty.NettyRpcHandler.receive(NettyRpcEnv.scala:636)
        at org.apache.spark.network.server.TransportRequestHandler.processRpcRequest(TransportRequestHandler.java:157)
        at org.apache.spark.network.server.TransportRequestHandler.handle(TransportRequestHandler.java:105)
        at org.apache.spark.network.server.TransportChannelHandler.channelRead(TransportChannelHandler.java:118)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343)
        at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:336)
        at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:287)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343)
        at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:336)
        at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:102)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343)
        at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:336)
        at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:85)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343)
        at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:336)
        at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1294)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343)
        at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:911)
        at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131)
        at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:643)
        at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:566)
        at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:480)
        at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:442)
        at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:131)
        at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:144)
        at java.lang.Thread.run(Thread.java:748)
Spark Worker:

    Jings-MBP-6:bin jingzhou$ ./spark-class org.apache.spark.deploy.worker.Worker spark://10.0.0.116:7077
    Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
    17/11/28 20:59:26 INFO Worker: Started daemon with process name: 12794@Jings-MBP-6.gateway
    17/11/28 20:59:26 INFO SignalUtils: Registered signal handler for TERM
    17/11/28 20:59:26 INFO SignalUtils: Registered signal handler for HUP
    17/11/28 20:59:26 INFO SignalUtils: Registered signal handler for INT
    17/11/28 20:59:26 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
    17/11/28 20:59:26 INFO SecurityManager: Changing view acls to: jingzhou
    17/11/28 20:59:26 INFO SecurityManager: Changing modify acls to: jingzhou
    17/11/28 20:59:26 INFO SecurityManager: Changing view acls groups to: 
    17/11/28 20:59:26 INFO SecurityManager: Changing modify acls groups to: 
    17/11/28 20:59:26 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(jingzhou); groups with view permissions: Set(); users  with modify permissions: Set(jingzhou); groups with modify permissions: Set()
    17/11/28 20:59:27 INFO Utils: Successfully started service 'sparkWorker' on port 64461.
    17/11/28 20:59:27 INFO Worker: Starting Spark worker 10.0.0.116:64461 with 8 cores, 15.0 GB RAM
    17/11/28 20:59:27 INFO Worker: Running Spark version 2.2.0
    17/11/28 20:59:27 INFO Worker: Spark home: /Users/jingzhou/Desktop/hadoop/spark/spark-2.2.0-bin-hadoop2.7
    17/11/28 20:59:27 INFO Utils: Successfully started service 'WorkerUI' on port 8081.
    17/11/28 20:59:27 INFO WorkerWebUI: Bound WorkerWebUI to 0.0.0.0, and started at http://10.0.0.116:8081
    17/11/28 20:59:27 INFO Worker: Connecting to master 10.0.0.116:7077...
    17/11/28 20:59:27 INFO TransportClientFactory: Successfully created connection to /10.0.0.116:7077 after 26 ms (0 ms spent in bootstraps)
    17/11/28 20:59:27 INFO Worker: Successfully registered with master spark://10.0.0.116:7077

I ran into the same problem when trying to connect to a local cluster from the Eclipse development environment.

In my case it was a version mismatch between the master and the Spark Maven dependency used in my development environment; a mismatch like this typically shows up as the java.io.EOFException the master logs while decoding the client's RPC.

The cluster was: 2.2.1. My development version was: 2.1.0.


After correcting the version, the error was resolved.
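
For reference, a minimal sketch of the kind of pom.xml entry involved, assuming a cluster on Spark 2.2.1 built against Scala 2.11 (both versions here are illustrative; use whatever your master reports in its "Running Spark version" log line):

    <!-- pom.xml: the Spark client library must match the cluster version. -->
    <!-- Versions shown are examples; align them with your master. -->
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.11</artifactId>
        <version>2.2.1</version>
    </dependency>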

Which Spark version are you using?

Could this be the answer to your question?

Maybe the master and the client are using different Spark binaries. Have you checked that both sides are the same?

Master: Spark 2.2.0 built for Hadoop 2.7.3. Client: 2.0.0.
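
A quick way to run that check from the client side is to read the version off a throwaway local context and compare it with the master's "Running Spark version" log line; a minimal sketch (the class name is hypothetical):

    import org.apache.spark.api.java.JavaSparkContext;

    public class VersionCheck {
        public static void main(String[] args) {
            // Start a throwaway local context just to read the client-side
            // Spark version; no connection to the standalone master is needed.
            JavaSparkContext local = new JavaSparkContext("local[1]", "version-check");
            System.out.println("Client Spark version: " + local.version());
            local.stop();
        }
    }

Here the mismatch is visible immediately: the client prints 2.0.0 while the master log shows 2.2.0.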