Java: Submitting a job to a remote Apache Spark server

Apache Spark (v1.6.1) is started as a service on an Ubuntu machine (10.10.0.102) using ./start-all.sh.

Now I need to submit jobs to this server remotely using the Java API.

Below is the Java client code, run from a different machine (10.10.0.95).
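
The code listing itself did not survive the page scrape. A minimal sketch of what such a Spark 1.6 Java client might look like, assuming a standalone master listening on the default port 7077 on 10.10.0.102 (the app name and the trivial count job are placeholders, not the original code):

import java.util.Arrays;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;

public class RemoteSparkClient {
    public static void main(String[] args) {
        // Point the driver at the standalone master on the Ubuntu machine.
        SparkConf conf = new SparkConf()
                .setAppName("RemoteSubmitExample")
                .setMaster("spark://10.10.0.102:7077");

        JavaSparkContext sc = new JavaSparkContext(conf);

        // A trivial job so the remote executors have something to run.
        JavaRDD<Integer> numbers = sc.parallelize(Arrays.asList(1, 2, 3, 4, 5));
        System.out.println("Count: " + numbers.count());

        sc.stop();
    }
}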

On the server side, in the Spark UI under Running Applications > Executor Summary > stderr logs, the following error is reported:

Exception in thread "main" java.io.IOException: Failed to connect to /192.168.56.1:53112
    at org.apache.spark.network.client.TransportClientFactory.createClient(TransportClientFactory.java:216)
    at org.apache.spark.network.client.TransportClientFactory.createClient(TransportClientFactory.java:167)
    at org.apache.spark.rpc.netty.NettyRpcEnv.createClient(NettyRpcEnv.scala:200)
    at org.apache.spark.rpc.netty.Outbox$$anon$1.call(Outbox.scala:187)
    at org.apache.spark.rpc.netty.Outbox$$anon$1.call(Outbox.scala:183)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
Caused by: java.net.ConnectException: Connection refused: /192.168.56.1:53112
    at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
    at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
    at io.netty.channel.socket.nio.NioSocketChannel.doFinishConnect(NioSocketChannel.java:224)
    at io.netty.channel.nio.AbstractNioChannel$AbstractNioUnsafe.finishConnect(AbstractNioChannel.java:289)
    at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:528)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:382)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354)
    at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
    ... 1 more
However, no machine here has the IP address 192.168.56.1 configured. So, is there some configuration I am missing?
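
One setting worth checking is the address the driver advertises to the cluster: the executors connect back to the driver at that address, and Spark picks it automatically from the local interfaces. As a hedged sketch, the SparkConf from the example above could pin it explicitly, assuming 10.10.0.95 is the address the Ubuntu server can actually reach:

SparkConf conf = new SparkConf()
        .setAppName("RemoteSubmitExample")
        .setMaster("spark://10.10.0.102:7077")
        // Advertise an address the executors can reach, instead of
        // whatever interface Spark auto-detects on the client.
        .set("spark.driver.host", "10.10.0.95");

Setting the SPARK_LOCAL_IP environment variable on the client machine has a similar effect.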

Actually, my client machine (10.10.0.95) is a Windows machine. When I try submitting the Spark job from another Ubuntu machine (10.10.0.155), the same Java client code runs successfully.

When I debug in the Windows client environment, the following log appears as I submit the Spark job:

INFO Remoting: Starting remoting
INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriverActorSystem@192.168.56.1:61552]
INFO Utils: Successfully started service 'sparkDriverActorSystem' on port 61552.
INFO MemoryStore: MemoryStore started with capacity 2.4 GB
INFO SparkEnv: Registering OutputCommitCoordinator
INFO Utils: Successfully started service 'SparkUI' on port 4044.
INFO SparkUI: Started SparkUI at http://192.168.56.1:4044
According to line 2 of the log, it registers the client as 192.168.56.1.

On the other hand, on the Ubuntu client:

INFO Remoting: Starting remoting
INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriverActorSystem@10.10.0.155:42786]
INFO Utils: Successfully started service 'sparkDriverActorSystem' on port 42786.
INFO MemoryStore: MemoryStore started with capacity 511.1 MB
INFO SparkEnv: Registering OutputCommitCoordinator
INFO Utils: Successfully started service 'SparkUI' on port 4040.
INFO SparkUI: Started SparkUI at http://10.10.0.155:4040
According to line 2 of the log, it registers the client with the IP address 10.10.0.155, which matches the machine's actual IP address.

If anyone can spot what is wrong on the Windows client side, please advise.

[Update]

I am running the whole environment inside VirtualBox. The Windows machine is my host and Ubuntu is a guest, with Spark installed on the Ubuntu machine. In this setup VirtualBox creates an Ethernet adapter, "VirtualBox Host-Only Network", with the IPv4 address 192.168.56.1, and it is this IP that gets registered as the client IP instead of the actual address 10.10.0.95.
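
To see why that happens, the interfaces visible to the JVM on the Windows host can be listed with plain JDK calls; the host-only adapter's 192.168.56.1 shows up alongside the real NIC's 10.10.0.95, and the address Java resolves for the local host name is often the one the driver ends up advertising. A small diagnostic sketch (not part of the original question):

import java.net.InetAddress;
import java.net.NetworkInterface;
import java.util.Collections;

public class ListAddresses {
    public static void main(String[] args) throws Exception {
        // The address Java resolves for this host's own name.
        System.out.println("getLocalHost(): "
                + InetAddress.getLocalHost().getHostAddress());

        // Every configured interface, including the VirtualBox host-only adapter.
        for (NetworkInterface nif : Collections.list(NetworkInterface.getNetworkInterfaces())) {
            for (InetAddress addr : Collections.list(nif.getInetAddresses())) {
                System.out.println(nif.getDisplayName() + " -> " + addr.getHostAddress());
            }
        }
    }
}

If the host-only address is the one being picked up, pointing the driver at the reachable address (spark.driver.host or SPARK_LOCAL_IP, as sketched above) should make the executors connect back to 10.10.0.95 instead of 192.168.56.1.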
