Amazon Web Services: Spark cannot bind on port 7077 with a public IP


I have installed Spark on AWS. When I try to run it on AWS the instance itself works, but Spark does not, and when I check the Spark master log I see the following:

Spark Command: /usr/lib/jvm/java-8-oracle/jre/bin/java -cp /home/ubuntu/spark/conf/:/home/ubuntu/spark/jars/* -Xmx1g org.apache.spark$
========================================
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
16/09/12 09:40:18 INFO Master: Started daemon with process name: 5451@server1
16/09/12 09:40:18 INFO SignalUtils: Registered signal handler for TERM
16/09/12 09:40:18 INFO SignalUtils: Registered signal handler for HUP
16/09/12 09:40:18 INFO SignalUtils: Registered signal handler for INT
16/09/12 09:40:18 WARN MasterArguments: SPARK_MASTER_IP is deprecated, please use SPARK_MASTER_HOST
16/09/12 09:40:19 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where a$
16/09/12 09:40:19 INFO SecurityManager: Changing view acls to: ubuntu
16/09/12 09:40:19 INFO SecurityManager: Changing modify acls to: ubuntu
16/09/12 09:40:19 INFO SecurityManager: Changing view acls groups to:
16/09/12 09:40:19 INFO SecurityManager: Changing modify acls groups to:
16/09/12 09:40:19 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set$
16/09/12 09:40:19 WARN Utils: Service 'sparkMaster' could not bind on port 7077. Attempting port 7078.
16/09/12 09:40:19 WARN Utils: Service 'sparkMaster' could not bind on port 7078. Attempting port 7079.
16/09/12 09:40:19 WARN Utils: Service 'sparkMaster' could not bind on port 7079. Attempting port 7080.
16/09/12 09:40:19 WARN Utils: Service 'sparkMaster' could not bind on port 7080. Attempting port 7081.
16/09/12 09:40:19 WARN Utils: Service 'sparkMaster' could not bind on port 7081. Attempting port 7082.
16/09/12 09:40:19 WARN Utils: Service 'sparkMaster' could not bind on port 7082. Attempting port 7083.
16/09/12 09:40:19 WARN Utils: Service 'sparkMaster' could not bind on port 7083. Attempting port 7084.
16/09/12 09:40:19 WARN Utils: Service 'sparkMaster' could not bind on port 7084. Attempting port 7085.
16/09/12 09:40:19 WARN Utils: Service 'sparkMaster' could not bind on port 7085. Attempting port 7086.
16/09/12 09:40:19 WARN Utils: Service 'sparkMaster' could not bind on port 7086. Attempting port 7087.
16/09/12 09:40:19 WARN Utils: Service 'sparkMaster' could not bind on port 7087. Attempting port 7088.
16/09/12 09:40:19 WARN Utils: Service 'sparkMaster' could not bind on port 7088. Attempting port 7089.
16/09/12 09:40:19 WARN Utils: Service 'sparkMaster' could not bind on port 7089. Attempting port 7090.
16/09/12 09:40:19 WARN Utils: Service 'sparkMaster' could not bind on port 7090. Attempting port 7091.
16/09/12 09:40:19 WARN Utils: Service 'sparkMaster' could not bind on port 7091. Attempting port 7092.
16/09/12 09:40:19 WARN Utils: Service 'sparkMaster' could not bind on port 7092. Attempting port 7093.
Exception in thread "main" java.net.BindException: Cannot assign requested address: Service 'sparkMaster' failed after 16 retries! Co$
        at sun.nio.ch.Net.bind0(Native Method)
        at sun.nio.ch.Net.bind(Net.java:433)
        at sun.nio.ch.Net.bind(Net.java:425)
        at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
        at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
        at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:125)
        at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:485)
        at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1089)
        at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:430)
        at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:415)
        at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:903)
        at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:198)
        at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:348)
        at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:357)
        at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:357)
        at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
        at java.lang.Thread.run(Thread.java:745)
My /etc/hosts is the following:

127.0.0.1  localhost

52.211.60.97    server1
52.210.246.199  client1
52.211.71.126   client2
52.211.20.213   client3

# The following lines are desirable for IPv6 capable hosts
::1     ip6-localhost ip6-loopback
fe00::0 ip6-localnet
ff00::0 ip6-mcastprefix
ff02::1 ip6-allnodes
ff02::2 ip6-allrouters
This is my spark-env.sh:

export SPARK_WORKER_MEMORY=512m
export SPARK_EXECUTOR_MEMORY=512m
export SPARK_WORKER_INSTANCES=1
export SPARK_WORKER_CORES=1
export SPARK_WORKER_DIR=/home/ubuntu/spark
export SPARK_LOCAL_IP=52.211.60.97
export SPARK_MASTER_IP=52.211.60.97
export SPARK_MASTER_WEBUI_PORT=4041

I have also tried the same setup using an AWS VPC with private instances and a VPN, and that works fine. So I think there is some problem with the public IP. Maybe Amazon blocks some ports on the public IP? Or what else could the problem be?
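
A quick way to test that hypothesis (commands below assume a standard Ubuntu EC2 instance): list the addresses that are actually configured inside the instance. On EC2 a public IP is normally provided through NAT and is not assigned to any local interface, which would explain why a service cannot bind to it directly.

# Show the addresses actually configured on the instance's interfaces
ip addr show
hostname -I

# The EC2 instance metadata service reports the private address the instance itself uses
curl http://169.254.169.254/latest/meta-data/local-ipv4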

I also faced a similar problem. It happens because the Spark master cannot open the port on the address specified in SPARK_MASTER_IP. First, find the machine's hostname with the hostname command. Then make sure that in /etc/hosts the machine's IP address is mapped to that hostname. After that, use that hostname for SPARK_MASTER_IP. For this issue, in cluster mode, you can also set export SPARK_LOCAL_IP=127.0.0.1.
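
A minimal sketch of spark-env.sh adjusted along these lines, assuming server1 is mapped in /etc/hosts to the instance's private address (the 172.31.0.10 value below is only a placeholder, not taken from the question):

# /etc/hosts should map the hostname to the private address, e.g.
#   172.31.0.10   server1        <- placeholder; use the instance's actual private IP

# spark-env.sh: bind to the hostname / private address instead of the public IP
export SPARK_LOCAL_IP=172.31.0.10        # placeholder private IP
export SPARK_MASTER_HOST=server1         # the log warns that SPARK_MASTER_IP is deprecated
export SPARK_MASTER_WEBUI_PORT=4041

Workers and drivers can then reach the master at spark://server1:7077, or through the public address that AWS maps to the private one, provided the security group allows inbound traffic on port 7077.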


PS: I know this reply comes late, but it may help others who land here.

Can you paste your security group?
I have a rule allowing all inbound and outbound traffic, from anywhere, on all ports @error2007. The problem is that I want to open the Spark master on the public IP.
Actually it is not a public IP but four nodes connected to the same private network. So won't it help in that case?
My nodes work fine in the private network. The problem is opening Spark on the public IP without using a proxy.
I ran into a similar problem and error stack, and after applying the changes given here it worked for me. If you feel it is not relevant, I can delete it.
Don't worry.