Apache Spark: "Too large frame" error when running spark-shell against a standalone cluster

I'm having trouble launching spark-shell against a Spark standalone cluster running locally. Any ideas? I'm running this on Spark 3.1.0-SNAPSHOT.

Launching the shell or a regular application works fine in local mode, but both fail with the following command:

$ ./bin/spark-shell --master spark://localhost:8080
...
20/04/05 00:34:47 WARN StandaloneAppClient$ClientEndpoint: Could not connect to localhost:8080: java.lang.IllegalArgumentException: Too large frame: 5211883372140375593
20/04/05 00:34:47 WARN StandaloneAppClient$ClientEndpoint: Failed to connect to master localhost:8080
org.apache.spark.SparkException: Exception thrown in awaitResult: 
        at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:303)
        at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75)
        at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:101)
        at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:109)
        at org.apache.spark.deploy.client.StandaloneAppClient$ClientEndpoint$$anon$1.run(StandaloneAppClient.scala:106)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.IllegalArgumentException: Too large frame: 5211883372140375593
        at org.sparkproject.guava.base.Preconditions.checkArgument(Preconditions.java:119)
        at org.apache.spark.network.util.TransportFrameDecoder.decodeNext(TransportFrameDecoder.java:148)
        at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:98)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
        at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
        at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
        at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919)
        at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:163)
        at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:714)
        at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:650)
        at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:576)
        at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:493)
        at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:989)
        at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
        at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
        ... 1 more

The problem is that the wrong port was being used: the spark:// master URL must point at the master's RPC port, not at the HTTP port of its web UI, so the RPC client here was reading an HTTP response where it expected a length-prefixed RPC frame.
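
As a side note, the huge frame size in the error is not random. A minimal sketch (my own illustration, assuming Netty's big-endian long decoding and that Spark's TransportFrameDecoder subtracts its own 8-byte length field, which the arithmetic below bears out) reproduces the exact number from the ASCII bytes of "HTTP/1.1":

import java.nio.ByteBuffer
import java.nio.charset.StandardCharsets

object TooLargeFrame {
  def main(args: Array[String]): Unit = {
    // First eight bytes of the web UI's reply: the ASCII text "HTTP/1.1".
    val firstBytes = "HTTP/1.1".getBytes(StandardCharsets.US_ASCII)
    // The frame decoder reads them as a big-endian 64-bit frame length...
    val asLong = ByteBuffer.wrap(firstBytes).getLong
    // ...then subtracts the 8-byte length field itself.
    println(asLong - 8) // prints 5211883372140375593, as in the error above
  }
}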

The correct port appears in this line of the standalone master's log:

20/04/05 18:20:25 INFO Master: Starting Spark master at spark://localhost:7077
Port 8080 is used for the master web UI. The correct command is:

$ ./bin/spark-shell --master spark://localhost:7077
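
The same fix applies when connecting from application code rather than the shell. A minimal sketch (the app name and the sanity-check job are hypothetical) pointing the master URL at the RPC port:

import org.apache.spark.sql.SparkSession

object StandaloneSmokeTest {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("standalone-smoke-test") // hypothetical app name
      .master("spark://localhost:7077") // RPC port, matching the master log
      .getOrCreate()
    // Trivial job to confirm the cluster connection actually works.
    println(spark.sparkContext.parallelize(1 to 10).sum()) // 55.0
    spark.stop()
  }
}

Both defaults can be changed in conf/spark-env.sh via SPARK_MASTER_PORT (RPC, default 7077) and SPARK_MASTER_WEBUI_PORT (web UI, default 8080).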