Apache Spark Py4JJavaError: org.apache.spark.SparkException: Exception thrown in awaitResult

Tags: apache-spark, hadoop, pyspark, jupyter-notebook, jupyter

I created a notebook in Jupyter by launching PySpark as follows:

SPARK_MAJOR_VERSION=2 PYSPARK_DRIVER_PYTHON=jupyter \
PYSPARK_DRIVER_PYTHON_OPTS='notebook --ip=hadoop-edge-001 --no-browser --port=8888' \
pyspark \
    --master yarn-client \
    --driver-memory 25g \
    --executor-memory 50g \
    --num-executors 100 \
    --conf "spark.executor.cores=10" \
    --conf "spark.ui.port=8072" \
    --conf "spark.driver.maxResultSize=0" \
    --conf "spark.serializer=org.apache.spark.serializer.KryoSerializer" \
    --conf "spark.kryoserializer.buffer.max=1024m" \
    --conf "spark.shuffle.service.enabled=true" \
    --conf "spark.dynamicAllocation.enabled=true" \
    --conf "spark.dynamicAllocation.minExecutors=2" \
    --conf "spark.dynamicAllocation.initialExecutors=100"
Then I executed the following in the notebook:

import pyspark
sc = pyspark.SparkContext(appName="aerobus")
which fails with the following error:

Py4JJavaError: An error occurred while calling None.org.apache.spark.api.java.JavaSparkContext.
: org.apache.spark.SparkException: Exception thrown in awaitResult
                at org.apache.spark.rpc.RpcTimeout$$anonfun$1.applyOrElse(RpcTimeout.scala:77)
                at org.apache.spark.rpc.RpcTimeout$$anonfun$1.applyOrElse(RpcTimeout.scala:75)
                at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:36)
                at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcTimeout.scala:59)
                at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcTimeout.scala:59)
                at scala.PartialFunction$OrElse.apply(PartialFunction.scala:167)
                at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:83)
                at org.apache.spark.scheduler.cluster.CoarseGrainedSchedulerBackend.requestTotalExecutors(CoarseGrainedSchedulerBackend.scala:512)
                at org.apache.spark.ExecutorAllocationManager.start(ExecutorAllocationManager.scala:236)
                at org.apache.spark.SparkContext$$anonfun$21.apply(SparkContext.scala:552)
                at org.apache.spark.SparkContext$$anonfun$21.apply(SparkContext.scala:552)
                at scala.Option.foreach(Option.scala:257)
                at org.apache.spark.SparkContext.<init>(SparkContext.scala:552)
                at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
                at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
                at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
                at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
                at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
                at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:247)
                at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
                at py4j.Gateway.invoke(Gateway.java:236)
                at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:80)
                at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:69)
                at py4j.GatewayConnection.run(GatewayConnection.java:214)
                at java.lang.Thread.run(Thread.java:745)
Caused by: java.io.IOException: Failed to send RPC 5088920142760340842 to /192.168.1.64:54215: java.nio.channels.ClosedChannelException
                at org.apache.spark.network.client.TransportClient$3.operationComplete(TransportClient.java:249)
                at org.apache.spark.network.client.TransportClient$3.operationComplete(TransportClient.java:233)
                at io.netty.util.concurrent.DefaultPromise.notifyListener0(DefaultPromise.java:514)
                at io.netty.util.concurrent.DefaultPromise.notifyListenersNow(DefaultPromise.java:488)
                at io.netty.util.concurrent.DefaultPromise.access$000(DefaultPromise.java:34)
                at io.netty.util.concurrent.DefaultPromise$1.run(DefaultPromise.java:438)
                at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:408)
                at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:455)
                at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:140)
                at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:144)
                ... 1 more
Caused by: java.nio.channels.ClosedChannelException
                at io.netty.channel.AbstractChannel$AbstractUnsafe.write(...)(Unknown Source)
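Note that the trace fails inside `ExecutorAllocationManager.start` / `requestTotalExecutors`, i.e. while dynamic allocation (enabled above via `spark.dynamicAllocation.enabled=true`) is being set up. On YARN, dynamic allocation requires the Spark external shuffle service to be registered as an auxiliary service on every NodeManager; if it is missing, executor-side RPCs can fail with `ClosedChannelException` like the one above. As a sketch (property names from the Spark-on-YARN documentation; verify against your Hadoop distribution), the NodeManager-side `yarn-site.xml` would contain:

```
<!-- yarn-site.xml on each NodeManager (sketch) -->
<property>
  <name>yarn.nodemanager.aux-services</name>
  <value>mapreduce_shuffle,spark_shuffle</value>
</property>
<property>
  <name>yarn.nodemanager.aux-services.spark_shuffle.class</name>
  <value>org.apache.spark.network.yarn.YarnShuffleService</value>
</property>
```

The `spark-<version>-yarn-shuffle.jar` must also be on the NodeManager classpath, and the NodeManagers restarted, for `spark.shuffle.service.enabled=true` to work.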
How can I fix this?