Error running spark-submit on a Python program

Tags: python, apache-spark

I am currently learning Apache Spark and trying to run some of the sample Python programs. At the moment I get the following exception:

spark-submit friends-by-age.py 
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.apache.spark.unsafe.Platform (file:/usr/local/Cellar/apache-spark/3.0.0/libexec/jars/spark-unsafe_2.12-3.0.0.jar) to constructor java.nio.DirectByteBuffer(long,int)
WARNING: Please consider reporting this to the maintainers of org.apache.spark.unsafe.Platform
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
/usr/local/Cellar/apache-spark/3.0.0/libexec/python/lib/pyspark.zip/pyspark/context.py:220: DeprecationWarning: Support for Python 2 and Python 3 prior to version 3.6 is deprecated as of Spark 3.0. See also the plan for dropping Python 2 support at https://spark.apache.org/news/plan-for-dropping-python-2-support.html.
  DeprecationWarning)
/usr/local/Cellar/apache-spark/3.0.0/libexec/python/lib/pyspark.zip/pyspark/shuffle.py:60: UserWarning: Please install psutil to have better support with spilling
20/08/17 21:52:43 ERROR Executor: Exception in task 0.0 in stage 0.0 (TID 0)
org.apache.spark.api.python.PythonException: Traceback (most recent call last):
  File "/usr/local/Cellar/apache-spark/3.0.0/libexec/python/lib/pyspark.zip/pyspark/worker.py", line 605, in main
    process()
  File "/usr/local/Cellar/apache-spark/3.0.0/libexec/python/lib/pyspark.zip/pyspark/worker.py", line 595, in process
    out_iter = func(split_index, iterator)
  File "/usr/local/Cellar/apache-spark/3.0.0/libexec/python/lib/pyspark.zip/pyspark/rdd.py", line 2596, in pipeline_func
  File "/usr/local/Cellar/apache-spark/3.0.0/libexec/python/lib/pyspark.zip/pyspark/rdd.py", line 2596, in pipeline_func
  File "/usr/local/Cellar/apache-spark/3.0.0/libexec/python/lib/pyspark.zip/pyspark/rdd.py", line 425, in func
  File "/usr/local/Cellar/apache-spark/3.0.0/libexec/python/lib/pyspark.zip/pyspark/rdd.py", line 1946, in combineLocally
  File "/usr/local/Cellar/apache-spark/3.0.0/libexec/python/lib/pyspark.zip/pyspark/shuffle.py", line 252, in mergeValues
    if get_used_memory() >= limit:
  File "/usr/local/Cellar/apache-spark/3.0.0/libexec/python/lib/pyspark.zip/pyspark/shuffle.py", line 64, in get_used_memory
    rss = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
AttributeError: 'module' object has no attribute 'getrusage'

    at org.apache.spark.api.python.BasePythonRunner$ReaderIterator.handlePythonException(PythonRunner.scala:503)
    at org.apache.spark.api.python.PythonRunner$$anon$3.read(PythonRunner.scala:638)
    at org.apache.spark.api.python.PythonRunner$$anon$3.read(PythonRunner.scala:621)
    at org.apache.spark.api.python.BasePythonRunner$ReaderIterator.hasNext(PythonRunner.scala:456)
    at org.apache.spark.InterruptibleIterator.hasNext(InterruptibleIterator.scala:37)
    at scala.collection.Iterator$GroupedIterator.fill(Iterator.scala:1209)
    at scala.collection.Iterator$GroupedIterator.hasNext(Iterator.scala:1215)
    at scala.collection.Iterator$$anon$10.hasNext(Iterator.scala:458)
    at org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter.write(BypassMergeSortShuffleWriter.java:132)
    at org.apache.spark.shuffle.ShuffleWriteProcessor.write(ShuffleWriteProcessor.scala:59)
    at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:99)
    at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:52)
    at org.apache.spark.scheduler.Task.run(Task.scala:127)
    at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:444)
    at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1377)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:447)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
    at java.base/java.lang.Thread.run(Thread.java:834)
20/08/17 21:52:43 ERROR TaskSetManager: Task 0 in stage 0.0 failed 1 times; aborting job
Traceback (most recent call last):
  File "/Users/srikanthroopa/Documents/SourceCode/ApacheSpark/friends-by-age.py", line 16, in <module>
    results = averagesByAge.collect()
  File "/usr/local/Cellar/apache-spark/3.0.0/libexec/python/lib/pyspark.zip/pyspark/rdd.py", line 889, in collect
  File "/usr/local/Cellar/apache-spark/3.0.0/libexec/python/lib/py4j-0.10.9-src.zip/py4j/java_gateway.py", line 1305, in __call__
  File "/usr/local/Cellar/apache-spark/3.0.0/libexec/python/lib/py4j-0.10.9-src.zip/py4j/protocol.py", line 328, in get_return_value
py4j.protocol.Py4JJavaError: An error occurred while calling z:org.apache.spark.api.python.PythonRDD.collectAndServe.
: org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 1 times, most recent failure: Lost task 0.0 in stage 0.0 (TID 0, srikanths-mbp.fritz.box, executor driver): org.apache.spark.api.python.PythonException: Traceback (most recent call last):
  File "/usr/local/Cellar/apache-spark/3.0.0/libexec/python/lib/pyspark.zip/pyspark/worker.py", line 605, in main
    process()
  File "/usr/local/Cellar/apache-spark/3.0.0/libexec/python/lib/pyspark.zip/pyspark/worker.py", line 595, in process
    out_iter = func(split_index, iterator)
  File "/usr/local/Cellar/apache-spark/3.0.0/libexec/python/lib/pyspark.zip/pyspark/rdd.py", line 2596, in pipeline_func
  File "/usr/local/Cellar/apache-spark/3.0.0/libexec/python/lib/pyspark.zip/pyspark/rdd.py", line 2596, in pipeline_func
  File "/usr/local/Cellar/apache-spark/3.0.0/libexec/python/lib/pyspark.zip/pyspark/rdd.py", line 425, in func
  File "/usr/local/Cellar/apache-spark/3.0.0/libexec/python/lib/pyspark.zip/pyspark/rdd.py", line 1946, in combineLocally
  File "/usr/local/Cellar/apache-spark/3.0.0/libexec/python/lib/pyspark.zip/pyspark/shuffle.py", line 252, in mergeValues
    if get_used_memory() >= limit:
  File "/usr/local/Cellar/apache-spark/3.0.0/libexec/python/lib/pyspark.zip/pyspark/shuffle.py", line 64, in get_used_memory
    rss = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
AttributeError: 'module' object has no attribute 'getrusage'

    at org.apache.spark.api.python.BasePythonRunner$ReaderIterator.handlePythonException(PythonRunner.scala:503)
    at org.apache.spark.api.python.PythonRunner$$anon$3.read(PythonRunner.scala:638)
    at org.apache.spark.api.python.PythonRunner$$anon$3.read(PythonRunner.scala:621)
    at org.apache.spark.api.python.BasePythonRunner$ReaderIterator.hasNext(PythonRunner.scala:456)
    at org.apache.spark.InterruptibleIterator.hasNext(InterruptibleIterator.scala:37)
    at scala.collection.Iterator$GroupedIterator.fill(Iterator.scala:1209)
    at scala.collection.Iterator$GroupedIterator.hasNext(Iterator.scala:1215)
    at scala.collection.Iterator$$anon$10.hasNext(Iterator.scala:458)
    at org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter.write(BypassMergeSortShuffleWriter.java:132)
    at org.apache.spark.shuffle.ShuffleWriteProcessor.write(ShuffleWriteProcessor.scala:59)
    at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:99)
    at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:52)
    at org.apache.spark.scheduler.Task.run(Task.scala:127)
    at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:444)
    at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1377)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:447)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
    at java.base/java.lang.Thread.run(Thread.java:834)

Driver stacktrace:
    at org.apache.spark.scheduler.DAGScheduler.failJobAndIndependentStages(DAGScheduler.scala:2023)
    at org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2(DAGScheduler.scala:1972)
    at org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2$adapted(DAGScheduler.scala:1971)
    at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
    at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
    at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
    at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1971)
    at org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1(DAGScheduler.scala:950)
    at org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1$adapted(DAGScheduler.scala:950)
    at scala.Option.foreach(Option.scala:407)
    at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:950)
    at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:2203)
    at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2152)
    at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2141)
    at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:49)
    at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:752)
    at org.apache.spark.SparkContext.runJob(SparkContext.scala:2093)
    at org.apache.spark.SparkContext.runJob(SparkContext.scala:2114)
    at org.apache.spark.SparkContext.runJob(SparkContext.scala:2133)
    at org.apache.spark.SparkContext.runJob(SparkContext.scala:2158)
    at org.apache.spark.rdd.RDD.$anonfun$collect$1(RDD.scala:1004)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
    at org.apache.spark.rdd.RDD.withScope(RDD.scala:388)
    at org.apache.spark.rdd.RDD.collect(RDD.scala:1003)
    at org.apache.spark.api.python.PythonRDD$.collectAndServe(PythonRDD.scala:168)
    at org.apache.spark.api.python.PythonRDD.collectAndServe(PythonRDD.scala)
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.base/java.lang.reflect.Method.invoke(Method.java:566)
    at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
    at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
    at py4j.Gateway.invoke(Gateway.java:282)
    at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
    at py4j.commands.CallCommand.execute(CallCommand.java:79)
    at py4j.GatewayConnection.run(GatewayConnection.java:238)
    at java.base/java.lang.Thread.run(Thread.java:834)
Caused by: org.apache.spark.api.python.PythonException: Traceback (most recent call last):
  File "/usr/local/Cellar/apache-spark/3.0.0/libexec/python/lib/pyspark.zip/pyspark/worker.py", line 605, in main
    process()
  File "/usr/local/Cellar/apache-spark/3.0.0/libexec/python/lib/pyspark.zip/pyspark/worker.py", line 595, in process
    out_iter = func(split_index, iterator)
  File "/usr/local/Cellar/apache-spark/3.0.0/libexec/python/lib/pyspark.zip/pyspark/rdd.py", line 2596, in pipeline_func
  File "/usr/local/Cellar/apache-spark/3.0.0/libexec/python/lib/pyspark.zip/pyspark/rdd.py", line 2596, in pipeline_func
  File "/usr/local/Cellar/apache-spark/3.0.0/libexec/python/lib/pyspark.zip/pyspark/rdd.py", line 425, in func
  File "/usr/local/Cellar/apache-spark/3.0.0/libexec/python/lib/pyspark.zip/pyspark/rdd.py", line 1946, in combineLocally
  File "/usr/local/Cellar/apache-spark/3.0.0/libexec/python/lib/pyspark.zip/pyspark/shuffle.py", line 252, in mergeValues
    if get_used_memory() >= limit:
  File "/usr/local/Cellar/apache-spark/3.0.0/libexec/python/lib/pyspark.zip/pyspark/shuffle.py", line 64, in get_used_memory
    rss = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
AttributeError: 'module' object has no attribute 'getrusage'

    at org.apache.spark.api.python.BasePythonRunner$ReaderIterator.handlePythonException(PythonRunner.scala:503)
    at org.apache.spark.api.python.PythonRunner$$anon$3.read(PythonRunner.scala:638)
    at org.apache.spark.api.python.PythonRunner$$anon$3.read(PythonRunner.scala:621)
    at org.apache.spark.api.python.BasePythonRunner$ReaderIterator.hasNext(PythonRunner.scala:456)
    at org.apache.spark.InterruptibleIterator.hasNext(InterruptibleIterator.scala:37)
    at scala.collection.Iterator$GroupedIterator.fill(Iterator.scala:1209)
    at scala.collection.Iterator$GroupedIterator.hasNext(Iterator.scala:1215)
    at scala.collection.Iterator$$anon$10.hasNext(Iterator.scala:458)
    at org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter.write(BypassMergeSortShuffleWriter.java:132)
    at org.apache.spark.shuffle.ShuffleWriteProcessor.write(ShuffleWriteProcessor.scala:59)
    at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:99)
    at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:52)
    at org.apache.spark.scheduler.Task.run(Task.scala:127)
    at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:444)
    at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1377)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:447)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)