
Python: How do I resolve an illegal port number in PySpark?


Using: OS X 10.11.2, Spark 1.5.2, Python 2.7.10, iPython 4.0.1

To open Spark in an iPython notebook from the terminal, I use the following command:

IPYTHON=1 $SPARK_HOME/bin/pyspark
My goal is to parallelize a vector of integers and then apply the count, first, and take functions to that vector.

My workflow is as follows (after Spark has been opened in the IPython notebook):

Then, when I try to run the count function like this,

rdd.count()
I receive the following error. (Note: I realize the source of the error is the illegal port number, but I am unsure how changing the port number in use would help resolve it. I suspect the error comes from an interaction between iPython and Spark, but looking around I have not found anything.)

In [3]: rdd.count()
15/12/21 14:27:39 INFO SparkContext: Starting job: count at <ipython-input-3-a0443394e570>:1
15/12/21 14:27:39 INFO DAGScheduler: Got job 1 (count at <ipython-input-3-a0443394e570>:1) with 4 output partitions
15/12/21 14:27:39 INFO DAGScheduler: Final stage: ResultStage 1(count at <ipython-input-3-a0443394e570>:1)
15/12/21 14:27:39 INFO DAGScheduler: Parents of final stage: List()
15/12/21 14:27:39 INFO DAGScheduler: Missing parents: List()
15/12/21 14:27:39 INFO DAGScheduler: Submitting ResultStage 1 (PythonRDD[1] at count at <ipython-input-3-a0443394e570>:1), which has no missing parents
15/12/21 14:27:39 INFO MemoryStore: ensureFreeSpace(4200) called with curMem=2001, maxMem=555755765
15/12/21 14:27:39 INFO MemoryStore: Block broadcast_1 stored as values in memory (estimated size 4.1 KB, free 530.0 MB)
15/12/21 14:27:39 INFO MemoryStore: ensureFreeSpace(2713) called with curMem=6201, maxMem=555755765
15/12/21 14:27:39 INFO MemoryStore: Block broadcast_1_piece0 stored as bytes in memory (estimated size 2.6 KB, free 530.0 MB)
15/12/21 14:27:39 INFO BlockManagerInfo: Added broadcast_1_piece0 in memory on localhost:64049 (size: 2.6 KB, free: 530.0 MB)
15/12/21 14:27:39 INFO SparkContext: Created broadcast 1 from broadcast at DAGScheduler.scala:861
15/12/21 14:27:39 INFO DAGScheduler: Submitting 4 missing tasks from ResultStage 1 (PythonRDD[1] at count at <ipython-input-3-a0443394e570>:1)
15/12/21 14:27:39 INFO TaskSchedulerImpl: Adding task set 1.0 with 4 tasks
15/12/21 14:27:39 INFO TaskSetManager: Starting task 0.0 in stage 1.0 (TID 4, localhost, PROCESS_LOCAL, 2071 bytes)
15/12/21 14:27:39 INFO TaskSetManager: Starting task 1.0 in stage 1.0 (TID 5, localhost, PROCESS_LOCAL, 2090 bytes)
15/12/21 14:27:39 INFO TaskSetManager: Starting task 2.0 in stage 1.0 (TID 6, localhost, PROCESS_LOCAL, 2090 bytes)
15/12/21 14:27:39 INFO TaskSetManager: Starting task 3.0 in stage 1.0 (TID 7, localhost, PROCESS_LOCAL, 2090 bytes)
15/12/21 14:27:39 INFO Executor: Running task 0.0 in stage 1.0 (TID 4)
15/12/21 14:27:39 INFO Executor: Running task 1.0 in stage 1.0 (TID 5)
15/12/21 14:27:39 INFO Executor: Running task 2.0 in stage 1.0 (TID 6)
15/12/21 14:27:39 INFO Executor: Running task 3.0 in stage 1.0 (TID 7)
15/12/21 14:27:39 ERROR Executor: Exception in task 2.0 in stage 1.0 (TID 6)
java.lang.IllegalArgumentException: port out of range:1315905645
    at java.net.InetSocketAddress.checkPort(InetSocketAddress.java:143)
    at java.net.InetSocketAddress.<init>(InetSocketAddress.java:188)
    at java.net.Socket.<init>(Socket.java:244)
    at org.apache.spark.api.python.PythonWorkerFactory.createSocket$1(PythonWorkerFactory.scala:75)
    at org.apache.spark.api.python.PythonWorkerFactory.liftedTree1$1(PythonWorkerFactory.scala:90)
    at org.apache.spark.api.python.PythonWorkerFactory.createThroughDaemon(PythonWorkerFactory.scala:89)
    at org.apache.spark.api.python.PythonWorkerFactory.create(PythonWorkerFactory.scala:62)
    at org.apache.spark.SparkEnv.createPythonWorker(SparkEnv.scala:135)
    at org.apache.spark.api.python.PythonRunner.compute(PythonRDD.scala:101)
    at org.apache.spark.api.python.PythonRDD.compute(PythonRDD.scala:70)
    at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:300)
    at org.apache.spark.rdd.RDD.iterator(RDD.scala:264)
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
    at org.apache.spark.scheduler.Task.run(Task.scala:88)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
15/12/21 14:27:39 ERROR Executor: Exception in task 0.0 in stage 1.0 (TID 4)
java.lang.IllegalArgumentException: port out of range:1315905645
    at java.net.InetSocketAddress.checkPort(InetSocketAddress.java:143)
    at java.net.InetSocketAddress.<init>(InetSocketAddress.java:188)
    at java.net.Socket.<init>(Socket.java:244)
    at org.apache.spark.api.python.PythonWorkerFactory.createSocket$1(PythonWorkerFactory.scala:75)
    at org.apache.spark.api.python.PythonWorkerFactory.liftedTree1$1(PythonWorkerFactory.scala:90)
    at org.apache.spark.api.python.PythonWorkerFactory.createThroughDaemon(PythonWorkerFactory.scala:89)
    at org.apache.spark.api.python.PythonWorkerFactory.create(PythonWorkerFactory.scala:62)
    at org.apache.spark.SparkEnv.createPythonWorker(SparkEnv.scala:135)
    at org.apache.spark.api.python.PythonRunner.compute(PythonRDD.scala:101)
    at org.apache.spark.api.python.PythonRDD.compute(PythonRDD.scala:70)
    at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:300)
    at org.apache.spark.rdd.RDD.iterator(RDD.scala:264)
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
    at org.apache.spark.scheduler.Task.run(Task.scala:88)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
15/12/21 14:27:39 ERROR Executor: Exception in task 3.0 in stage 1.0 (TID 7)
java.lang.IllegalArgumentException: port out of range:1315905645
    at java.net.InetSocketAddress.checkPort(InetSocketAddress.java:143)
    at java.net.InetSocketAddress.<init>(InetSocketAddress.java:188)
    at java.net.Socket.<init>(Socket.java:244)
    at org.apache.spark.api.python.PythonWorkerFactory.createSocket$1(PythonWorkerFactory.scala:75)
    at org.apache.spark.api.python.PythonWorkerFactory.liftedTree1$1(PythonWorkerFactory.scala:90)
    at org.apache.spark.api.python.PythonWorkerFactory.createThroughDaemon(PythonWorkerFactory.scala:89)
    at org.apache.spark.api.python.PythonWorkerFactory.create(PythonWorkerFactory.scala:62)
    at org.apache.spark.SparkEnv.createPythonWorker(SparkEnv.scala:135)
    at org.apache.spark.api.python.PythonRunner.compute(PythonRDD.scala:101)
    at org.apache.spark.api.python.PythonRDD.compute(PythonRDD.scala:70)
    at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:300)
    at org.apache.spark.rdd.RDD.iterator(RDD.scala:264)
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
    at org.apache.spark.scheduler.Task.run(Task.scala:88)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
15/12/21 14:27:39 ERROR Executor: Exception in task 1.0 in stage 1.0 (TID 5)
java.lang.IllegalArgumentException: port out of range:1315905645
    at java.net.InetSocketAddress.checkPort(InetSocketAddress.java:143)
    at java.net.InetSocketAddress.<init>(InetSocketAddress.java:188)
    at java.net.Socket.<init>(Socket.java:244)
    at org.apache.spark.api.python.PythonWorkerFactory.createSocket$1(PythonWorkerFactory.scala:75)
    at org.apache.spark.api.python.PythonWorkerFactory.liftedTree1$1(PythonWorkerFactory.scala:90)
    at org.apache.spark.api.python.PythonWorkerFactory.createThroughDaemon(PythonWorkerFactory.scala:89)
    at org.apache.spark.api.python.PythonWorkerFactory.create(PythonWorkerFactory.scala:62)
    at org.apache.spark.SparkEnv.createPythonWorker(SparkEnv.scala:135)
    at org.apache.spark.api.python.PythonRunner.compute(PythonRDD.scala:101)
    at org.apache.spark.api.python.PythonRDD.compute(PythonRDD.scala:70)
    at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:300)
    at org.apache.spark.rdd.RDD.iterator(RDD.scala:264)
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
    at org.apache.spark.scheduler.Task.run(Task.scala:88)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
odule named dateutil.tz
15/12/21 14:27:39 WARN TaskSetManager: Lost task 0.0 in stage 1.0 (TID 4, localhost): java.lang.IllegalArgumentException: port out of range:1315905645
    at java.net.InetSocketAddress.checkPort(InetSocketAddress.java:143)
    at java.net.InetSocketAddress.<init>(InetSocketAddress.java:188)
    at java.net.Socket.<init>(Socket.java:244)
    at org.apache.spark.api.python.PythonWorkerFactory.createSocket$1(PythonWorkerFactory.scala:75)
    at org.apache.spark.api.python.PythonWorkerFactory.liftedTree1$1(PythonWorkerFactory.scala:90)
    at org.apache.spark.api.python.PythonWorkerFactory.createThroughDaemon(PythonWorkerFactory.scala:89)
    at org.apache.spark.api.python.PythonWorkerFactory.create(PythonWorkerFactory.scala:62)
    at org.apache.spark.SparkEnv.createPythonWorker(SparkEnv.scala:135)
    at org.apache.spark.api.python.PythonRunner.compute(PythonRDD.scala:101)
    at org.apache.spark.api.python.PythonRDD.compute(PythonRDD.scala:70)
    at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:300)
    at org.apache.spark.rdd.RDD.iterator(RDD.scala:264)
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
    at org.apache.spark.scheduler.Task.run(Task.scala:88)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)

15/12/21 14:27:39 ERROR TaskSetManager: Task 0 in stage 1.0 failed 1 times; aborting job
15/12/21 14:27:39 INFO TaskSchedulerImpl: Removed TaskSet 1.0, whose tasks have all completed, from pool 
15/12/21 14:27:39 INFO TaskSetManager: Lost task 2.0 in stage 1.0 (TID 6) on executor localhost: java.lang.IllegalArgumentException (port out of range:1315905645) [duplicate 1]
15/12/21 14:27:39 INFO TaskSchedulerImpl: Removed TaskSet 1.0, whose tasks have all completed, from pool 
15/12/21 14:27:39 INFO TaskSetManager: Lost task 3.0 in stage 1.0 (TID 7) on executor localhost: java.lang.IllegalArgumentException (port out of range:1315905645) [duplicate 2]
15/12/21 14:27:39 INFO TaskSchedulerImpl: Removed TaskSet 1.0, whose tasks have all completed, from pool 
15/12/21 14:27:39 INFO TaskSetManager: Lost task 1.0 in stage 1.0 (TID 5) on executor localhost: java.lang.IllegalArgumentException (port out of range:1315905645) [duplicate 3]
15/12/21 14:27:39 INFO TaskSchedulerImpl: Removed TaskSet 1.0, whose tasks have all completed, from pool 
15/12/21 14:27:39 INFO TaskSchedulerImpl: Cancelling stage 1
15/12/21 14:27:39 INFO DAGScheduler: ResultStage 1 (count at <ipython-input-3-a0443394e570>:1) failed in 0.537 s
15/12/21 14:27:39 INFO DAGScheduler: Job 1 failed: count at <ipython-input-3-a0443394e570>:1, took 0.564707 s
---------------------------------------------------------------------------
Py4JJavaError                             Traceback (most recent call last)
<ipython-input-3-a0443394e570> in <module>()
----> 1 rdd.count()

/Users/jason/spark-1.5.2-bin-hadoop2.6/python/pyspark/rdd.pyc in count(self)
   1004         3
   1005         """
-> 1006         return self.mapPartitions(lambda i: [sum(1 for _ in i)]).sum()
   1007 
   1008     def stats(self):

/Users/jason/spark-1.5.2-bin-hadoop2.6/python/pyspark/rdd.pyc in sum(self)
    995         6.0
    996         """
--> 997         return self.mapPartitions(lambda x: [sum(x)]).fold(0, operator.add)
    998 
    999     def count(self):

/Users/jason/spark-1.5.2-bin-hadoop2.6/python/pyspark/rdd.pyc in fold(self, zeroValue, op)
    869         # zeroValue provided to each partition is unique from the one provided
    870         # to the final reduce call
--> 871         vals = self.mapPartitions(func).collect()
    872         return reduce(op, vals, zeroValue)
    873 

/Users/jason/spark-1.5.2-bin-hadoop2.6/python/pyspark/rdd.pyc in collect(self)
    771         """
    772         with SCCallSiteSync(self.context) as css:
--> 773             port = self.ctx._jvm.PythonRDD.collectAndServe(self._jrdd.rdd())
    774         return list(_load_from_socket(port, self._jrdd_deserializer))
    775 

/Users/jason/spark-1.5.2-bin-hadoop2.6/python/lib/py4j-0.8.2.1-src.zip/py4j/java_gateway.py in __call__(self, *args)
    536         answer = self.gateway_client.send_command(command)
    537         return_value = get_return_value(answer, self.gateway_client,
--> 538                 self.target_id, self.name)
    539 
    540         for temp_arg in temp_args:

/Users/jason/spark-1.5.2-bin-hadoop2.6/python/pyspark/sql/utils.pyc in deco(*a, **kw)
     34     def deco(*a, **kw):
     35         try:
---> 36             return f(*a, **kw)
     37         except py4j.protocol.Py4JJavaError as e:
     38             s = e.java_exception.toString()

/Users/jason/spark-1.5.2-bin-hadoop2.6/python/lib/py4j-0.8.2.1-src.zip/py4j/protocol.py in get_return_value(answer, gateway_client, target_id, name)
    298                 raise Py4JJavaError(
    299                     'An error occurred while calling {0}{1}{2}.\n'.
--> 300                     format(target_id, '.', name), value)
    301             else:
    302                 raise Py4JError(

Py4JJavaError: An error occurred while calling z:org.apache.spark.api.python.PythonRDD.collectAndServe.
: org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 1.0 failed 1 times, most recent failure: Lost task 0.0 in stage 1.0 (TID 4, localhost): java.lang.IllegalArgumentException: port out of range:1315905645
    at java.net.InetSocketAddress.checkPort(InetSocketAddress.java:143)
    at java.net.InetSocketAddress.<init>(InetSocketAddress.java:188)
    at java.net.Socket.<init>(Socket.java:244)
    at org.apache.spark.api.python.PythonWorkerFactory.createSocket$1(PythonWorkerFactory.scala:75)
    at org.apache.spark.api.python.PythonWorkerFactory.liftedTree1$1(PythonWorkerFactory.scala:90)
    at org.apache.spark.api.python.PythonWorkerFactory.createThroughDaemon(PythonWorkerFactory.scala:89)
    at org.apache.spark.api.python.PythonWorkerFactory.create(PythonWorkerFactory.scala:62)
    at org.apache.spark.SparkEnv.createPythonWorker(SparkEnv.scala:135)
    at org.apache.spark.api.python.PythonRunner.compute(PythonRDD.scala:101)
    at org.apache.spark.api.python.PythonRDD.compute(PythonRDD.scala:70)
    at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:300)
    at org.apache.spark.rdd.RDD.iterator(RDD.scala:264)
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
    at org.apache.spark.scheduler.Task.run(Task.scala:88)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)

Driver stacktrace:
    at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1283)
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1271)
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1270)
    at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
    at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
    at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1270)
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:697)
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:697)
    at scala.Option.foreach(Option.scala:236)
    at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:697)
    at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1496)
    at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1458)
    at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1447)
    at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
    at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:567)
    at org.apache.spark.SparkContext.runJob(SparkContext.scala:1824)
    at org.apache.spark.SparkContext.runJob(SparkContext.scala:1837)
    at org.apache.spark.SparkContext.runJob(SparkContext.scala:1850)
    at org.apache.spark.SparkContext.runJob(SparkContext.scala:1921)
    at org.apache.spark.rdd.RDD$$anonfun$collect$1.apply(RDD.scala:909)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:147)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:108)
    at org.apache.spark.rdd.RDD.withScope(RDD.scala:310)
    at org.apache.spark.rdd.RDD.collect(RDD.scala:908)
    at org.apache.spark.api.python.PythonRDD$.collectAndServe(PythonRDD.scala:405)
    at org.apache.spark.api.python.PythonRDD.collectAndServe(PythonRDD.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:483)
    at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:231)
    at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:379)
    at py4j.Gateway.invoke(Gateway.java:259)
    at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:133)
    at py4j.commands.CallCommand.execute(CallCommand.java:79)
    at py4j.GatewayConnection.run(GatewayConnection.java:207)
    at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.IllegalArgumentException: port out of range:1315905645
    at java.net.InetSocketAddress.checkPort(InetSocketAddress.java:143)
    at java.net.InetSocketAddress.<init>(InetSocketAddress.java:188)
    at java.net.Socket.<init>(Socket.java:244)
    at org.apache.spark.api.python.PythonWorkerFactory.createSocket$1(PythonWorkerFactory.scala:75)
    at org.apache.spark.api.python.PythonWorkerFactory.liftedTree1$1(PythonWorkerFactory.scala:90)
    at org.apache.spark.api.python.PythonWorkerFactory.createThroughDaemon(PythonWorkerFactory.scala:89)
    at org.apache.spark.api.python.PythonWorkerFactory.create(PythonWorkerFactory.scala:62)
    at org.apache.spark.SparkEnv.createPythonWorker(SparkEnv.scala:135)
    at org.apache.spark.api.python.PythonRunner.compute(PythonRDD.scala:101)
    at org.apache.spark.api.python.PythonRDD.compute(PythonRDD.scala:70)
    at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:300)
    at org.apache.spark.rdd.RDD.iterator(RDD.scala:264)
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
    at org.apache.spark.scheduler.Task.run(Task.scala:88)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    ... 1 more
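One detail worth noting (my own reading of the logs, not something stated in the post): the bogus "port" 1315905645 looks like text that Spark read off the socket where it expected the Python worker daemon to write its port number. Decoded as four big-endian bytes, the value spells the ASCII text "No m", which lines up with the stray "odule named dateutil.tz" fragment interleaved in the log above, i.e. the worker apparently printed an error like "No module named dateutil.tz" instead of a port:

```python
import struct

# Interpret the "illegal port" as a 4-byte big-endian integer and look at
# its raw bytes: they are printable ASCII, the start of an error message.
raw = struct.pack(">i", 1315905645)
print(raw)  # b'No m'
```

If that reading is right, the illegal port is only a symptom: the Python interpreter that the Spark workers launch is missing the dateutil package, so installing python-dateutil into that interpreter (or pointing PYSPARK_PYTHON at an environment that has it) would plausibly address the root cause.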