Apache Spark: Py4JError: Answer from Java side is empty
I get this error every time. I'm using Sparkling Water. My configuration file:
spark.driver.memory 65g
spark.python.worker.memory 65g
spark.master local[*]
The data is about 5 GB.
There is no other information about this error.
Does anyone know why this happens? Thanks, everyone!
ERROR:py4j.java_gateway:Error while sending or receiving.
Traceback (most recent call last):
File "/data/analytics/Spark1.6.1/python/lib/py4j-0.9-src.zip/py4j/java_gateway.py", line 746, in send_command
raise Py4JError("Answer from Java side is empty")
Py4JError: Answer from Java side is empty
ERROR:py4j.java_gateway:An error occurred while trying to connect to the Java server
Traceback (most recent call last):
File "/data/analytics/Spark1.6.1/python/lib/py4j-0.9-src.zip/py4j/java_gateway.py", line 690, in start
self.socket.connect((self.address, self.port))
File "/usr/local/anaconda/lib/python2.7/socket.py", line 228, in meth
return getattr(self._sock,name)(*args)
error: [Errno 111] Connection refused
(the same "Connection refused" traceback repeats two more times)
Have you tried setting
spark.executor.memory
and spark.driver.memory
in your Spark configuration file? See the Spark configuration documentation for more information.
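As a minimal sketch, the suggestion above would look like this in a spark-defaults.conf. The 16g values are only illustrative assumptions; the key point is that the heap sizes must fit within the machine's physical RAM. Asking for 65g on a box with less memory can prevent the driver JVM from starting at all, which is one known cause of exactly this "Connection refused" / "Answer from Java side is empty" failure:

```
spark.master            local[*]
spark.driver.memory     16g
spark.executor.memory   16g
```

Note that in local[*] mode everything runs inside the driver JVM, so spark.driver.memory is the setting that actually governs available heap there; also, driver memory must be set before the JVM launches (config file or spark-submit flags), not from code after the context is created.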