PySpark run error
The Spark instance I connect to is not on my local machine but on a remote one. Every time I connect to it, the following error appears:
[IPKernelApp] WARNING | Unknown error in handling PYTHONSTARTUP file /usr/local/spark/python/pyspark/shell.py:
18/03/07 08:52:53 WARN ObjectStore: Failed to get database global_temp, returning NoSuchObjectException
Regardless, I am still trying to run it in a Jupyter notebook:
from pyspark.conf import SparkConf
from pyspark.sql import SparkSession  # needed when shell.py startup fails

SparkSession.builder.config(conf=SparkConf())
dir(spark)
When I ran this yesterday, it listed the session's attributes. When I ran it today, it said:
NameError: name 'spark' is not defined
Any suggestions would be appreciated.

Answer: the `spark` variable is missing. You never assign the result, so the name `spark` is never defined:
from pyspark.conf import SparkConf
from pyspark.sql import SparkSession

# builder.config(...) returns a Builder; call getOrCreate() to obtain the session
spark = SparkSession.builder.config(conf=SparkConf()).getOrCreate()
dir(spark)
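The root cause here is ordinary Python name binding, not Spark itself: normally the PYTHONSTARTUP script `shell.py` pre-defines `spark`, but since it failed with the warning above, the name was never created. A plain-Python illustration (no Spark required) of exactly this failure mode:

```python
# Using a name that was never assigned raises NameError -- which is what
# happens when shell.py fails and `spark` is never bound in the session.
try:
    dir(spark)  # `spark` was never assigned here
except NameError as e:
    message = str(e)
    print(message)  # name 'spark' is not defined
```

Once `spark = SparkSession.builder...getOrCreate()` has run successfully, the name is bound and `dir(spark)` works again.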