PySpark kernel not working in Jupyter notebook
Here are the installed kernels:
$ jupyter kernelspec list
Available kernels:
apache_toree_scala /usr/local/share/jupyter/kernels/apache_toree_scala
apache_toree_sql /usr/local/share/jupyter/kernels/apache_toree_sql
pyspark3kernel /usr/local/share/jupyter/kernels/pyspark3kernel
pysparkkernel /usr/local/share/jupyter/kernels/pysparkkernel
python3 /usr/local/share/jupyter/kernels/python3
sparkkernel /usr/local/share/jupyter/kernels/sparkkernel
sparkrkernel /usr/local/share/jupyter/kernels/sparkrkernel
I created a new notebook, but it fails with:
The code failed because of a fatal error:
Error sending http request and maximum retry encountered..
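The pysparkkernel/pyspark3kernel entries listed above are installed by sparkmagic, which forwards notebook code over HTTP to a Livy endpoint. A minimal sketch of the relevant piece of configuration, assuming the stock default endpoint (sparkmagic normally reads it from ~/.sparkmagic/config.json; the values below are illustrative, not your actual setup):

```python
# Hedged sketch: sparkmagic typically loads a JSON config like this from
# ~/.sparkmagic/config.json. Substitute your cluster's real Livy endpoint.
config = {
    "kernel_python_credentials": {
        "username": "",
        "password": "",
        "url": "http://localhost:8998",  # Livy endpoint the pyspark kernel talks to
        "auth": "None",
    }
}

# If nothing is listening at this URL, the kernel retries the request and
# eventually fails with "Error sending http request and maximum retry encountered".
livy_url = config["kernel_python_credentials"]["url"]
print(livy_url)
```

So the error above usually means the kernel cannot reach anything at the configured Livy URL, not that the kernel itself is broken.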
If you connect your Jupyter notebook with sparkmagic and there is no [Error] message in the Jupyter console, you also need to start Livy, which is the API service sparkmagic uses to talk to your Spark cluster:

Download Livy and unpack it.
Run ./bin/livy-server to start the Livy server.

Now go back to your notebook and you should be able to run Spark code in a cell.

Comment: Thanks for the tip! I'm not using Jupyter notebooks right now, but I will come back to this (probably in August). For now I'll upvote, and consider awarding the bounty then, once I've verified it.
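Once the Livy server is running, you can verify it is reachable before retrying the notebook. A hedged sketch, assuming the default Livy port 8998 (adjust the URL to your deployment); it queries the /sessions REST endpoint and reports failure instead of crashing when nothing is listening:

```python
import json
import urllib.request
import urllib.error

LIVY_URL = "http://localhost:8998"  # default Livy port; change if yours differs

def check_livy(base_url: str = LIVY_URL) -> str:
    """Return a short status string for the Livy /sessions endpoint."""
    try:
        with urllib.request.urlopen(f"{base_url}/sessions", timeout=5) as resp:
            sessions = json.load(resp)
            return f"Livy is up, {sessions.get('total', 0)} session(s)"
    except (urllib.error.URLError, OSError) as exc:
        # Same symptom the notebook reports: nothing listening on the endpoint.
        return f"Livy unreachable: {exc}"

print(check_livy())
```

If this prints "Livy unreachable", the notebook's "maximum retry encountered" error will persist until the server is actually up on that address.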