Pyspark / Pyspark kernel not working in Jupyter notebook


Here are the installed kernels:

 $ jupyter kernelspec list


Available kernels:
  apache_toree_scala    /usr/local/share/jupyter/kernels/apache_toree_scala
  apache_toree_sql      /usr/local/share/jupyter/kernels/apache_toree_sql
  pyspark3kernel        /usr/local/share/jupyter/kernels/pyspark3kernel
  pysparkkernel         /usr/local/share/jupyter/kernels/pysparkkernel
  python3               /usr/local/share/jupyter/kernels/python3
  sparkkernel           /usr/local/share/jupyter/kernels/sparkkernel
  sparkrkernel          /usr/local/share/jupyter/kernels/sparkrkernel
I created a new notebook, but it fails with:

The code failed because of a fatal error:
    Error sending http request and maximum retry encountered..
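This "Error sending http request" failure usually means the kernel cannot reach a Livy endpoint. The pysparkkernel, pyspark3kernel, sparkkernel, and sparkrkernel entries above are the kernels installed by sparkmagic, which reads its Livy endpoint from ~/.sparkmagic/config.json. Below is a minimal diagnostic sketch (my own addition, not part of the original question) that prints the configured endpoint and checks whether anything answers there; it assumes the default config path and the stock kernel_python_credentials key.

    import json
    import os
    from urllib.request import urlopen
    from urllib.error import URLError

    # Default sparkmagic config location; adjust if yours lives elsewhere
    config_path = os.path.expanduser("~/.sparkmagic/config.json")
    with open(config_path) as f:
        config = json.load(f)

    # The PySpark kernel talks to Livy at this URL (sparkmagic's default is http://localhost:8998)
    livy_url = config.get("kernel_python_credentials", {}).get("url", "http://localhost:8998")
    print("Configured Livy endpoint:", livy_url)

    try:
        # /sessions is a standard Livy REST endpoint; any HTTP response means Livy is reachable
        with urlopen(livy_url + "/sessions", timeout=5) as resp:
            print("Livy responded with HTTP", resp.status)
    except URLError as exc:
        # This is the situation that surfaces in the notebook as "maximum retry encountered"
        print("Could not reach Livy:", exc)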


If you are using sparkmagic to connect your Jupyter notebook and there is no [error] message in the jupyter console, you should also start Livy, which is the API service sparkmagic uses to talk to your Spark cluster:

  • Download Livy and unzip it.
  • Check whether the SPARK_HOME environment variable is set; if not, set it to your Spark installation directory.
  • Run the Livy server via ./bin/livy-server from the shell/command line.
  • Now go back to your notebook; you should be able to run Spark code in a cell (see the smoke test sketched after this list).
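Once livy-server is up, a quick way to confirm the whole chain works without going through the notebook is to drive Livy's REST API directly. The following is a minimal smoke test, assuming Livy is listening on its default port 8998; the URL and the trivial 1 + 1 statement are illustrative choices, not something from the original answer.

    import json
    import time
    from urllib.request import Request, urlopen

    LIVY = "http://localhost:8998"  # Livy's default port; change if you edited livy.conf

    def post(path, payload):
        req = Request(LIVY + path, data=json.dumps(payload).encode(),
                      headers={"Content-Type": "application/json"})
        with urlopen(req) as resp:
            return json.load(resp)

    def get(path):
        with urlopen(LIVY + path) as resp:
            return json.load(resp)

    # Ask Livy for a PySpark session, which is what the pysparkkernel does behind the scenes
    sid = post("/sessions", {"kind": "pyspark"})["id"]

    # Wait for Spark to start; the session goes from "starting" to "idle" when it is ready
    while get(f"/sessions/{sid}")["state"] not in ("idle", "error", "dead"):
        time.sleep(2)
    print("Session state:", get(f"/sessions/{sid}")["state"])

    # Run a trivial statement end to end and print its output
    stmt_id = post(f"/sessions/{sid}/statements", {"code": "1 + 1"})["id"]
    while get(f"/sessions/{sid}/statements/{stmt_id}")["state"] != "available":
        time.sleep(1)
    print(get(f"/sessions/{sid}/statements/{stmt_id}")["output"])

If the session ends up in the "dead" state, Livy's log files (under its logs/ directory) usually show why Spark failed to start.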

    Thx for the tip! I'm not using Jupyter notebooks right now, but I'll be back (probably in August). Upvoting for now, and will consider awarding the bounty at that point, once I've verified it.