
Apache Spark: I want to run a Spark (Scala) kernel in a Jupyter notebook, but get OSError: [WinError 193] %1 is not a valid Win32 application

Tags: apache-spark, kernel, jupyter-notebook, jupyter, spark-notebook

The kernel appears in the kernel list, but an error is shown as soon as I select the Spark kernel from a notebook. My kernel.json:

{
    "display_name": "Spark 1.5.1 (Scala 2.10.4)",
    "language_info": { "name": "scala" },
    "argv": [
        "C:/Users/RDX/spark-kernel-master/dist/spark-kernel/bin/spark-kernel",
        "--profile",
        "{connection_file}"
    ],
    "codemirror_mode": "scala",
    "env": {
        "SPARK_OPTS": "--master=local[2] --driver-java-options=-Xms1024M --driver-java-options=-Xmx4096M --driver-java-options=-Dlog4j.logLevel=info",
        "MAX_INTERPRETER_THREADS": "16",
        "CAPTURE_STANDARD_OUT": "true",
        "CAPTURE_STANDARD_ERR": "true",
        "SEND_EMPTY_OUTPUT": "false",
        "SPARK_HOME": "X:\\Softwares\\BIG_Data_files\\spark-2.0.1-bin-hadoop2.7",
        "PYTHONPATH": "X:\\Softwares\\BIG_Data_files\\spark-2.0.1-bin-hadoop2.7/python:X:\\Softwares\\BIG_Data_files\\spark-2.0.1-bin-hadoop2.7/python/lib/py4j-0.10.3-src.zip"
     }
}
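The [WinError 193] in the title has a generic cause that can be reproduced outside Jupyter: the operating system refuses to execute a file that is not in its native executable format. The `argv[0]` above points at `bin/spark-kernel`, a Unix shell launcher with no `.exe`/`.bat` extension, which Windows cannot run directly. A minimal, platform-neutral repro sketch (on Windows the failure surfaces as [WinError 193], on POSIX as the analogous "Exec format error"):

```python
# Repro of the failure class behind [WinError 193]: exec of a file that is
# not in the platform's native executable format. Windows raises
# OSError [WinError 193]; POSIX raises OSError (ENOEXEC). Jupyter hits
# exactly this when argv[0] in kernel.json names a Unix shell script.
import os
import stat
import subprocess
import tempfile

def try_launch(path):
    """Try to exec `path` directly; return the raised OSError, or None."""
    try:
        subprocess.run([path], check=True)
        return None
    except OSError as exc:
        return exc

# A shell-script-style file: plain text, no PE header, no .exe/.bat suffix.
with tempfile.NamedTemporaryFile("w", suffix=".txt", delete=False) as f:
    f.write("echo hello\n")
    script = f.name
os.chmod(script, os.stat(script).st_mode | stat.S_IXUSR)  # exec bit (POSIX)

err = try_launch(script)
print(type(err).__name__)  # OSError on both Windows and POSIX
os.unlink(script)
```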
Available kernels (output of `ipython kernelspec list`):

  pyspark    C:\Users\RDX\.ipython\kernels\pyspark
  python3    c:\users\rdx\anaconda3\lib\site-packages\ipykernel\resources
  spark      C:\ProgramData\jupyter\kernels\spark

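A hedged fix sketch: point `argv` at a Windows launcher instead of the extension-less Unix script. Whether this spark-kernel build ships a `spark-kernel.bat` is an assumption — check `dist/spark-kernel/bin/` yourself; if only the shell script exists, the kernel has no Windows entry point and a Windows-capable kernel (e.g. Apache Toree) is the alternative. Note also that `display_name` says Spark 1.5.1 while `SPARK_HOME` points at spark-2.0.1; that version mismatch is a separate likely source of trouble.

```json
{
    "display_name": "Spark 1.5.1 (Scala 2.10.4)",
    "language_info": { "name": "scala" },
    "argv": [
        "C:/Users/RDX/spark-kernel-master/dist/spark-kernel/bin/spark-kernel.bat",
        "--profile",
        "{connection_file}"
    ]
}
```

The only change from the original spec is the hypothetical `.bat` suffix on `argv[0]`; the `env` block would carry over unchanged.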