Apache Spark: trying to run the Spark (Scala) kernel in a Jupyter notebook. Getting OSError: [WinError 193] %1 is not a valid Win32 application
The kernel shows up in the kernel list, but selecting the Spark kernel from a notebook raises that error. My kernel.json:
{
  "display_name": "Spark 1.5.1 (Scala 2.10.4)",
  "language_info": { "name": "scala" },
  "argv": [
    "C:/Users/RDX/spark-kernel-master/dist/spark-kernel/bin/spark-kernel",
    "--profile",
    "{connection_file}"
  ],
  "codemirror_mode": "scala",
  "env": {
    "SPARK_OPTS": "--master=local[2] --driver-java-options=-Xms1024M --driver-java-options=-Xmx4096M --driver-java-options=-Dlog4j.logLevel=info",
    "MAX_INTERPRETER_THREADS": "16",
    "CAPTURE_STANDARD_OUT": "true",
    "CAPTURE_STANDARD_ERR": "true",
    "SEND_EMPTY_OUTPUT": "false",
    "SPARK_HOME": "X:\\Softwares\\BIG_Data_files\\spark-2.0.1-bin-hadoop2.7",
    "PYTHONPATH": "X:\\Softwares\\BIG_Data_files\\spark-2.0.1-bin-hadoop2.7/python:X:\\Softwares\\BIG_Data_files\\spark-2.0.1-bin-hadoop2.7/python/lib/py4j-0.10.3-src.zip"
  }
}
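For context, [WinError 193] is what Windows raises when asked to execute a file that is not a PE (Win32) executable. The argv above points at bin/spark-kernel, which is a Unix shell launcher script with no .exe/.bat/.cmd wrapper, so Windows' CreateProcess rejects it. A minimal sketch to check whether a configured launcher is actually a Windows executable (looks_like_win32_exe is a hypothetical helper, and the demo file stands in for bin/spark-kernel):

```python
import os

def looks_like_win32_exe(path):
    # Windows' CreateProcess only accepts PE executables, which start
    # with the "MZ" magic bytes. A Unix launcher script (like the
    # spark-kernel shell script) starts with "#!" instead, and asking
    # Windows to run it directly yields [WinError 193].
    with open(path, "rb") as f:
        return f.read(2) == b"MZ"

# Demo with a fake launcher standing in for bin/spark-kernel:
with open("spark-kernel-demo", "w") as f:
    f.write("#!/usr/bin/env bash\n")

print(looks_like_win32_exe("spark-kernel-demo"))  # False: a script, not a PE file
os.remove("spark-kernel-demo")
```

If the launcher fails this check, Jupyter on Windows needs argv to start with something it can actually execute (for example an interpreter or a .bat/.cmd wrapper) rather than the script itself.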
Output of ipython kernelspec list:

Available kernels:
pyspark C:\Users\RDX\.ipython\kernels\pyspark
python3 c:\users\rdx\anaconda3\lib\site-packages\ipykernel\resources
spark C:\ProgramData\jupyter\kernels\spark