GraphFrames on Spark on Kubernetes and Livy

Tags: apache-spark, kubernetes, pyspark, data-science

I am trying to import GraphFrames on a Spark cluster running on Kubernetes.

Spark version: 2.3.0. Livy and sparkmagic.magics are used to connect to the cluster from a Jupyter notebook.

The first code I tried:

%load_ext sparkmagic.magics
%%spark config
{"executorCores":2,"numExecutors":2,"executorMemory":"5G","conf":{"spark.jars.packages":"graphframes:graphframes:0.7.0-spark2.3-s_2.11"}}
%spark add -s session_name -l python -u http://spark-livy.cluster:8998
%%spark -s session_name
from graphframes import *
In this case the Spark application starts successfully, but I get an error when I try to import graphframes.
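
A quick diagnostic (not part of the original post, and assuming the session named session_name is still active) is to run something like the following inside the same Livy session, to confirm whether the package coordinates actually reached the Spark configuration and whether the graphframes Python module is visible on the driver:

%%spark -s session_name
# Diagnostic sketch: check that spark.jars.packages was applied and that the
# graphframes Python module is importable on the Livy driver.
import importlib.util
print(spark.sparkContext.getConf().get("spark.jars.packages", "not set"))
print(importlib.util.find_spec("graphframes"))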

The second code I tried:

%load_ext sparkmagic.magics
%%spark config
{"executorCores":2,"numExecutors":2,"executorMemory":"5G","jars":["graphframes:graphframes:0.5.0-spark2.0-s_2.11"],"conf":{"spark.jars.packages":"graphframes:graphframes:0.5.0-spark2.0-s_2.11"}}
%spark add -s session_name -l python -u http://spark-livy.cluster:8998

In this case I get an error when starting the Spark session.
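
One possible cause (stated here as an assumption, not a confirmed diagnosis) is that Livy's jars field expects jar file paths or URIs rather than Maven coordinates. A sketch that passes the package only through Spark configuration, and also points spark.jars.repositories at the spark-packages repository where graphframes is published, might look like this:

%%spark config
{"executorCores":2,"numExecutors":2,"executorMemory":"5G",
 "conf":{"spark.jars.packages":"graphframes:graphframes:0.7.0-spark2.3-s_2.11",
         "spark.jars.repositories":"https://repos.spark-packages.org"}}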

Any help on how to run GraphFrames on Spark on a Kubernetes cluster with Livy would be greatly appreciated.
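
For completeness, a minimal sketch (not from the original post) of what the session could run once the import succeeds, using the documented GraphFrame constructor and the inDegrees property:

%%spark -s session_name
from graphframes import GraphFrame

# Tiny two-vertex graph just to confirm the package is wired up end to end.
v = spark.createDataFrame([("a", "Alice"), ("b", "Bob")], ["id", "name"])
e = spark.createDataFrame([("a", "b", "follows")], ["src", "dst", "relationship"])
g = GraphFrame(v, e)
g.inDegrees.show()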