
Spark on Kubernetes with Docker: executor pods fail to start and shut down when the SparkContext is created


I am trying to run Spark on Kubernetes and issue interactive commands through the Spark shell or a Jupyter interface. I have built custom images for the driver pod and the executor pods, and I use the code below to spin up the Spark context:

import pyspark

conf = pyspark.SparkConf()

# Master URL points at the in-cluster Kubernetes API server
conf.setMaster("k8s://https://kubernetes.default.svc.cluster.local:443")

# Image for the executor pods
conf.set(
    "spark.kubernetes.container.image",
    "<Repo>/<IMAGENAME>:latest")

conf.set("spark.kubernetes.namespace", "default")

# Authentication certificate and token (required to create worker pods):
conf.set(
    "spark.kubernetes.authenticate.caCertFile",
    "/var/run/secrets/kubernetes.io/serviceaccount/ca.crt")
conf.set(
    "spark.kubernetes.authenticate.oauthTokenFile",
    "/var/run/secrets/kubernetes.io/serviceaccount/token")

# Service account used by the driver to create executor pods
conf.set(
    "spark.kubernetes.authenticate.driver.serviceAccountName",
    "spark-master")

# Executor sizing and driver networking
# (executors connect back to the driver host and ports below)
conf.set("spark.executor.instances", "2")
conf.set("spark.driver.host", "spark-test-jupyter")
conf.set("spark.executor.memory", "1g")
conf.set("spark.executor.cores", "1")
conf.set("spark.driver.blockManager.port", "7777")
conf.set("spark.driver.bindAddress", "0.0.0.0")
conf.set("spark.driver.port", "29416")

sc = pyspark.SparkContext(conf=conf)
The executor pods keep cycling like this until everything is stopped.


What could be going wrong here?

Your spark.driver.host should be the DNS name of that Service, so something like spark-test-jupyter.default.svc.cluster.local.

Can you add the logs of the crashing pods?

I was able to move on; it was a silly mistake of a misspelled service name. But the new error I see is that the executors cannot connect to the driver port 29416 from the configuration above. @Werner, do you know what this could be related to?
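In terms of the configuration in the question, the suggestion in the answer would amount to something like the following sketch (assuming the Jupyter/driver pod sits behind a Service named spark-test-jupyter in the default namespace, as the answer implies):

conf.set("spark.driver.host", "spark-test-jupyter.default.svc.cluster.local")  # FQDN of the Service in front of the driver pod
conf.set("spark.driver.port", "29416")               # keep fixed so the Service can expose it
conf.set("spark.driver.blockManager.port", "7777")   # likewise for the block manager port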
pyspark-shell-1620894878554-exec-8   0/1     Pending             0          0s
pyspark-shell-1620894878554-exec-8   0/1     ContainerCreating   0          0s
pyspark-shell-1620894878528-exec-7   1/1     Running             0          1s
pyspark-shell-1620894878554-exec-8   1/1     Running             0          2s
pyspark-shell-1620894878528-exec-7   0/1     Error               0          4s
pyspark-shell-1620894878554-exec-8   0/1     Error               0          4s
pyspark-shell-1620894878528-exec-7   0/1     Terminating         0          5s
pyspark-shell-1620894878528-exec-7   0/1     Terminating         0          5s
pyspark-shell-1620894878554-exec-8   0/1     Terminating         0          5s
pyspark-shell-1620894878554-exec-8   0/1     Terminating         0          5s
pyspark-shell-1620894883595-exec-9   0/1     Pending             0          0s
pyspark-shell-1620894883595-exec-9   0/1     Pending             0          0s
pyspark-shell-1620894883595-exec-9   0/1     ContainerCreating   0          0s
pyspark-shell-1620894883623-exec-10   0/1     Pending             0          0s
pyspark-shell-1620894883623-exec-10   0/1     Pending             0          0s
pyspark-shell-1620894883623-exec-10   0/1     ContainerCreating   0          0s
pyspark-shell-1620894883595-exec-9    1/1     Running             0          1s
pyspark-shell-1620894883623-exec-10   1/1     Running             0          3s
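Regarding the follow-up comment about executors failing to reach driver port 29416: in this client-mode setup the executors have to resolve spark.driver.host and connect back to both spark.driver.port and spark.driver.blockManager.port, which is usually arranged with a headless Service that selects the driver pod. Below is a minimal sketch using the kubernetes Python client; the service name, namespace and the app: spark-test-jupyter label selector are assumptions and must match the actual driver/Jupyter pod (the same thing could be written as a YAML manifest and applied with kubectl).

from kubernetes import client, config

# Assumes this runs inside the cluster (e.g. from the Jupyter/driver pod)
config.load_incluster_config()

# Headless Service so executors can resolve the driver and reach its ports;
# the selector below is hypothetical and must match the driver pod's labels.
service = client.V1Service(
    metadata=client.V1ObjectMeta(name="spark-test-jupyter", namespace="default"),
    spec=client.V1ServiceSpec(
        cluster_ip="None",                          # headless
        selector={"app": "spark-test-jupyter"},     # assumed pod label
        ports=[
            client.V1ServicePort(name="driver-rpc", port=29416, target_port=29416),
            client.V1ServicePort(name="blockmanager", port=7777, target_port=7777),
        ],
    ),
)

client.CoreV1Api().create_namespaced_service(namespace="default", body=service)

With such a Service in place, spark.driver.host in the SparkConf would point at spark-test-jupyter.default.svc.cluster.local, as suggested in the answer above.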