Can't connect to Spark running in Kubernetes from Java

Tags: java, apache-spark, kubernetes, remote-access, remote-server

I've installed Kubernetes (minikube for Windows 10) and added Spark using helm:

.\helm.exe install --name spark-test stable/spark
Then I exposed the master with:

.\kubectl.exe expose deployment spark-test-master --port=7070 --name=spark-master-ext --type=NodePort
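As a side note, Spark standalone masters listen on port 7077 by default, so the `--port=7070` above is worth double-checking against the chart's deployment. The same exposure can also be written as a declarative Service manifest; the sketch below uses an assumed selector label, not the chart's actual values:

```yaml
# Hypothetical NodePort Service for the Spark master.
# The selector label is an assumption -- verify the real labels
# with `kubectl get pods --show-labels` before applying.
apiVersion: v1
kind: Service
metadata:
  name: spark-master-ext
spec:
  type: NodePort
  selector:
    app: spark-test-master   # assumed label from the deployment name
  ports:
    - port: 7077             # Spark standalone master's default port
      targetPort: 7077
```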
For example, my UI is running, and the Spark master is exposed on :32473. To check, I run:

.\minikube-windows-amd64.exe service spark-master-ext
But when I do this in Java:

SparkConf conf = new SparkConf().setMaster("spark://192.168.1.168:32473").setAppName("Data Extractor");
I get:

18/03/19 13:57:29 WARN AppClient$ClientEndpoint: Could not connect to 192.168.1.168:32473: akka.remote.EndpointAssociationException: Association failed with [akka.tcp://sparkMaster@192.168.1.168:32473]
18/03/19 13:57:29 WARN ReliableDeliverySupervisor: Association with remote system [akka.tcp://sparkMaster@192.168.1.168:32473] has failed, address is now gated for [5000] ms. Reason: [Association failed with [akka.tcp://sparkMaster@192.168.1.168:32473]] Caused by: [Connection refused: no further information: /192.168.1.168:32473]
18/03/19 13:57:29 WARN AppClient$ClientEndpoint: Failed to connect to master 192.168.1.168:32473
akka.actor.ActorNotFound: Actor not found for: ActorSelection[Anchor(akka.tcp://sparkMaster@192.168.1.168:32473/), Path(/user/Master)]

Any ideas? How can I run a Java Spark job against Spark running in Minikube?

The Helm chart for Spark looks really outdated (1.5.1), so I installed Spark 2.3.0 locally, and it runs without any problems. Case closed, sorry :)
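For anyone following the same route, a minimal Maven dependency for a local Spark 2.3.0 driver might look like this (2.3.0 shipped against Scala 2.11 by default, hence the `_2.11` artifact suffix):

```xml
<!-- Spark 2.3.0 core, built against Scala 2.11 (the default for that release) -->
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.3.0</version>
</dependency>
```

With a matching local install, the driver can use `setMaster("local[*]")` for in-process testing instead of pointing at a remote standalone master.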