Apache Spark local node and remote node have different version numbers
I am trying to connect my Spark client to an Ignite cluster running version 2.5.0. When I run the spark-submit job:
sudo /opt/spark-2.3.0-bin-hadoop2.7/bin/spark-submit \
  --master k8s://https://35.192.214.68 \
  --deploy-mode cluster \
  --name sparkIgnite \
  --class org.blk.igniteSparkResearch.ScalarSharedRDDExample \
  --conf spark.executor.instances=3 \
  --conf spark.app.name=sharedSparkIgnite \
  --conf spark.kubernetes.authenticate.driver.serviceAccountName=ignite \
  --conf spark.kubernetes.container.image=us.gcr.io/nlp-research-198620/ignite-spark:v2 \
  local:///opt/spark/jars/igniteSpark-1.0-SNAPSHOT-jar-with-dependencies.jar
I get the following error:
class org.apache.ignite.IgniteException: Failed to start manager: GridManagerAdapter [enabled=true, name=org.apache.ignite.internal.managers.discovery.GridDiscoveryManager
Caused by: class org.apache.ignite.spi.IgniteSpiException: Local node and remote node have different version numbers (node will not join, Ignite does not support rolling updates, so versions must be exactly the same) [locBuildVer=2.5.0, rmtBuildVer=2.4.0, locNodeAddrs=[ignite-cluster-68787659f9-626k6/0:0:0:0:0:0:0:1%lo, /10.8.2.82, /127.0.0.1], rmtNodeAddrs=[sparkignite-8bad224a0187324ba6f98da08e152c5e-driver/0:0:0:0:0:0:0:1%lo, /10.8.2.87, /127.0.0.1], locNodeId=d0c82763-8931-4d66-89ad-2689d8b3d01a, rmtNodeId=b6383f55-ecc6-4d5f-acb2-4bd9970d3fc2]
The error says my Ignite cluster (the local node) is on 2.5.0, while my remote pod is on 2.4.0. But my image contains Ignite 2.5.0, so where is 2.4.0 coming from?
POM:
Here, my Spark image is built using the Dockerfile that ships with spark-2.3.0.
From your logs I can see two nodes: 1) ignite-cluster-68787659f9-626k6 running 2.5.0, and 2) sparkignite-8bad224a0187324ba6f98da08e152c5e-driver running 2.4.0. Please check whether your Spark configuration actually uses the 2.5.0 Ignite version.
"I used 2.5.0 in my pom file, and even my local system runs 2.5.0. Is there any way Spark could still be using 2.4.0?"

"Maybe you forgot to update the Spark classpath, and it is picking up the 2.4.0 libraries instead of the 2.5.0 ones? Could you describe the steps you use to deploy Ignite and Spark?"

"1. Created an Ignite cluster on Google Kubernetes Engine with Ignite version 2.5.0, and checked the logs. 2. Built an image with my jar, which needs to run against the Ignite nodes created in step one, then ran that image on the K8s cluster with the spark-submit command."

"Step 1 is fine. I suspect step 2 is the cause. Could you check whether your image uses the correct version of the Ignite libraries?"

"All my pom dependencies are on 2.5.0, so I don't understand why Spark still ends up on 2.4.0."
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>org.blk</groupId>
    <artifactId>igniteSpark</artifactId>
    <version>1.0-SNAPSHOT</version>
    <dependencies>
        <dependency>
            <groupId>org.scala-lang</groupId>
            <artifactId>scala-library</artifactId>
            <version>2.11.8</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.11</artifactId>
            <version>2.3.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql_2.11</artifactId>
            <version>2.3.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.ignite</groupId>
            <artifactId>ignite-spark_2.10</artifactId>
            <version>2.5.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.ignite</groupId>
            <artifactId>ignite-scalar_2.10</artifactId>
            <version>2.5.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.ignite</groupId>
            <artifactId>ignite-kubernetes</artifactId>
            <version>2.5.0</version>
        </dependency>
    </dependencies>
    <build>
        <plugins>
            <plugin>
                <groupId>org.scala-tools</groupId>
                <artifactId>maven-scala-plugin</artifactId>
                <version>2.15.2</version>
                <executions>
                    <execution>
                        <goals>
                            <goal>compile</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>
            <plugin>
                <artifactId>maven-compiler-plugin</artifactId>
                <version>3.1</version>
                <configuration>
                    <source>1.8</source>
                </configuration>
            </plugin>
            <plugin>
                <artifactId>maven-assembly-plugin</artifactId>
                <configuration>
                    <archive>
                        <manifest>
                            <mainClass>org.blk.igniteSparkResearch.ScalarSharedRDDExample</mainClass>
                        </manifest>
                    </archive>
                    <descriptorRefs>
                        <descriptorRef>jar-with-dependencies</descriptorRef>
                    </descriptorRefs>
                </configuration>
                <executions>
                    <execution>
                        <id>make-assembly</id>
                        <phase>package</phase>
                        <goals>
                            <goal>single</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>
</project>
FROM spark:latest
RUN mkdir -p /opt/spark/jars
COPY target/igniteSpark-1.0-SNAPSHOT-jar-with-dependencies.jar /opt/spark/jars