Apache Spark: how to fix 'NoClassDefFoundError: io/fabric8/kubernetes/api/model/apps/Deployment'

Tags: apache-spark, kubernetes, apache-zeppelin

I have been trying to integrate the Spark interpreter in Zeppelin (v0.7.3) with a Kubernetes cluster. However, because the server runs Kubernetes v1.13.10, I needed to upgrade the spark-k8s client to v4.6.1, as shown here.
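
To double-check which fabric8/kubernetes jars the installation actually ends up with after a swap like this, they can be listed from a small Scala snippet. This is only a generic sketch with a placeholder SPARK_HOME, not my exact layout:

// Illustrative only: list the kubernetes/fabric8-related jars under a Spark install.
// The same check works for Zeppelin's interpreter/spark directory.
import java.io.File

val jarsDir = new File(sys.env.getOrElse("SPARK_HOME", "/opt/spark"), "jars")
Option(jarsDir.listFiles()).getOrElse(Array.empty[File])
  .filter(f => f.getName.contains("kubernetes") || f.getName.contains("fabric8"))
  .sortBy(_.getName)
  .foreach(f => println(f.getName))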

But when I try to execute a Spark command in the Zeppelin UI, I get:

ERROR [2019-10-25 03:45:35,430] ({pool-2-thread-4} Job.java[run]:181) - Job failed
java.lang.NullPointerException
    at org.apache.zeppelin.spark.Utils.invokeMethod(Utils.java:38)
    at org.apache.zeppelin.spark.Utils.invokeMethod(Utils.java:33)
    at org.apache.zeppelin.spark.SparkInterpreter.createSparkContext_2(SparkInterpreter.java:398)
    at org.apache.zeppelin.spark.SparkInterpreter.createSparkContext(SparkInterpreter.java:387)
    at org.apache.zeppelin.spark.SparkInterpreter.getSparkContext(SparkInterpreter.java:146)
    at org.apache.zeppelin.spark.SparkInterpreter.open(SparkInterpreter.java:843)
    at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:70)
Below is my spark-submit configuration, but I don't think the error comes from it (I have run with this configuration before and it worked fine).
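
For context (the real values are not reproduced here), a Spark-on-Kubernetes submission generally has the shape below; the placeholders and property names follow the upstream Spark documentation and may differ slightly from the distribution I am running:

spark-submit \
  --master k8s://https://<api-server-host>:<port> \
  --deploy-mode cluster \
  --conf spark.kubernetes.namespace=<namespace> \
  --conf spark.kubernetes.authenticate.driver.serviceAccountName=<service-account> \
  --conf spark.kubernetes.container.image=<spark-image> \
  --conf spark.executor.instances=2 \
  --class <main-class> \
  <application-jar>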

I have already tried downgrading the spark-k8s client to versions from 3.x.x up to 4.0.x, but I got HTTP errors, so I decided to stick with v4.6.1. Looking at the Zeppelin interpreter log, I found the following stack trace:

ERROR [2019-10-25 03:45:35,428] ({pool-2-thread-4} Utils.java[invokeMethod]:40) -
java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.zeppelin.spark.Utils.invokeMethod(Utils.java:38)
    at org.apache.zeppelin.spark.Utils.invokeMethod(Utils.java:33)
    at org.apache.zeppelin.spark.SparkInterpreter.createSparkSession(SparkInterpreter.java:378)
    at org.apache.zeppelin.spark.SparkInterpreter.getSparkSession(SparkInterpreter.java:233)
    at org.apache.zeppelin.spark.SparkInterpreter.open(SparkInterpreter.java:841)
    at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:70)
    at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:491)
    at org.apache.zeppelin.scheduler.Job.run(Job.java:175)
    at org.apache.zeppelin.scheduler.FIFOScheduler$1.run(FIFOScheduler.java:139)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.NoClassDefFoundError: io/fabric8/kubernetes/api/model/apps/Deployment
    at io.fabric8.kubernetes.client.internal.readiness.Readiness.isReady(Readiness.java:62)
    at org.apache.spark.scheduler.cluster.k8s.KubernetesExternalShuffleManagerImpl$$anonfun$start$1.apply(KubernetesExternalShuffleManager.scala:82)
    at org.apache.spark.scheduler.cluster.k8s.KubernetesExternalShuffleManagerImpl$$anonfun$start$1.apply(KubernetesExternalShuffleManager.scala:81)
    at scala.collection.Iterator$class.foreach(Iterator.scala:893)
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
    at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
    at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
    at org.apache.spark.scheduler.cluster.k8s.KubernetesExternalShuffleManagerImpl.start(KubernetesExternalShuffleManager.scala:80)
    at org.apache.spark.scheduler.cluster.k8s.KubernetesClusterSchedulerBackend$$anonfun$start$1.apply(KubernetesClusterSchedulerBackend.scala:212)
    at org.apache.spark.scheduler.cluster.k8s.KubernetesClusterSchedulerBackend$$anonfun$start$1.apply(KubernetesClusterSchedulerBackend.scala:212)
    at scala.Option.foreach(Option.scala:257)
    at org.apache.spark.scheduler.cluster.k8s.KubernetesClusterSchedulerBackend.start(KubernetesClusterSchedulerBackend.scala:212)
    at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:173)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:509)
    at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2509)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:909)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:901)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:901)
    ... 20 more
 INFO [2019-10-25 03:45:35,430] ({pool-2-thread-4} SparkInterpreter.java[createSparkSession]:379) - Created Spark session
ERROR [2019-10-25 03:45:35,430] ({pool-2-thread-4} Job.java[run]:181) - Job failed
java.lang.NullPointerException
    at org.apache.zeppelin.spark.Utils.invokeMethod(Utils.java:38)
    at org.apache.zeppelin.spark.Utils.invokeMethod(Utils.java:33)
    at org.apache.zeppelin.spark.SparkInterpreter.createSparkContext_2(SparkInterpreter.java:398)
    at org.apache.zeppelin.spark.SparkInterpreter.createSparkContext(SparkInterpreter.java:387)
    at org.apache.zeppelin.spark.SparkInterpreter.getSparkContext(SparkInterpreter.java:146)
    at org.apache.zeppelin.spark.SparkInterpreter.open(SparkInterpreter.java:843)
    at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:70)
    at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:491)
    at org.apache.zeppelin.scheduler.Job.run(Job.java:175)
    at org.apache.zeppelin.scheduler.FIFOScheduler$1.run(FIFOScheduler.java:139)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
 INFO [2019-10-25 03:45:35,431] ({pool-2-thread-4} SchedulerFactory.java[jobFinished]:137) - Job remoteInterpretJob_1571975134433 finished by scheduler org.apache.zeppelin.spark.SparkInterpreter819422312

This is my first post here, so please correct me if I am not following some of the rules. Thanks!

After some thorough research and help from colleagues, I was able to verify that io/fabric8/kubernetes/api/model/apps/Deployment does not exist in kubernetes-model-v2.0.0. Upgrading that jar to v3.0.0 solved the problem.
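
A quick way to reproduce that check is to look inside the jar for the class entry; a minimal sketch (the jar path is a placeholder):

// Illustrative only: verify whether a kubernetes-model jar contains the apps/Deployment class.
import java.util.jar.JarFile
import scala.collection.JavaConverters._

val jar = new JarFile("/path/to/kubernetes-model-2.0.0.jar")  // placeholder path
val present = jar.entries().asScala
  .exists(_.getName == "io/fabric8/kubernetes/api/model/apps/Deployment.class")
println(s"apps/Deployment present: $present")  // expected: false for 2.0.0, true for 3.0.0
jar.close()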

%spark
sc.version