Apache Spark: the Spark 3 interpreter does not launch from Zeppelin

I have installed Zeppelin on my laptop. I built it from source, and the Spark version I configured it against is 3.x. Whenever I create a Spark notebook and try to execute even a trivial line, I get the error shown after the example paragraph below:
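To be concrete, a hypothetical minimal paragraph of the kind that fails (the exact statement does not matter, since the interpreter itself never opens):

    %spark
    // any trivial statement fails identically, because the Spark
    // interpreter never finishes starting up
    sc.version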

    ERROR [2020-09-15 23:40:17,033] ({FIFOScheduler-interpreter_1215078524-Worker-1} 
     SparkInterpreter.java[open]:121) - Fail to open SparkInterpreter
     java.lang.reflect.InvocationTargetException
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.zeppelin.spark.BaseSparkScalaInterpreter.spark2CreateContext(BaseSparkScalaInterpreter.scala:301)
        at org.apache.zeppelin.spark.BaseSparkScalaInterpreter.createSparkContext(BaseSparkScalaInterpreter.scala:230)
        at org.apache.zeppelin.spark.SparkScala212Interpreter.open(SparkScala212Interpreter.scala:90)
        at org.apache.zeppelin.spark.SparkInterpreter.open(SparkInterpreter.java:106)
        at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:70)
        at org.apache.zeppelin.interpreter.Interpreter.getInterpreterInTheSameSessionByClassName(Interpreter.java:355)
        at org.apache.zeppelin.interpreter.Interpreter.getInterpreterInTheSameSessionByClassName(Interpreter.java:366)
        at org.apache.zeppelin.spark.PySparkInterpreter.open(PySparkInterpreter.java:89)
        at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:70)
        at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:760)
        at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:668)
        at org.apache.zeppelin.scheduler.Job.run(Job.java:172)
        at org.apache.zeppelin.scheduler.AbstractScheduler.runJob(AbstractScheduler.java:130)
        at org.apache.zeppelin.scheduler.FIFOScheduler.lambda$runJobInScheduler$0(FIFOScheduler.java:39)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
    **Caused by: java.lang.NoSuchMethodError: io.netty.buffer.PooledByteBufAllocator.defaultNumHeapArena()I**
        at org.apache.spark.network.util.NettyUtils.createPooledByteBufAllocator(NettyUtils.java:161)
        at org.apache.spark.network.util.NettyUtils.getSharedPooledByteBufAllocator(NettyUtils.java:138)
        at org.apache.spark.network.client.TransportClientFactory.<init>(TransportClientFactory.java:107)
        at org.apache.spark.network.TransportContext.createClientFactory(TransportContext.java:142)
        at org.apache.spark.rpc.netty.NettyRpcEnv.<init>(NettyRpcEnv.scala:77)
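The NoSuchMethodError on io.netty.buffer.PooledByteBufAllocator.defaultNumHeapArena() usually points to a netty version conflict: an older or incompatible netty jar on the interpreter's classpath shadowing the netty that Spark 3 expects. As a sketch of one way to check which jar actually provides the class (assuming a plain spark-shell started from the same SPARK_HOME that Zeppelin points at, since the Zeppelin interpreter itself never opens):

    // Assumed diagnostic, run from a plain spark-shell on the same machine:
    // print which jar supplies PooledByteBufAllocator. If it is not the
    // netty jar shipped under $SPARK_HOME/jars, another copy (for example
    // one pulled in by the Zeppelin build) may be shadowing it.
    val cls = Class.forName("io.netty.buffer.PooledByteBufAllocator")
    println(cls.getProtectionDomain.getCodeSource.getLocation)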