Livy fails when submitting a Spark job in yarn-cluster mode (apache-spark, yarn, livy)


I have a Livy server running locally that connects to a remote YARN cluster to run Spark jobs.

When I upload a jar through Livy's programmatic job API, I get the following error.

It looks as if the correct Netty channel handler is not mapped for the request message, and as a result no Spark context is created.

livy.conf settings:

livy.spark.master = yarn
livy.spark.deploy-mode = cluster
I am able to POST to the Livy REST /batches API, and it submits and successfully completes the job on the YARN cluster.
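For reference, the working REST path can be sketched as below. This is a minimal sketch of a POST to Livy's /batches endpoint; the jar path, main class, and Livy host are placeholder assumptions, not values from the original setup.

```java
// Sketch of a Livy REST /batches submission. The jar path, main class,
// and Livy host below are placeholders, not values from the original setup.
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

public class LivyBatchSubmit {

    // Livy's POST /batches body needs at least "file" (the application jar,
    // reachable by the cluster, e.g. on HDFS) and "className".
    static String batchBody(String file, String className) {
        return "{\"file\": \"" + file + "\", \"className\": \"" + className + "\"}";
    }

    public static void main(String[] args) throws Exception {
        String body = batchBody("hdfs:///jobs/spark-job.jar", "com.example.SparkJob");
        System.out.println(body);

        // Uncomment to actually submit (Livy listens on port 8998 by default):
        // HttpURLConnection conn = (HttpURLConnection)
        //         new URL("http://localhost:8998/batches").openConnection();
        // conn.setRequestMethod("POST");
        // conn.setRequestProperty("Content-Type", "application/json");
        // conn.setDoOutput(true);
        // try (OutputStream os = conn.getOutputStream()) {
        //     os.write(body.getBytes(StandardCharsets.UTF_8));
        // }
        // System.out.println("HTTP " + conn.getResponseCode());
    }
}
```

With `livy.spark.master = yarn` and `livy.spark.deploy-mode = cluster`, this path makes Livy launch the batch as a YARN application, which is why it succeeds independently of the RPC-based client.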

With the same configuration, when I try to upload the jar from the Java client, it fails with the error below.

Livy version: 0.4.0-incubating

HTTP client dependency:

<dependency>
    <groupId>org.apache.livy</groupId>
    <artifactId>livy-client-http</artifactId>
    <version>0.4.0-incubating</version>
</dependency>
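The failing programmatic upload looks roughly like the sketch below, based on the documented `LivyClient` interface from the dependency above. The Livy URL and jar path are placeholders, not values from the original setup, and running it requires a live Livy server.

```java
// Sketch of the programmatic (RPC-based) job API that fails here, built on
// the documented LivyClient interface. The Livy URL and jar path are
// placeholders, not values from the original setup.
import java.io.File;
import java.net.URI;

import org.apache.livy.LivyClient;
import org.apache.livy.LivyClientBuilder;

public class LivyJarUpload {
    public static void main(String[] args) throws Exception {
        LivyClient client = new LivyClientBuilder()
                .setURI(new URI("http://localhost:8998"))
                .build();
        try {
            // uploadJar() streams the jar to the remote Spark context; this is
            // the call that coincides with the RSCDriver/RpcDispatcher errors
            // in the log below.
            client.uploadJar(new File("/path/to/spark-job.jar")).get();
        } finally {
            client.stop(true);
        }
    }
}
```

Unlike the REST /batches path, this client talks to the remote driver over Livy's Netty RPC channel, which is where the `Failed to find handler` warning in the log is raised.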

Does anyone know how to fix this? Driver log from the failed YARN attempt:
17/09/22 22:47:50 DEBUG RSCDriver: Registered new connection from [id: 0x46d83447, L:/127.0.0.1:10000 - R:/127.0.0.1:9273].
17/09/22 22:47:50 DEBUG RpcDispatcher: [ReplDriver] Registered outstanding rpc 0 (org.apache.livy.rsc.BaseProtocol$RemoteDriverAddress).
17/09/22 22:47:50 DEBUG KryoMessageCodec: Encoded message of type org.apache.livy.rsc.rpc.Rpc$MessageHeader (5 bytes)
17/09/22 22:47:50 DEBUG KryoMessageCodec: Encoded message of type org.apache.livy.rsc.BaseProtocol$RemoteDriverAddress (90 bytes)
17/09/22 22:47:50 DEBUG KryoMessageCodec: Decoded message of type org.apache.livy.rsc.rpc.Rpc$MessageHeader (5 bytes)
17/09/22 22:47:50 DEBUG KryoMessageCodec: Decoded message of type org.apache.livy.rsc.BaseProtocol$RemoteDriverAddress (90 bytes)
17/09/22 22:47:50 DEBUG RpcDispatcher: [ReplDriver] Received RPC message: type=CALL id=0 payload=org.apache.livy.rsc.BaseProtocol$RemoteDriverAddress
17/09/22 22:47:50 WARN RpcDispatcher: [ReplDriver] Failed to find handler for msg 'org.apache.livy.rsc.BaseProtocol$RemoteDriverAddress'.
17/09/22 22:47:50 DEBUG RpcDispatcher: [ReplDriver] Caught exception in channel pipeline.
java.lang.NullPointerException
at org.apache.livy.rsc.Utils.stackTraceAsString(Utils.java:95)
at org.apache.livy.rsc.rpc.RpcDispatcher.handleCall(RpcDispatcher.java:121)
at org.apache.livy.rsc.rpc.RpcDispatcher.channelRead0(RpcDispatcher.java:77)
at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:336)
at io.netty.channel.ChannelInboundHandlerAdapter.channelRead(ChannelInboundHandlerAdapter.java:86)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:336)
at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:293)
at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:267)
at io.netty.handler.codec.ByteToMessageCodec.channelRead(ByteToMessageCodec.java:103)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:336)
at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1294)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343)
at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:911)
at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131)
at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:643)
at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:566)
at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:480)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:442)
at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:131)
at java.lang.Thread.run(Thread.java:745)
17/09/22 22:47:50 WARN RpcDispatcher: [ReplDriver] Closing RPC channel with 1 outstanding RPCs.
17/09/22 22:47:50 WARN RpcDispatcher: [ReplDriver] Closing RPC channel with 1 outstanding RPCs.
17/09/22 22:47:50 ERROR ApplicationMaster: User class threw exception: java.util.concurrent.CancellationException
java.util.concurrent.CancellationException
at io.netty.util.concurrent.DefaultPromise.cancel(...)(Unknown Source)
17/09/22 22:47:50 INFO ApplicationMaster: Final app status: FAILED, exitCode: 15, (reason: User class threw exception: java.util.concurrent.CancellationException)
17/09/22 22:47:50 ERROR ApplicationMaster: Uncaught exception: 
org.apache.spark.SparkException: Exception thrown in awaitResult: 
at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:205)
at org.apache.spark.deploy.yarn.ApplicationMaster.runDriver(ApplicationMaster.scala:401)
at org.apache.spark.deploy.yarn.ApplicationMaster.run(ApplicationMaster.scala:254)
at org.apache.spark.deploy.yarn.ApplicationMaster$$anonfun$main$1.apply$mcV$sp(ApplicationMaster.scala:764)
at org.apache.spark.deploy.SparkHadoopUtil$$anon$2.run(SparkHadoopUtil.scala:67)
at org.apache.spark.deploy.SparkHadoopUtil$$anon$2.run(SparkHadoopUtil.scala:66)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698)
at org.apache.spark.deploy.SparkHadoopUtil.runAsSparkUser(SparkHadoopUtil.scala:66)
at org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:762)
at org.apache.spark.deploy.yarn.ApplicationMaster.main(ApplicationMaster.scala)
Caused by: java.util.concurrent.CancellationException
at io.netty.util.concurrent.DefaultPromise.cancel(...)(Unknown Source)
17/09/22 22:47:50 INFO ApplicationMaster: Unregistering ApplicationMaster with FAILED (diag message: User class threw exception: java.util.concurrent.CancellationException)
17/09/22 22:47:50 INFO ShutdownHookManager: Shutdown hook called.