Apache Spark error when running a Spark job through Livy


I am running my Spark job through Livy, but I get the following exception:

java.util.concurrent.ExecutionException: java.io.IOException: Internal Server Error: "java.util.concurrent.ExecutionException: org.apache.livy.rsc.rpc.RpcException: java.util.NoSuchElementException: cd1299a0-9c19-4db2-b81b-deba9bf5a594
org.apache.livy.rsc.driver.RSCDriver.handle(RSCDriver.java:454)
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
java.lang.reflect.Method.invoke(Method.java:497)
org.apache.livy.rsc.rpc.RpcDispatcher.handleCall(RpcDispatcher.java:130)
org.apache.livy.rsc.rpc.RpcDispatcher.channelRead0(RpcDispatcher.java:77)
io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105)
io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
io.netty.channel.ChannelInboundHandlerAdapter.channelRead(ChannelInboundHandlerAdapter.java:86)
io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:244)
io.netty.handler.codec.ByteToMessageCodec.channelRead(ByteToMessageCodec.java:103)
io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:846)
io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131)
io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:511)
io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468)
io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:382)
io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354)
io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
java.lang.Thread.run(Thread.java:745)"
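For context, the job reaches this error path when submitted through Livy's REST batches endpoint. A minimal sketch of such a submission, assuming a Livy server on the default port 8998 and a hypothetical application JAR and main class (both placeholders, not from the original post):

```python
import json
import urllib.request

# Assumption: Livy server at its default address; adjust for your cluster.
LIVY_URL = "http://localhost:8998"

# Placeholder batch definition -- file and className are hypothetical.
payload = {
    "file": "/path/to/my-spark-job.jar",
    "className": "com.example.MySparkJob",
    "numExecutors": 4,
    "executorCores": 2,
}

# Build the POST to Livy's /batches endpoint.
req = urllib.request.Request(
    LIVY_URL + "/batches",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)

# Uncomment on a live cluster; the response carries the batch id and state.
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp))
```

The RpcException above is raised on the Livy server side after this request is accepted, which is why the client only sees an opaque "Internal Server Error".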
Looking at the Spark job logs, I don't see any errors or exceptions, but I do see the following:

17/11/14 14:45:49 WARN NettyRpcEnv: RpcEnv already stopped.
17/11/14 14:45:49 INFO YarnAllocator: Completed container container_e60_1510219626098_0394_01_000013 on host: AUPER01-02-10-12-0.prod.vroc.com.au (state: COMPLETE, exit status: 0)
17/11/14 14:45:49 INFO YarnAllocator: Executor for container container_e60_1510219626098_0394_01_000013 exited because of a YARN event (e.g., pre-emption) and not because of an error in the running job.
17/11/14 14:45:49 WARN NettyRpcEnv: RpcEnv already stopped.
17/11/14 14:45:49 INFO YarnAllocator: Completed container container_e60_1510219626098_0394_01_000011 on host: AUPER01-01-10-13-0.prod.vroc.com.au (state: COMPLETE, exit status: 0)
17/11/14 14:45:49 INFO YarnAllocator: Executor for container container_e60_1510219626098_0394_01_000011 exited because of a YARN event (e.g., pre-emption) and not because of an error in the running job.
17/11/14 14:45:49 WARN NettyRpcEnv: RpcEnv already stopped.
17/11/14 14:45:49 INFO YarnAllocator: Completed container container_e60_1510219626098_0394_01_000005 on host: AUPER01-01-20-08-0.prod.vroc.com.au (state: COMPLETE, exit status: 0)
17/11/14 14:45:49 INFO YarnAllocator: Executor for container container_e60_1510219626098_0394_01_000005 exited because of a YARN event (e.g., pre-emption) and not because of an error in the running job.
17/11/14 14:45:49 WARN NettyRpcEnv: RpcEnv already stopped.
17/11/14 14:45:49 INFO YarnAllocator: Completed container container_e60_1510219626098_0394_01_000008 on host: AUPER01-02-30-12-1.prod.vroc.com.au (state: COMPLETE, exit status: 0)
17/11/14 14:45:49 INFO YarnAllocator: Executor for container container_e60_1510219626098_0394_01_000008 exited because of a YARN event (e.g., pre-emption) and not because of an error in the running job.

I am running Spark 1.6.3 jobs on HDP 2.6.3, and this may be caused by a version incompatibility in the Livy libs. You can download the correct version of the Livy API libraries matching your HDP version. Reference:

Libs download link:
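If a client/server version mismatch is the suspect, one quick check is to ask the Livy server what it is actually running and compare that against the livy-api/livy-client jars on the job's classpath. A sketch using Livy's version endpoint (assuming the default port; hedged, verify the endpoint on your Livy release):

```python
import json
import urllib.request

LIVY_URL = "http://localhost:8998"  # assumption: default Livy port

def livy_version(base_url):
    """Query the Livy server's /version endpoint and return its version string."""
    with urllib.request.urlopen(base_url + "/version") as resp:
        return json.load(resp)["version"]

# Uncomment on a live cluster; e.g. a 0.3.x client talking to a 0.4.x server
# can fail with opaque RPC errors like the one in the question.
# print(livy_version(LIVY_URL))
```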

Any news on this? I'm hitting exactly the same problem, also trying to connect to Spark 1.6.3 after upgrading the cluster to HDP 2.6.3! The compatibility matrix shows that HDP 2.6.3 with Spark 1.6.3 is compatible with Livy 0.3.0, but Livy 0.4.0 was installed by the HDP upgrade: could that be the cause of the problem? Good to hear!