Apache Spark causes Tomcat to shut down gracefully

Tomcat version: 7.0.47

I have a web application that uses Apache Spark; the web application acts as the Apache Spark driver.
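For context, this is roughly what "web application as driver" means here. A minimal sketch, assuming the SparkContext is created from a servlet lifecycle listener; the class name and master URL below are illustrative, not taken from the actual application:

import javax.servlet.ServletContextEvent;
import javax.servlet.ServletContextListener;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

// Hypothetical listener: the webapp creates the driver-side
// SparkContext when Tomcat deploys the application.
public class SparkDriverListener implements ServletContextListener {

    private JavaSparkContext sc;

    @Override
    public void contextInitialized(ServletContextEvent event) {
        SparkConf conf = new SparkConf()
                .setAppName("webapp-driver")
                .setMaster("spark://localhost2:7077"); // standalone master, as in the logs
        sc = new JavaSparkContext(conf);
        event.getServletContext().setAttribute("sparkContext", sc);
    }

    @Override
    public void contextDestroyed(ServletContextEvent event) {
        if (sc != null) {
            sc.stop(); // release driver resources on undeploy
        }
    }
}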

When the remote standalone Spark cluster is unavailable, the SparkContext shuts down, logging under the name org.apache.spark.util.Utils - Shutdown hook.

As soon as that happens, Tomcat also begins a graceful shutdown. The only output I can see from Tomcat is:
[exec] Result: 50

Why does Tomcat shut down when Spark invokes its shutdown hook?
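One hypothesis: exit status 50 appears to match Spark's SparkExitCode.UNCAUGHT_EXCEPTION, which Spark's uncaught-exception handler passes to System.exit; if that handler fires inside the driver, it would take the entire Tomcat JVM down with it. A diagnostic sketch to confirm where the exit originates (ExitTracer is a hypothetical helper, to be installed early during webapp startup):

public final class ExitTracer {

    public static void install() {
        System.setSecurityManager(new SecurityManager() {
            @Override
            public void checkExit(int status) {
                // Both System.exit() and Runtime.halt() pass through here;
                // print the stack of whoever requested JVM termination.
                new Throwable("System.exit(" + status + ") requested here")
                        .printStackTrace();
                // Not throwing, so the exit still proceeds; throw a
                // SecurityException here instead to veto the exit.
            }

            @Override
            public void checkPermission(java.security.Permission perm) {
                // Permit everything else; this manager only observes exits.
            }

            @Override
            public void checkPermission(java.security.Permission perm, Object ctx) {
                // Same: no-op, so normal webapp operation is unaffected.
            }
        });
    }
}

If the printed stack trace shows a Spark thread calling System.exit, that would explain why Tomcat shuts down as soon as the Spark shutdown hook runs.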

Spark logs:

SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/data/downloads/spark-1.4.1-bin-hadoop2.6/lib/spark-assembly-1.4.1-hadoop2.6.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/data/downloads/spark-1.4.1-bin-hadoop2.6/lib/spark-examples-1.4.1-hadoop2.6.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
15/12/09 17:11:23 INFO SparkContext: Running Spark version 1.4.1
15/12/09 17:11:24 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
15/12/09 17:11:24 WARN Utils: Your hostname, pesamara-mobl-vm1 resolves to a loopback address: 127.0.0.1; using 10.30.9.107 instead (on interface eth0)
15/12/09 17:11:24 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
15/12/09 17:11:25 INFO SecurityManager: Changing view acls to: pes
15/12/09 17:11:25 INFO SecurityManager: Changing modify acls to: pes
15/12/09 17:11:25 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(pes); users with modify permissions: Set(pes)
15/12/09 17:11:26 INFO Slf4jLogger: Slf4jLogger started
15/12/09 17:11:27 INFO Remoting: Starting remoting
15/12/09 17:11:27 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriver@10.30.9.107:55740]
15/12/09 17:11:27 INFO Utils: Successfully started service 'sparkDriver' on port 55740.
15/12/09 17:11:27 INFO SparkEnv: Registering MapOutputTracker
15/12/09 17:11:27 INFO SparkEnv: Registering BlockManagerMaster
15/12/09 17:11:27 INFO DiskBlockManager: Created local directory at /tmp/spark-30d61b03-0b1c-4250-b68e-c2404c7884a8/blockmgr-3226ed7e-f8e5-40a2-bfb1-ffabb51cd0e0
15/12/09 17:11:28 INFO MemoryStore: MemoryStore started with capacity 491.5 MB
15/12/09 17:11:28 INFO HttpFileServer: HTTP File server directory is /tmp/spark-30d61b03-0b1c-4250-b68e-c2404c7884a8/httpd-7f2572c2-5677-446e-a80a-6f9d05ee2891
15/12/09 17:11:28 INFO HttpServer: Starting HTTP Server
15/12/09 17:11:28 INFO Utils: Successfully started service 'HTTP file server' on port 45047.
15/12/09 17:11:28 INFO SparkEnv: Registering OutputCommitCoordinator
15/12/09 17:11:28 INFO Utils: Successfully started service 'SparkUI' on port 4040.
15/12/09 17:11:28 INFO SparkUI: Started SparkUI at http://10.30.9.107:4040
15/12/09 17:11:29 INFO FairSchedulableBuilder: Created default pool default, schedulingMode: FIFO, minShare: 0, weight: 1
15/12/09 17:11:29 INFO AppClient$ClientActor: Connecting to master akka.tcp://sparkMaster@localhost2:7077/user/Master...
15/12/09 17:11:29 WARN AppClient$ClientActor: Could not connect to akka.tcp://sparkMaster@localhost2:7077: akka.remote.InvalidAssociation: Invalid address: akka.tcp://sparkMaster@localhost2:7077
15/12/09 17:11:29 WARN Remoting: Tried to associate with unreachable remote address [akka.tcp://sparkMaster@localhost2:7077]. Address is now gated for 5000 ms, all messages to this address will be delivered to dead letters. Reason: localhost2: unknown error
15/12/09 17:11:49 INFO AppClient$ClientActor: Connecting to master akka.tcp://sparkMaster@localhost2:7077/user/Master...
15/12/09 17:11:49 WARN AppClient$ClientActor: Could not connect to akka.tcp://sparkMaster@localhost2:7077: akka.remote.InvalidAssociation: Invalid address: akka.tcp://sparkMaster@localhost2:7077
15/12/09 17:11:49 WARN Remoting: Tried to associate with unreachable remote address [akka.tcp://sparkMaster@localhost2:7077]. Address is now gated for 5000 ms, all messages to this address will be delivered to dead letters. Reason: localhost2: unknown error
15/12/09 17:12:09 INFO AppClient$ClientActor: Connecting to master akka.tcp://sparkMaster@localhost2:7077/user/Master...
15/12/09 17:12:09 WARN AppClient$ClientActor: Could not connect to akka.tcp://sparkMaster@localhost2:7077: akka.remote.InvalidAssociation: Invalid address: akka.tcp://sparkMaster@localhost2:7077
15/12/09 17:12:09 WARN Remoting: Tried to associate with unreachable remote address [akka.tcp://sparkMaster@localhost2:7077]. Address is now gated for 5000 ms, all messages to this address will be delivered to dead letters. Reason: localhost2: unknown error
15/12/09 17:12:29 ERROR SparkDeploySchedulerBackend: Application has been killed. Reason: All masters are unresponsive! Giving up.
15/12/09 17:12:29 WARN SparkDeploySchedulerBackend: Application ID is not initialized yet.
15/12/09 17:12:29 INFO SparkUI: Stopped Spark web UI at http://10.30.9.107:4040
15/12/09 17:12:29 INFO DAGScheduler: Stopping DAGScheduler
15/12/09 17:12:29 INFO SparkDeploySchedulerBackend: Shutting down all executors
15/12/09 17:12:29 INFO SparkDeploySchedulerBackend: Asking each executor to shut down
15/12/09 17:12:29 ERROR OneForOneStrategy: 
java.lang.NullPointerException
    at org.apache.spark.deploy.client.AppClient$ClientActor$$anonfun$receiveWithLogging$1.applyOrElse(AppClient.scala:160)
    at scala.runtime.AbstractPartialFunction$mcVL$sp.apply$mcVL$sp(AbstractPartialFunction.scala:33)
    at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:33)
    at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:25)
    at org.apache.spark.util.ActorLogReceive$$anon$1.apply(ActorLogReceive.scala:59)
    at org.apache.spark.util.ActorLogReceive$$anon$1.apply(ActorLogReceive.scala:42)
    at scala.PartialFunction$class.applyOrElse(PartialFunction.scala:118)
    at org.apache.spark.util.ActorLogReceive$$anon$1.applyOrElse(ActorLogReceive.scala:42)
    at akka.actor.Actor$class.aroundReceive(Actor.scala:465)
    at org.apache.spark.deploy.client.AppClient$ClientActor.aroundReceive(AppClient.scala:61)
    at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516)
    at akka.actor.ActorCell.invoke(ActorCell.scala:487)
    at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:238)
    at akka.dispatch.Mailbox.run(Mailbox.scala:220)
    at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:393)
    at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
    at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
    at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
    at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
15/12/09 17:12:29 INFO AppClient$ClientActor: Connecting to master akka.tcp://sparkMaster@localhost2:7077/user/Master...
15/12/09 17:12:29 WARN AppClient$ClientActor: Could not connect to akka.tcp://sparkMaster@localhost2:7077: akka.remote.InvalidAssociation: Invalid address: akka.tcp://sparkMaster@localhost2:7077
15/12/09 17:12:29 WARN Remoting: Tried to associate with unreachable remote address [akka.tcp://sparkMaster@localhost2:7077]. Address is now gated for 5000 ms, all messages to this address will be delivered to dead letters. Reason: localhost2: unknown error
15/12/09 17:12:29 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 54184.
15/12/09 17:12:29 INFO NettyBlockTransferService: Server created on 54184
15/12/09 17:12:29 INFO BlockManagerMaster: Trying to register BlockManager
15/12/09 17:12:29 INFO BlockManagerMasterEndpoint: Registering block manager 10.30.9.107:54184 with 491.5 MB RAM, BlockManagerId(driver, 10.30.9.107, 54184)
15/12/09 17:12:29 INFO BlockManagerMaster: Registered BlockManager
15/12/09 17:12:30 ERROR SparkContext: Error initializing SparkContext.
java.lang.IllegalStateException: Cannot call methods on a stopped SparkContext
    at org.apache.spark.SparkContext.org$apache$spark$SparkContext$$assertNotStopped(SparkContext.scala:103)
    at org.apache.spark.SparkContext.getSchedulingMode(SparkContext.scala:1503)
    at org.apache.spark.SparkContext.postEnvironmentUpdate(SparkContext.scala:2007)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:543)
    at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:61)
    at org.apache.spark.examples.sql.SparkContextTest.main(SparkContextTest.java:32)
15/12/09 17:12:30 INFO SparkContext: SparkContext already stopped.
java.lang.IllegalStateException: Cannot call methods on a stopped SparkContext
    at org.apache.spark.SparkContext.org$apache$spark$SparkContext$$assertNotStopped(SparkContext.scala:103)
    at org.apache.spark.SparkContext.getSchedulingMode(SparkContext.scala:1503)
    at org.apache.spark.SparkContext.postEnvironmentUpdate(SparkContext.scala:2007)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:543)
    at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:61)
    at org.apache.spark.examples.sql.SparkContextTest.main(SparkContextTest.java:32)
15/12/09 17:12:30 INFO DiskBlockManager: Shutdown hook called
15/12/09 17:12:30 INFO Utils: path = /tmp/spark-30d61b03-0b1c-4250-b68e-c2404c7884a8/blockmgr-3226ed7e-f8e5-40a2-bfb1-ffabb51cd0e0, already present as root for deletion.
15/12/09 17:12:30 INFO Utils: Shutdown hook called
15/12/09 17:12:30 INFO Utils: Deleting directory /tmp/spark-30d61b03-0b1c-4250-b68e-c2404c7884a8/httpd-7f2572c2-5677-446e-a80a-6f9d05ee2891
15/12/09 17:12:30 INFO Utils: Deleting directory /tmp/spark-30d61b03-0b1c-4250-b68e-c2404c7884a8