
Deserialization error on workers in a standalone Spark cluster
Tags: hadoop, serialization, apache-spark


I have a Spark application that runs fine on a standalone Spark cluster when the cluster runs on my laptop (one master and one worker), but it fails when I try to run it on a standalone Spark cluster deployed on EC2, where the master and the worker are on different machines.

The application is structured as follows: a Java process, the "message processor", works against the Spark master. When it starts, it submits itself to the Spark master and then listens on SQS; for each received message it is supposed to run a Spark job that processes a file from S3, whose address is configured in the message.

It all seems to fail at the point where the Spark driver tries to send the job to the Spark executor.

Below is the code of the "message processor" that configures the SparkContext, followed by the Spark driver log and then the Spark executor log. The output of my code and some key points are marked in bold, and I simplified the code and the logs in some places for readability.

Thanks a lot for any help; I'm running out of ideas on this issue.
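For orientation, here is a minimal sketch of the poll-and-process loop described above, using the AWS SDK for Java. The names (MessageLoop, queueUrl, processFileFromS3) are hypothetical, since the question does not show this part of the code:

import java.util.List;
import com.amazonaws.services.sqs.AmazonSQS;
import com.amazonaws.services.sqs.model.Message;
import com.amazonaws.services.sqs.model.ReceiveMessageRequest;

// Hypothetical sketch of the loop described above: poll SQS and, for each
// message, run a Spark job against the S3 file named in the message body.
public class MessageLoop {
    public static void poll(AmazonSQS sqs, String queueUrl) {
        while (true) {
            List<Message> messages = sqs.receiveMessage(
                    new ReceiveMessageRequest(queueUrl).withWaitTimeSeconds(20))
                    .getMessages();
            for (Message m : messages) {
                // The message body carries the S3 address of the file to process.
                processFileFromS3(m.getBody());
                sqs.deleteMessage(queueUrl, m.getReceiptHandle());
            }
        }
    }

    private static void processFileFromS3(String messageJson) {
        // Parse the JSON and run the Spark job against the referenced S3 file.
    }
}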

The "message processor" code:

logger.info("Started Integration Hub SubmitDriver in test mode.");

SparkConf sparkConf = new SparkConf() 
.setMaster(SPARK_MASTER_URI) 
.setAppName(APPLICATION_NAME) 
.setSparkHome(SPARK_LOCATION_ON_EC2_MACHINE); 

sparkConf.setJars(JavaSparkContext.jarOfClass(this.getClass())); 

// configure spark executor to use log4j properties located in the local spark conf dir 
sparkConf.set("spark.executor.extraJavaOptions", "-XX:+UseConcMarkSweepGC -Dlog4j.configuration=log4j_integrationhub_sparkexecutor.properties"); 

sparkConf.set("spark.executor.memory", "1g"); 
sparkConf.set("spark.cores.max", "3"); 
// Spill shuffle to disk to avoid OutOfMemory, at cost of reduced performance 
sparkConf.set("spark.shuffle.spill", "true"); 

logger.info("Connecting Spark"); 
JavaSparkContext sc = new JavaSparkContext(sparkConf); 

sc.hadoopConfiguration().set("fs.s3n.awsAccessKeyId", AWS_KEY); 
sc.hadoopConfiguration().set("fs.s3n.awsSecretAccessKey", AWS_SECRET); 

logger.info("Spark connected"); 
Driver log:

2015-05-01 07:47:14 INFO  ClassPathBeanDefinitionScanner:239 - JSR-330 'javax.inject.Named' annotation found and supported for component scanning 
2015-05-01 07:47:14 INFO  AnnotationConfigApplicationContext:510 - Refreshing org.springframework.context.annotation.AnnotationConfigApplicationContext@5540b23b: startup date [Fri May 01 07:47:14 UTC 2015]; root of context hierarchy 
2015-05-01 07:47:14 INFO  AutowiredAnnotationBeanPostProcessor:140 - JSR-330 'javax.inject.Inject' annotation found and supported for autowiring 
2015-05-01 07:47:14 INFO  DefaultListableBeanFactory:596 - Pre-instantiating singletons in org.springframework.beans.factory.support.DefaultListableBeanFactory@13f948e: defining beans [org.springframework.context.annotation.internalConfigurationAnnotationProcessor,org.springframework.context.annotation.internalAutowiredAnnotationProcessor,org.springframework.context.annotation.internalRequiredAnnotationProcessor,org.springframework.context.annotation.internalCommonAnnotationProcessor,integrationHubConfig,org.springframework.context.annotation.ConfigurationClassPostProcessor.importAwareProcessor,processorInlineDriver,s3Accessor,cdFetchUtil,httpUtil,cdPushUtil,submitDriver,databaseLogger,connectorUtil,totangoDataValidations,environmentConfig,sesUtil,processorExecutor,processorDriver]; root of factory hierarchy 
2015-05-01 07:47:15 INFO  SubmitDriver:69 - Started Integration Hub SubmitDriver in test mode. 
2015-05-01 07:47:15 INFO  SubmitDriver:101 - Connecting Spark 
2015-05-01 07:47:15 INFO  SparkContext:59 - Running Spark version 1.3.0 
2015-05-01 07:47:16 WARN  NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable 
2015-05-01 07:47:16 INFO  SecurityManager:59 - Changing view acls to: hadoop 
2015-05-01 07:47:16 INFO  SecurityManager:59 - Changing modify acls to: hadoop 
2015-05-01 07:47:16 INFO  SecurityManager:59 - SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(hadoop); users with modify permissions: Set(hadoop) 
2015-05-01 07:47:18 INFO  Slf4jLogger:80 - Slf4jLogger started 
2015-05-01 07:47:18 INFO  Remoting:74 - Starting remoting 
2015-05-01 07:47:18 INFO  Remoting:74 - Remoting started; listening on addresses :[akka.tcp://sparkDriver@sparkMasterIp:39176] 
2015-05-01 07:47:18 INFO  Utils:59 - Successfully started service 'sparkDriver' on port 39176. 
2015-05-01 07:47:18 INFO  SparkEnv:59 - Registering MapOutputTracker 
2015-05-01 07:47:18 INFO  SparkEnv:59 - Registering BlockManagerMaster 
2015-05-01 07:47:18 INFO  HttpFileServer:59 - HTTP File server directory is /tmp/spark-e4726219-5708-48c9-8377-c103ad1e7a75/httpd-fe68500f-01b1-4241-a3a2-3b4cf8394daf 
2015-05-01 07:47:18 INFO  HttpServer:59 - Starting HTTP Server 
2015-05-01 07:47:19 INFO  Server:272 - jetty-8.y.z-SNAPSHOT 
2015-05-01 07:47:19 INFO  AbstractConnector:338 - Started SocketConnector@0.0.0.0:47166 
2015-05-01 07:47:19 INFO  Utils:59 - Successfully started service 'HTTP file server' on port 47166. 
2015-05-01 07:47:19 INFO  SparkEnv:59 - Registering OutputCommitCoordinator 
2015-05-01 07:47:24 INFO  Server:272 - jetty-8.y.z-SNAPSHOT 
2015-05-01 07:47:24 INFO  AbstractConnector:338 - Started SelectChannelConnector@0.0.0.0:4040 
2015-05-01 07:47:24 INFO  Utils:59 - Successfully started service 'SparkUI' on port 4040. 
2015-05-01 07:47:24 INFO  SparkUI:59 - Started SparkUI at http://sparkMasterIp:4040
2015-05-01 07:47:24 INFO  SparkContext:59 - Added JAR /rev/8fcc3a5/integhub_be/genconn/lib/genconn-8fcc3a5.jar at http://sparkMasterIp:47166/jars/genconn-8fcc3a5.jar with timestamp 1430466444838 
2015-05-01 07:47:24 INFO  AppClient$ClientActor:59 - Connecting to master akka.tcp://sparkMaster@sparkMasterIp:7077/user/Master... 
2015-05-01 07:47:25 INFO  AppClient$ClientActor:59 - Executor added: app-20150501074725-0005/0 on worker-20150430140019-ip-sparkWorkerIp-38610 (sparkWorkerIp:38610) with 1 cores 
2015-05-01 07:47:25 INFO  AppClient$ClientActor:59 - Executor updated: app-20150501074725-0005/0 is now LOADING 
2015-05-01 07:47:25 INFO  AppClient$ClientActor:59 - Executor updated: app-20150501074725-0005/0 is now RUNNING 
2015-05-01 07:47:25 INFO  NettyBlockTransferService:59 - Server created on 34024 
2015-05-01 07:47:26 INFO  SubmitDriver:116 - Spark connected 
2015-05-01 07:47:26 INFO  SubmitDriver:125 - Connected to SQS... Listening on https://sqsAddress
2015-05-01 07:51:39 INFO  SubmitDriver:130 - Polling Message queue... 
2015-05-01 07:51:47 INFO  SubmitDriver:148 - Received Message : {someMessage} 
2015-05-01 07:51:47 INFO  SubmitDriver:158 - Process Input JSON 
2015-05-01 07:51:50 INFO  SparkContext:59 - Created broadcast 0 from textFile at ProcessorDriver.java:208 
2015-05-01 07:51:52 INFO  FileInputFormat:253 - Total input paths to process : 1 
2015-05-01 07:51:52 INFO  SparkContext:59 - Starting job: first at ConnectorUtil.java:605 
2015-05-01 07:51:52 INFO  SparkContext:59 - Created broadcast 1 from broadcast at DAGScheduler.scala:839 
2015-05-01 07:51:52 WARN  TaskSetManager:71 - ... *the same stack trace as the error below*
2015-05-01 07:51:52 ERROR TaskSetManager:75 - Task 0 in stage 0.0 failed 4 times; aborting job 
2015-05-01 07:51:52 ERROR ProcessorDriver:261 - Error executing the batch Operation.. 
org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 4 times, most recent failure: Lost task 0.3 in stage 0.0 (TID 3, sparkWorkerIp): java.io.EOFException 
        at java.io.ObjectInputStream$BlockDataInputStream.readFully(ObjectInputStream.java:2744) 
        at java.io.ObjectInputStream.readFully(ObjectInputStream.java:1032) 
        at org.apache.hadoop.io.DataOutputBuffer$Buffer.write(DataOutputBuffer.java:63) 
        at org.apache.hadoop.io.DataOutputBuffer.write(DataOutputBuffer.java:101) 
        at org.apache.hadoop.io.UTF8.readChars(UTF8.java:216) 
        at org.apache.hadoop.io.UTF8.readString(UTF8.java:208) 
        at org.apache.hadoop.mapred.FileSplit.readFields(FileSplit.java:87) 
        at org.apache.hadoop.io.ObjectWritable.readObject(ObjectWritable.java:237) 
        at org.apache.hadoop.io.ObjectWritable.readFields(ObjectWritable.java:66) 
        at org.apache.spark.SerializableWritable$$anonfun$readObject$1.apply$mcV$sp(SerializableWritable.scala:43) 
        at org.apache.spark.util.Utils$.tryOrIOException(Utils.scala:1137) 
        at org.apache.spark.SerializableWritable.readObject(SerializableWritable.scala:39) 
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) 
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
        at java.lang.reflect.Method.invoke(Method.java:606) 
        at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017) 
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893) 
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798) 
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350) 
        at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990) 
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915) 
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798) 
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350) 
        at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990) 
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915) 
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798) 
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350) 
        at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370) 
        at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:68) 
        at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:94) 
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:185) 
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) 
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) 
        at java.lang.Thread.run(Thread.java:745) 

Driver stacktrace: ...
Worker log:

2015-05-01 07:47:26 INFO  CoarseGrainedExecutorBackend:47 - Registered signal handlers for [TERM, HUP, INT] 
2015-05-01 07:47:26 DEBUG Configuration:227 - java.io.IOException: config() 
        at org.apache.hadoop.conf.Configuration.<init>(Configuration.java:227) 
        at org.apache.hadoop.conf.Configuration.<init>(Configuration.java:214) 
        at org.apache.spark.deploy.SparkHadoopUtil.newConfiguration(SparkHadoopUtil.scala:78) 
        at org.apache.spark.deploy.SparkHadoopUtil.<init>(SparkHadoopUtil.scala:43) 
        at org.apache.spark.deploy.SparkHadoopUtil$.<init>(SparkHadoopUtil.scala:220) 
        at org.apache.spark.deploy.SparkHadoopUtil$.<clinit>(SparkHadoopUtil.scala) 
        at org.apache.spark.executor.CoarseGrainedExecutorBackend$.run(CoarseGrainedExecutorBackend.scala:128) 
        at org.apache.spark.executor.CoarseGrainedExecutorBackend$.main(CoarseGrainedExecutorBackend.scala:224) 
        at org.apache.spark.executor.CoarseGrainedExecutorBackend.main(CoarseGrainedExecutorBackend.scala) 

2015-05-01 07:47:26 DEBUG Groups:139 -  Creating new Groups object 
2015-05-01 07:47:27 DEBUG Groups:59 - Group mapping impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping; cacheTimeout=300000 
2015-05-01 07:47:27 DEBUG Configuration:227 - java.io.IOException: config() 
        at org.apache.hadoop.conf.Configuration.<init>(Configuration.java:227) 
        at org.apache.hadoop.conf.Configuration.<init>(Configuration.java:214) 
        at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:184) 
        at org.apache.hadoop.security.UserGroupInformation.isSecurityEnabled(UserGroupInformation.java:236) 
        at org.apache.hadoop.security.KerberosName.<clinit>(KerberosName.java:79) 
        at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:209) 
        at org.apache.hadoop.security.UserGroupInformation.setConfiguration(UserGroupInformation.java:226) 
        at org.apache.spark.deploy.SparkHadoopUtil.<init>(SparkHadoopUtil.scala:44) 
        at org.apache.spark.deploy.SparkHadoopUtil$.<init>(SparkHadoopUtil.scala:220) 
        at org.apache.spark.deploy.SparkHadoopUtil$.<clinit>(SparkHadoopUtil.scala) 
        at org.apache.spark.executor.CoarseGrainedExecutorBackend$.run(CoarseGrainedExecutorBackend.scala:128) 
        at org.apache.spark.executor.CoarseGrainedExecutorBackend$.main(CoarseGrainedExecutorBackend.scala:224) 
        at org.apache.spark.executor.CoarseGrainedExecutorBackend.main(CoarseGrainedExecutorBackend.scala) 

2015-05-01 07:47:27 DEBUG SparkHadoopUtil:63 - running as user: hadoop 
2015-05-01 07:47:27 DEBUG UserGroupInformation:146 - hadoop login 
2015-05-01 07:47:27 DEBUG UserGroupInformation:95 - hadoop login commit 
2015-05-01 07:47:27 DEBUG UserGroupInformation:125 - using local user:UnixPrincipal: root 
2015-05-01 07:47:27 DEBUG UserGroupInformation:493 - UGI loginUser:root 
2015-05-01 07:47:27 DEBUG UserGroupInformation:1143 - PriviledgedAction as:hadoop from:org.apache.spark.deploy.SparkHadoopUtil.runAsSparkUser(SparkHadoopUtil.scala:59) 
2015-05-01 07:47:27 INFO  SecurityManager:59 - Changing view acls to: root,hadoop 
2015-05-01 07:47:27 INFO  SecurityManager:59 - Changing modify acls to: root,hadoop 
2015-05-01 07:47:27 INFO  SecurityManager:59 - SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root, hadoop); users with modify permissions: Set(root, hadoop) 
2015-05-01 07:47:27 DEBUG SecurityManager:63 - SSLConfiguration for file server: SSLOptions{enabled=false, keyStore=None, keyStorePassword=None, trustStore=None, trustStorePassword=None, protocol=None, enabledAlgorithms=Set()} 
2015-05-01 07:47:27 DEBUG SecurityManager:63 - SSLConfiguration for Akka: SSLOptions{enabled=false, keyStore=None, keyStorePassword=None, trustStore=None, trustStorePassword=None, protocol=None, enabledAlgorithms=Set()} 
2015-05-01 07:47:27 DEBUG AkkaUtils:63 - In createActorSystem, requireCookie is: off 
2015-05-01 07:47:28 INFO  Slf4jLogger:80 - Slf4jLogger started 
2015-05-01 07:47:28 INFO  Remoting:74 - Starting remoting 
2015-05-01 07:47:29 INFO  Remoting:74 - Remoting started; listening on addresses :[akka.tcp://driverPropsFetcher@sparkWorkerIp:49741] 
2015-05-01 07:47:29 INFO  Utils:59 - Successfully started service 'driverPropsFetcher' on port 49741. 
2015-05-01 07:47:29 INFO  RemoteActorRefProvider$RemotingTerminator:74 - Shutting down remote daemon. 
2015-05-01 07:47:29 INFO  RemoteActorRefProvider$RemotingTerminator:74 - Remote daemon shut down; proceeding with flushing remote transports. 
2015-05-01 07:47:29 INFO  SecurityManager:59 - Changing view acls to: root,hadoop 
2015-05-01 07:47:29 INFO  SecurityManager:59 - Changing modify acls to: root,hadoop 
2015-05-01 07:47:29 INFO  SecurityManager:59 - SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root, hadoop); users with modify permissions: Set(root, hadoop) 
2015-05-01 07:47:29 DEBUG SecurityManager:63 - SSLConfiguration for file server: SSLOptions{enabled=false, keyStore=None, keyStorePassword=None, trustStore=None, trustStorePassword=None, protocol=None, enabledAlgorithms=Set()} 
2015-05-01 07:47:29 DEBUG SecurityManager:63 - SSLConfiguration for Akka: SSLOptions{enabled=false, keyStore=None, keyStorePassword=None, trustStore=None, trustStorePassword=None, protocol=None, enabledAlgorithms=Set()} 
2015-05-01 07:47:29 DEBUG AkkaUtils:63 - In createActorSystem, requireCookie is: off 
2015-05-01 07:47:29 INFO  RemoteActorRefProvider$RemotingTerminator:74 - Remoting shut down. 
2015-05-01 07:47:29 INFO  Slf4jLogger:80 - Slf4jLogger started 
2015-05-01 07:47:29 INFO  Remoting:74 - Starting remoting 
2015-05-01 07:47:29 INFO  Remoting:74 - Remoting started; listening on addresses :[akka.tcp://sparkExecutor@sparkWorkerIp:45299]
2015-05-01 07:47:29 INFO  Utils:59 - Successfully started service 'sparkExecutor' on port 45299.
2015-05-01 07:47:29 DEBUG SparkEnv:63 - Using serializer: class org.apache.spark.serializer.JavaSerializer
2015-05-01 07:47:29 INFO  AkkaUtils:59 - Connecting to MapOutputTracker: akka.tcp://sparkDriver@sparkMasterIp:39176/user/MapOutputTracker
2015-05-01 07:47:30 INFO  AkkaUtils:59 - Connecting to BlockManagerMaster: akka.tcp://sparkDriver@sparkMasterIp:39176/user/BlockManagerMaster
2015-05-01 07:47:30 INFO  DiskBlockManager:59 - Created local directory at /mnt/spark/spark-d745cbac-d1cc-47ee-9eba-e99e104732d5/spark-e3963fa3-cab6-4c69-8e78-d23246250a5d/spark-6f1a9653-86fd-401f-bf37-6eca5b6c0adf/blockmgr-ee0e9452-4111-42d0-ab5e-e66317052e4b
2015-05-01 07:47:30 INFO  MemoryStore:59 - MemoryStore started with capacity 548.5 MB
2015-05-01 07:47:30 INFO  AkkaUtils:59 - Connecting to OutputCommitCoordinator: akka.tcp://sparkDriver@sparkMasterIp:39176/user/OutputCommitCoordinator
2015-05-01 07:47:30 INFO  CoarseGrainedExecutorBackend:59 - Connecting to driver: akka.tcp://sparkDriver@sparkMasterIp:39176/user/CoarseGrainedScheduler
2015-05-01 07:47:30 INFO  WorkerWatcher:59 - Connecting to worker akka.tcp://sparkWorker@sparkWorkerIp:38610/user/Worker
2015-05-01 07:47:30 DEBUG WorkerWatcher:50 - [actor] received message Associated [akka.tcp://sparkExecutor@sparkWorkerIp:45299] -> [akka.tcp://sparkWorker@sparkWorkerIp:38610] from Actor[akka://sparkExecutor/deadLetters]
2015-05-01 07:47:30 INFO  WorkerWatcher:59 - Successfully connected to akka.tcp://sparkWorker@sparkWorkerIp:38610/user/Worker
2015-05-01 07:47:30 DEBUG WorkerWatcher:56 - [actor] handled message (1.18794 ms) Associated [akka.tcp://sparkExecutor@sparkWorkerIp:45299] -> [akka.tcp://sparkWorker@sparkWorkerIp:38610] from Actor[akka://sparkExecutor/deadLetters]
2015-05-01 07:47:30 DEBUG CoarseGrainedExecutorBackend:50 - [actor] received message RegisteredExecutor from Actor[akka.tcp://sparkDriver@sparkMasterIp:39176/user/CoarseGrainedScheduler#-970636338]
2015-05-01 07:47:30 INFO  CoarseGrainedExecutorBackend:59 - Successfully registered with driver 
2015-05-01 07:47:30 INFO  Executor:59 - Starting executor ID 0 on host sparkWorkerIp 
2015-05-01 07:47:30 DEBUG InternalLoggerFactory:71 - Using SLF4J as the default logging framework 
2015-05-01 07:47:30 DEBUG PlatformDependent0:76 - java.nio.Buffer.address: available 
2015-05-01 07:47:30 DEBUG PlatformDependent0:76 - sun.misc.Unsafe.theUnsafe: available 
2015-05-01 07:47:30 DEBUG PlatformDependent0:71 - sun.misc.Unsafe.copyMemory: available 
2015-05-01 07:47:30 DEBUG PlatformDependent0:76 - java.nio.Bits.unaligned: true 
2015-05-01 07:47:30 DEBUG PlatformDependent:76 - UID: 0 
2015-05-01 07:47:30 DEBUG PlatformDependent:76 - Java version: 7 
2015-05-01 07:47:30 DEBUG PlatformDependent:76 - -Dio.netty.noUnsafe: false 
2015-05-01 07:47:30 DEBUG PlatformDependent:76 - sun.misc.Unsafe: available 
2015-05-01 07:47:30 DEBUG PlatformDependent:76 - -Dio.netty.noJavassist: false 
2015-05-01 07:47:30 DEBUG PlatformDependent:71 - Javassist: unavailable 
2015-05-01 07:47:30 DEBUG PlatformDependent:71 - You don't have Javassist in your class path or you don't have enough permission to load dynamically generated classes.  Please check the configuration for better performance. 
2015-05-01 07:47:30 DEBUG PlatformDependent:76 - -Dio.netty.tmpdir: /tmp (java.io.tmpdir) 
2015-05-01 07:47:30 DEBUG PlatformDependent:76 - -Dio.netty.bitMode: 64 (sun.arch.data.model) 
2015-05-01 07:47:30 DEBUG PlatformDependent:76 - -Dio.netty.noPreferDirect: false 
2015-05-01 07:47:30 DEBUG MultithreadEventLoopGroup:76 - -Dio.netty.eventLoopThreads: 2 
2015-05-01 07:47:30 DEBUG NioEventLoop:76 - -Dio.netty.noKeySetOptimization: false 
2015-05-01 07:47:30 DEBUG NioEventLoop:76 - -Dio.netty.selectorAutoRebuildThreshold: 512 
2015-05-01 07:47:30 DEBUG PooledByteBufAllocator:76 - -Dio.netty.allocator.numHeapArenas: 1 
2015-05-01 07:47:30 DEBUG PooledByteBufAllocator:76 - -Dio.netty.allocator.numDirectArenas: 1 
2015-05-01 07:47:30 DEBUG PooledByteBufAllocator:76 - -Dio.netty.allocator.pageSize: 8192 
2015-05-01 07:47:30 DEBUG PooledByteBufAllocator:76 - -Dio.netty.allocator.maxOrder: 11 
2015-05-01 07:47:30 DEBUG PooledByteBufAllocator:76 - -Dio.netty.allocator.chunkSize: 16777216 
2015-05-01 07:47:30 DEBUG PooledByteBufAllocator:76 - -Dio.netty.allocator.tinyCacheSize: 512 
2015-05-01 07:47:30 DEBUG PooledByteBufAllocator:76 - -Dio.netty.allocator.smallCacheSize: 256 
2015-05-01 07:47:30 DEBUG PooledByteBufAllocator:76 - -Dio.netty.allocator.normalCacheSize: 64 
2015-05-01 07:47:30 DEBUG PooledByteBufAllocator:76 - -Dio.netty.allocator.maxCachedBufferCapacity: 32768 
2015-05-01 07:47:30 DEBUG PooledByteBufAllocator:76 - -Dio.netty.allocator.cacheTrimInterval: 8192 
2015-05-01 07:47:30 DEBUG ThreadLocalRandom:71 - -Dio.netty.initialSeedUniquifier: 0x4ac460da6a283b82 (took 1 ms) 
2015-05-01 07:47:31 DEBUG ByteBufUtil:76 - -Dio.netty.allocator.type: unpooled 
2015-05-01 07:47:31 DEBUG ByteBufUtil:76 - -Dio.netty.threadLocalDirectBufferSize: 65536 
2015-05-01 07:47:31 DEBUG NetUtil:86 - Loopback interface: lo (lo, 0:0:0:0:0:0:0:1%1) 
2015-05-01 07:47:31 DEBUG NetUtil:81 - /proc/sys/net/core/somaxconn: 128 
2015-05-01 07:47:31 DEBUG TransportServer:106 - Shuffle server started on port :46839 
2015-05-01 07:47:31 INFO  NettyBlockTransferService:59 - Server created on 46839 
2015-05-01 07:47:31 INFO  BlockManagerMaster:59 - Trying to register BlockManager 
2015-05-01 07:47:31 INFO  BlockManagerMaster:59 - Registered BlockManager 
2015-05-01 07:47:31 INFO  AkkaUtils:59 - Connecting to HeartbeatReceiver: akka.tcp://sparkDriver@sparkMasterIp:39176/user/HeartbeatReceiver
2015-05-01 07:47:31 DEBUG CoarseGrainedExecutorBackend:56 - [actor] handled message (339.232401 ms) RegisteredExecutor from Actor[akka.tcp://sparkDriver@sparkMasterIp:39176/user/CoarseGrainedScheduler#-970636338]
2015-05-01 07:51:52 DEBUG CoarseGrainedExecutorBackend:50 - [actor] received message LaunchTask(org.apache.spark.util.SerializableBuffer@608752bf) from Actor[akka.tcp://sparkDriver@sparkMasterIp:39176/user/CoarseGrainedScheduler#-970636338]
2015-05-01 07:51:52 INFO  CoarseGrainedExecutorBackend:59 - Got assigned task 0
2015-05-01 07:51:52 DEBUG CoarseGrainedExecutorBackend:56 - [actor] handled message (22.96474 ms) LaunchTask(org.apache.spark.util.SerializableBuffer@608752bf) from Actor[akka.tcp://sparkDriver@sparkMasterIp:39176/user/CoarseGrainedScheduler#-970636338]
2015-05-01 07:51:52 INFO  Executor:59 - Running task 0.0 in stage 0.0 (TID 0) 
2015-05-01 07:51:52 INFO  Executor:59 - Fetching http://sparkMasterIp:47166/jars/genconn-8fcc3a5.jar with timestamp 1430466444838 
2015-05-01 07:51:52 DEBUG Configuration:227 - java.io.IOException: config() 
        at org.apache.hadoop.conf.Configuration.<init>(Configuration.java:227) 
        at org.apache.hadoop.conf.Configuration.<init>(Configuration.java:214) 
        at org.apache.spark.deploy.SparkHadoopUtil.newConfiguration(SparkHadoopUtil.scala:78) 
        at org.apache.spark.executor.Executor.hadoopConf$lzycompute$1(Executor.scala:356) 
        at org.apache.spark.executor.Executor.org$apache$spark$executor$Executor$$hadoopConf$1(Executor.scala:356) 
        at org.apache.spark.executor.Executor$$anonfun$org$apache$spark$executor$Executor$$updateDependencies$5.apply(Executor.scala:375) 
        at org.apache.spark.executor.Executor$$anonfun$org$apache$spark$executor$Executor$$updateDependencies$5.apply(Executor.scala:366) 
        at scala.collection.TraversableLike$WithFilter$$anonfun$foreach$1.apply(TraversableLike.scala:772) 
        at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:98) 
        at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:98) 
        at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:226) 
        at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:39) 
        at scala.collection.mutable.HashMap.foreach(HashMap.scala:98) 
        at scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:771) 
        at org.apache.spark.executor.Executor.org$apache$spark$executor$Executor$$updateDependencies(Executor.scala:366) 
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:184) 
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) 
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) 
        at java.lang.Thread.run(Thread.java:745) 

2015-05-01 07:51:52 DEBUG Utils:63 - fetchFile not using security 
2015-05-01 07:51:52 INFO  Utils:59 - Fetching http://sparkMasterIp:47166/jars/genconn-8fcc3a5.jar to /mnt/spark/spark-d745cbac-d1cc-47ee-9eba-e99e104732d5/spark-e3963fa3-cab6-4c69-8e78-d23246250a5d/spark-0eabace1-ee89-48a3-9a71-0218f0ffc61c/fetchFileTemp2001054150131059247.tmp 
2015-05-01 07:51:52 INFO  Utils:59 - Copying /mnt/spark/spark-d745cbac-d1cc-47ee-9eba-e99e104732d5/spark-e3963fa3-cab6-4c69-8e78-d23246250a5d/spark-0eabace1-ee89-48a3-9a71-0218f0ffc61c/18615094621430466444838_cache to /mnt/spark-work/app-20150501074725-0005/0/./genconn-8fcc3a5.jar 
2015-05-01 07:51:52 INFO  Executor:59 - Adding file:/mnt/spark-work/app-20150501074725-0005/0/./genconn-8fcc3a5.jar to class loader 
2015-05-01 07:51:52 DEBUG Configuration:227 - java.io.IOException: config() 
        at org.apache.hadoop.conf.Configuration.<init>(Configuration.java:227) 
        at org.apache.hadoop.conf.Configuration.<init>(Configuration.java:214) 
        at org.apache.spark.SerializableWritable$$anonfun$readObject$1.apply$mcV$sp(SerializableWritable.scala:42) 
        at org.apache.spark.util.Utils$.tryOrIOException(Utils.scala:1137) 
        at org.apache.spark.SerializableWritable.readObject(SerializableWritable.scala:39) 
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) 
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
        at java.lang.reflect.Method.invoke(Method.java:606) 
        at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017) 
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893) 
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798) 
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350) 
        at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990) 
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915) 
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798) 
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350) 
        at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990) 
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915) 
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798) 
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350) 
        at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370) 
        at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:68) 
        at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:94) 
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:185) 
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) 
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) 
        at java.lang.Thread.run(Thread.java:745) 

2015-05-01 07:51:52 ERROR Executor:96 - Exception in task 0.0 in stage 0.0 (TID 0) 
*the error that is printed in the driver log*