
spark-submit not downloading Maven packages; job fails with ClassNotFoundException


My spark-submit script specifies --packages with a list of Maven coordinates for my dependencies, but it never places any JARs on the workers' filesystem; I can't find my dependencies in the application or driver folders. In the logs I can see Maven activity, but the Spark job ends up with file paths relative to my development environment. Without these dependencies the job fails with a ClassNotFoundException. Is --packages able to download the resources I need, or should I abandon this strategy and switch to uber-jars?
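
If uber-jars turn out to be the answer, this is roughly what I'd try with sbt-assembly (a sketch only; the plugin version and the "provided" scoping are my assumptions, not something already in my build):

// project/plugins.sbt -- assumption: any recent 0.14.x release of sbt-assembly
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.3")

// build.sbt (sketch): Spark itself ships with the cluster, so it is marked
// "provided"; the Kafka and Elasticsearch connectors get bundled into the
// fat jar so the driver can load them without --packages.
name := "lalala-streaming"
version := "1.0"
scalaVersion := "2.11.8"

libraryDependencies ++= Seq(
  "org.apache.spark"  %% "spark-core"                 % "2.0.1" % "provided",
  "org.apache.spark"  %% "spark-streaming"            % "2.0.1" % "provided",
  "org.apache.spark"  %% "spark-streaming-kafka-0-10" % "2.0.1",
  "org.elasticsearch" %% "elasticsearch-spark-20"     % "5.0.0"
)

sbt assembly would then produce target/scala-2.11/lalala-streaming-assembly-1.0.jar, which would replace the sbt package artifact in the submit script further down.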

I'm running a small single-host Spark cluster: a colocated master/worker pair while I learn the ropes. The base for my Spark images:

FROM openjdk:8-jre

RUN apt-get update && apt-get install -y python git

# 2.0 maintenance branch with stability fixes on top of Spark 2.0.1
RUN git clone git://github.com/apache/spark.git -b branch-2.0 && cd spark \
    && ./build/mvn -P
The master config on top of that base:

FROM registry.cloud/spark

CMD ["./bin/spark-class", "org.apache.spark.deploy.master.Master"]
And the worker config, on the same base:

FROM registry.cloud/spark

CMD ["./bin/spark-class", "org.apache.spark.deploy.master.Master"]
FROM registry.cloud/spark

EXPOSE 8081
CMD ["./bin/spark-class", "org.apache.spark.deploy.worker.Worker", "--host", "10.60.68.55", "--webui-port", "8081", "--memory", "16G", "spark://10.60.68.55:7077"]
A simple job written in Scala:

import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf

import org.apache.kafka.clients.consumer.ConsumerRecord
import org.apache.kafka.common.serialization.StringDeserializer
import org.apache.spark.streaming.kafka010._
import org.apache.spark.streaming.kafka010.LocationStrategies.PreferConsistent
import org.apache.spark.streaming.kafka010.ConsumerStrategies.Subscribe

import org.apache.spark.streaming.StreamingContext
import org.apache.spark.streaming.Seconds

object SimpleKafkaReader {
  def main(args: Array[String]) {
    println("Starting Xavier's Cool Kafka Reader!")

    val conf = new SparkConf().setAppName("Simple Application")
    conf.set("es.index.auto.create", "true")

    val sc = new SparkContext(conf)
    // StreamingContext with 5-second micro-batches
    val scc = new StreamingContext(sc, Seconds(5))

    // Plain kafka-clients 0.10 consumer settings; StringDeserializer is
    // the class the driver later fails to load
    val kafkaParams = Map[String, Object](
      "bootstrap.servers" -> "my-kafka-server:9092",
      "key.deserializer" -> classOf[StringDeserializer],
      "value.deserializer" -> classOf[StringDeserializer],
      "group.id" -> "example",
      "auto.offset.reset" -> "earliest",
      "enable.auto.commit" -> (false: java.lang.Boolean)
    )

    val topics = Array("windows")

    val stream = KafkaUtils.createDirectStream[String, String](
      scc,
      PreferConsistent,
      Subscribe[String, String](topics, kafkaParams)
    )

    val mappy_thing = stream.map(record => (record.key, record.value))

    mappy_thing.print(1)

    scc.start()
    scc.awaitTermination()
    scc.stop()
  }
}
My simple submit script:

#!/bin/bash
set -ex

sbt package

openstack object create spark target/scala-2.11/lalala-streaming_2.11-1.0.jar
JAR_URL=https://storage.cloud:8080/v1/AUTH_keything/spark/target/scala-2.11/lalala-streaming_2.11-1.0.jar

STREAMING=org.apache.spark:spark-streaming_2.11:2.0.1
STREAMING_KAFKA=org.apache.spark:spark-streaming-kafka-0-10_2.11:2.0.1
ES_SPARK=org.elasticsearch:elasticsearch-spark-20_2.11:5.0.0

spark-submit \
    --packages $STREAMING,$STREAMING_KAFKA,$ES_SPARK \
    --class SimpleKafkaReader \
    --master spark://10.60.68.55:6066 \
    --deploy-mode cluster \
    --verbose \
    $JAR_URL
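
The one variation I can think of is client mode, where the driver stays on my laptop, so the file:/Users/... paths that --packages resolves into ~/.ivy2/jars are actually readable and the executors can fetch those jars from the driver (a sketch, reusing the variables above and the locally built jar):

# Sketch: client mode against the master's 7077 port rather than the 6066
# REST port used for cluster mode. The driver runs locally, where the
# Ivy-resolved jars exist, and executors download them from the driver.
spark-submit \
    --packages $STREAMING,$STREAMING_KAFKA,$ES_SPARK \
    --class SimpleKafkaReader \
    --master spark://10.60.68.55:7077 \
    --deploy-mode client \
    --verbose \
    target/scala-2.11/lalala-streaming_2.11-1.0.jar
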
The submission log:

lalala-streaming xlange$ ./submit.sh
+ sbt package
[info] Loading global plugins from /Users/xlange/.sbt/0.13/plugins
[info] Loading project definition from /Users/xlange/IdeaProjects/lalala-streaming/project
[info] Set current project to lalala-streaming (in build file:/Users/xlange/IdeaProjects/lalala-streaming/)
[warn] Multiple main classes detected.  Run 'show discoveredMainClasses' to see the list
[success] Total time: 0 s, completed Nov 7, 2016 2:12:53 PM
+ openstack object create spark target/scala-2.11/lalala-streaming_2.11-1.0.jar
+-------------------------------------------------+-----------+----------------------------------+
| object                                          | container | etag                             |
+-------------------------------------------------+-----------+----------------------------------+
| target/scala-2.11/lalala-streaming_2.11-1.0.jar | spark     | 36b7169228f346d28a7b0ab61a3046cd |
+-------------------------------------------------+-----------+----------------------------------+
+ JAR_URL=https://storage.cloud:8080/v1/AUTH_keything/spark/target/scala-2.11/lalala-streaming_2.11-1.0.jar
+ STREAMING=org.apache.spark:spark-streaming_2.11:2.0.1
+ STREAMING_KAFKA=org.apache.spark:spark-streaming-kafka-0-10_2.11:2.0.1
+ ES_SPARK=org.elasticsearch:elasticsearch-spark-20_2.11:5.0.0
+ spark-submit --packages org.apache.spark:spark-streaming_2.11:2.0.1,org.apache.spark:spark-streaming-kafka-0-10_2.11:2.0.1,org.elasticsearch:elasticsearch-spark-20_2.11:5.0.0 --class SimpleKafkaReader --master spark://10.60.68.55:6066 --deploy-mode cluster --verbose https://storage.cloud:8080/v1/AUTH_keything/spark/target/scala-2.11/lalala-streaming_2.11-1.0.jar
Using properties file: null
Parsed arguments:
  master                  spark://10.60.68.55:6066
  deployMode              cluster
  executorMemory          null
  executorCores           null
  totalExecutorCores      null
  propertiesFile          null
  driverMemory            null
  driverCores             null
  driverExtraClassPath    null
  driverExtraLibraryPath  null
  driverExtraJavaOptions  null
  supervise               false
  queue                   null
  numExecutors            null
  files                   null
  pyFiles                 null
  archives                null
  mainClass               SimpleKafkaReader
  primaryResource         https://storage.cloud:8080/v1/AUTH_keything/spark/target/scala-2.11/lalala-streaming_2.11-1.0.jar
  name                    SimpleKafkaReader
  childArgs               []
  jars                    null
  packages                org.apache.spark:spark-streaming_2.11:2.0.1,org.apache.spark:spark-streaming-kafka-0-10_2.11:2.0.1,org.elasticsearch:elasticsearch-spark-20_2.11:5.0.0
  packagesExclusions      null
  repositories            null
  verbose                 true

Spark properties used, including those specified through
 --conf and those from the properties file null:



Ivy Default Cache set to: /Users/xlange/.ivy2/cache
The jars for the packages stored in: /Users/xlange/.ivy2/jars
:: loading settings :: url = jar:file:/usr/local/Cellar/apache-spark/2.0.1/libexec/jars/ivy-2.4.0.jar!/org/apache/ivy/core/settings/ivysettings.xml
org.apache.spark#spark-streaming_2.11 added as a dependency
org.apache.spark#spark-streaming-kafka-0-10_2.11 added as a dependency
org.elasticsearch#elasticsearch-spark-20_2.11 added as a dependency
:: resolving dependencies :: org.apache.spark#spark-submit-parent;1.0
        confs: [default]
        found org.apache.spark#spark-streaming-kafka-0-10_2.11;2.0.1 in central
        found org.apache.kafka#kafka_2.11;0.10.0.1 in list
        found com.101tec#zkclient;0.8 in list
        found org.slf4j#slf4j-api;1.7.16 in list
        found org.slf4j#slf4j-log4j12;1.7.16 in list
        found log4j#log4j;1.2.17 in list
        found com.yammer.metrics#metrics-core;2.2.0 in list
        found org.scala-lang.modules#scala-parser-combinators_2.11;1.0.4 in list
        found org.apache.kafka#kafka-clients;0.10.0.1 in list
        found net.jpountz.lz4#lz4;1.3.0 in list
        found org.xerial.snappy#snappy-java;1.1.2.6 in list
        found org.apache.spark#spark-tags_2.11;2.0.1 in central
        found org.scalatest#scalatest_2.11;2.2.6 in list
        found org.scala-lang#scala-reflect;2.11.8 in list
        [2.11.8] org.scala-lang#scala-reflect;2.11.8
        found org.scala-lang.modules#scala-xml_2.11;1.0.2 in list
        found org.spark-project.spark#unused;1.0.0 in list
        found org.elasticsearch#elasticsearch-spark-20_2.11;5.0.0 in list
:: resolution report :: resolve 3145ms :: artifacts dl 9ms
        :: modules in use:
        com.101tec#zkclient;0.8 from list in [default]
        com.yammer.metrics#metrics-core;2.2.0 from list in [default]
        log4j#log4j;1.2.17 from list in [default]
        net.jpountz.lz4#lz4;1.3.0 from list in [default]
        org.apache.kafka#kafka-clients;0.10.0.1 from list in [default]
        org.apache.kafka#kafka_2.11;0.10.0.1 from list in [default]
        org.apache.spark#spark-streaming-kafka-0-10_2.11;2.0.1 from central in [default]
        org.apache.spark#spark-tags_2.11;2.0.1 from central in [default]
        org.elasticsearch#elasticsearch-spark-20_2.11;5.0.0 from list in [default]
        org.scala-lang#scala-reflect;2.11.8 from list in [default]
        org.scala-lang.modules#scala-parser-combinators_2.11;1.0.4 from list in [default]
        org.scala-lang.modules#scala-xml_2.11;1.0.2 from list in [default]
        org.scalatest#scalatest_2.11;2.2.6 from list in [default]
        org.slf4j#slf4j-api;1.7.16 from list in [default]
        org.slf4j#slf4j-log4j12;1.7.16 from list in [default]
        org.spark-project.spark#unused;1.0.0 from list in [default]
        org.xerial.snappy#snappy-java;1.1.2.6 from list in [default]
        ---------------------------------------------------------------------
        |                  |            modules            ||   artifacts   |
        |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
        ---------------------------------------------------------------------
        |      default     |   17  |   2   |   2   |   0   ||   17  |   0   |
        ---------------------------------------------------------------------

:: problems summary ::
:::: ERRORS
        unknown resolver null


:: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS
:: retrieving :: org.apache.spark#spark-submit-parent
        confs: [default]
        0 artifacts copied, 17 already retrieved (0kB/9ms)
Running Spark using the REST application submission protocol.
Main class:
org.apache.spark.deploy.rest.RestSubmissionClient
Arguments:
https://storage.cloud:8080/v1/AUTH_keything/spark/target/scala-2.11/lalala-streaming_2.11-1.0.jar
SimpleKafkaReader
System properties:
SPARK_SUBMIT -> true
spark.driver.supervise -> false
spark.app.name -> SimpleKafkaReader
spark.jars -> file:/Users/xlange/.ivy2/jars/org.apache.spark_spark-streaming-kafka-0-10_2.11-2.0.1.jar,file:/Users/xlange/.ivy2/jars/org.elasticsearch_elasticsearch-spark-20_2.11-5.0.0.jar,file:/Users/xlange/.ivy2/jars/org.apache.kafka_kafka_2.11-0.10.0.1.jar,file:/Users/xlange/.ivy2/jars/org.apache.spark_spark-tags_2.11-2.0.1.jar,file:/Users/xlange/.ivy2/jars/org.spark-project.spark_unused-1.0.0.jar,file:/Users/xlange/.ivy2/jars/com.101tec_zkclient-0.8.jar,file:/Users/xlange/.ivy2/jars/org.slf4j_slf4j-log4j12-1.7.16.jar,file:/Users/xlange/.ivy2/jars/com.yammer.metrics_metrics-core-2.2.0.jar,file:/Users/xlange/.ivy2/jars/org.scala-lang.modules_scala-parser-combinators_2.11-1.0.4.jar,file:/Users/xlange/.ivy2/jars/org.apache.kafka_kafka-clients-0.10.0.1.jar,file:/Users/xlange/.ivy2/jars/org.slf4j_slf4j-api-1.7.16.jar,file:/Users/xlange/.ivy2/jars/log4j_log4j-1.2.17.jar,file:/Users/xlange/.ivy2/jars/net.jpountz.lz4_lz4-1.3.0.jar,file:/Users/xlange/.ivy2/jars/org.xerial.snappy_snappy-java-1.1.2.6.jar,file:/Users/xlange/.ivy2/jars/org.scalatest_scalatest_2.11-2.2.6.jar,file:/Users/xlange/.ivy2/jars/org.scala-lang_scala-reflect-2.11.8.jar,file:/Users/xlange/.ivy2/jars/org.scala-lang.modules_scala-xml_2.11-1.0.2.jar,https://storage.cloud:8080/v1/AUTH_keything/spark/target/scala-2.11/lalala-streaming_2.11-1.0.jar
spark.submit.deployMode -> cluster
spark.master -> spark://10.60.68.55:6066
Classpath elements:



Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
16/11/07 14:13:00 INFO RestSubmissionClient: Submitting a request to launch an application in spark://10.60.68.55:6066.
16/11/07 14:13:01 INFO RestSubmissionClient: Submission successfully created as driver-20161107221301-0011. Polling submission state...
16/11/07 14:13:01 INFO RestSubmissionClient: Submitting a request for the status of submission driver-20161107221301-0011 in spark://10.60.68.55:6066.
16/11/07 14:13:01 INFO RestSubmissionClient: State of driver driver-20161107221301-0011 is now RUNNING.
16/11/07 14:13:01 INFO RestSubmissionClient: Driver is running on worker worker-20161101181541-10.60.68.55-46791 at 10.60.68.55:46791.
16/11/07 14:13:01 INFO RestSubmissionClient: Server responded with CreateSubmissionResponse:
{
  "action" : "CreateSubmissionResponse",
  "message" : "Driver successfully submitted as driver-20161107221301-0011",
  "serverSparkVersion" : "2.0.1",
  "submissionId" : "driver-20161107221301-0011",
  "success" : true
}
+ exit 0
The Spark master web UI shows:

app-20161107221304-0056 Simple Application  7   1024.0 MB   2016/11/07 22:13:04 root    FINISHED    0.3 s
On the worker, I found the driver's stderr:

Launch Command: "/usr/lib/jvm/java-8-openjdk-amd64/jre/bin/java" "-cp" "/usr/share/spark-2.0.1-bin-hadoop2.7/conf/:/usr/share/spark-2.0.1-bin-hadoop2.7/jars/*" "-Xmx1024M" "-Dspark.master=spark://10.60.68.55:7077" "-Dspark.app.name=SimpleKafkaReader" "-Dspark.driver.supervise=false" "-Dspark.submit.deployMode=cluster" "-Dspark.jars=file:/Users/xlange/.ivy2/jars/org.apache.spark_spark-streaming-kafka-0-10_2.11-2.0.1.jar,file:/Users/xlange/.ivy2/jars/org.elasticsearch_elasticsearch-spark-20_2.11-5.0.0.jar,file:/Users/xlange/.ivy2/jars/org.apache.kafka_kafka_2.11-0.10.0.1.jar,file:/Users/xlange/.ivy2/jars/org.apache.spark_spark-tags_2.11-2.0.1.jar,file:/Users/xlange/.ivy2/jars/org.spark-project.spark_unused-1.0.0.jar,file:/Users/xlange/.ivy2/jars/com.101tec_zkclient-0.8.jar,file:/Users/xlange/.ivy2/jars/org.slf4j_slf4j-log4j12-1.7.16.jar,file:/Users/xlange/.ivy2/jars/com.yammer.metrics_metrics-core-2.2.0.jar,file:/Users/xlange/.ivy2/jars/org.scala-lang.modules_scala-parser-combinators_2.11-1.0.4.jar,file:/Users/xlange/.ivy2/jars/org.apache.kafka_kafka-clients-0.10.0.1.jar,file:/Users/xlange/.ivy2/jars/org.slf4j_slf4j-api-1.7.16.jar,file:/Users/xlange/.ivy2/jars/log4j_log4j-1.2.17.jar,file:/Users/xlange/.ivy2/jars/net.jpountz.lz4_lz4-1.3.0.jar,file:/Users/xlange/.ivy2/jars/org.xerial.snappy_snappy-java-1.1.2.6.jar,file:/Users/xlange/.ivy2/jars/org.scalatest_scalatest_2.11-2.2.6.jar,file:/Users/xlange/.ivy2/jars/org.scala-lang_scala-reflect-2.11.8.jar,file:/Users/xlange/.ivy2/jars/org.scala-lang.modules_scala-xml_2.11-1.0.2.jar,https://storage.cloud:8080/v1/AUTH_keything/spark/target/scala-2.11/lalala-streaming_2.11-1.0.jar" "org.apache.spark.deploy.worker.DriverWrapper" "spark://Worker@10.60.68.55:46791" "/usr/share/spark-2.0.1-bin-hadoop2.7/work/driver-20161107221301-0011/lalala-streaming_2.11-1.0.jar" "SimpleKafkaReader"
========================================

Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
16/11/07 22:13:02 WARN Utils: Your hostname, scb-spark-01.openstacklocal resolves to a loopback address: 127.0.0.1; using 10.60.68.55 instead (on interface eth0)
16/11/07 22:13:02 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
16/11/07 22:13:03 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
16/11/07 22:13:03 INFO SecurityManager: Changing view acls to: root
16/11/07 22:13:03 INFO SecurityManager: Changing modify acls to: root
16/11/07 22:13:03 INFO SecurityManager: Changing view acls groups to: 
16/11/07 22:13:03 INFO SecurityManager: Changing modify acls groups to: 
16/11/07 22:13:03 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(root); groups with view permissions: Set(); users  with modify permissions: Set(root); groups with modify permissions: Set()
16/11/07 22:13:03 INFO Utils: Successfully started service 'Driver' on port 38125.
16/11/07 22:13:03 INFO WorkerWatcher: Connecting to worker spark://Worker@10.60.68.55:46791
16/11/07 22:13:03 INFO SparkContext: Running Spark version 2.0.1
16/11/07 22:13:03 INFO TransportClientFactory: Successfully created connection to /10.60.68.55:46791 after 45 ms (0 ms spent in bootstraps)
16/11/07 22:13:03 INFO WorkerWatcher: Successfully connected to spark://Worker@10.60.68.55:46791
16/11/07 22:13:03 INFO SecurityManager: Changing view acls to: root
16/11/07 22:13:03 INFO SecurityManager: Changing modify acls to: root
16/11/07 22:13:03 INFO SecurityManager: Changing view acls groups to: 
16/11/07 22:13:03 INFO SecurityManager: Changing modify acls groups to: 
16/11/07 22:13:03 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(root); groups with view permissions: Set(); users  with modify permissions: Set(root); groups with modify permissions: Set()
16/11/07 22:13:03 INFO Utils: Successfully started service 'sparkDriver' on port 42969.
16/11/07 22:13:03 INFO SparkEnv: Registering MapOutputTracker
16/11/07 22:13:03 INFO SparkEnv: Registering BlockManagerMaster
16/11/07 22:13:03 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-93eb35b2-5378-4e36-87c1-2733ea47e5cb
16/11/07 22:13:03 INFO MemoryStore: MemoryStore started with capacity 366.3 MB
16/11/07 22:13:04 INFO SparkEnv: Registering OutputCommitCoordinator
16/11/07 22:13:04 INFO Utils: Successfully started service 'SparkUI' on port 4040.
16/11/07 22:13:04 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://10.60.68.55:4040
16/11/07 22:13:04 INFO SparkContext: Added JAR file:/Users/xlange/.ivy2/jars/org.apache.spark_spark-streaming-kafka-0-10_2.11-2.0.1.jar at spark://10.60.68.55:42969/jars/org.apache.spark_spark-streaming-kafka-0-10_2.11-2.0.1.jar with timestamp 1478556784389
16/11/07 22:13:04 INFO SparkContext: Added JAR file:/Users/xlange/.ivy2/jars/org.elasticsearch_elasticsearch-spark-20_2.11-5.0.0.jar at spark://10.60.68.55:42969/jars/org.elasticsearch_elasticsearch-spark-20_2.11-5.0.0.jar with timestamp 1478556784390
16/11/07 22:13:04 INFO SparkContext: Added JAR file:/Users/xlange/.ivy2/jars/org.apache.kafka_kafka_2.11-0.10.0.1.jar at spark://10.60.68.55:42969/jars/org.apache.kafka_kafka_2.11-0.10.0.1.jar with timestamp 1478556784391
16/11/07 22:13:04 INFO SparkContext: Added JAR file:/Users/xlange/.ivy2/jars/org.apache.spark_spark-tags_2.11-2.0.1.jar at spark://10.60.68.55:42969/jars/org.apache.spark_spark-tags_2.11-2.0.1.jar with timestamp 1478556784391
16/11/07 22:13:04 INFO SparkContext: Added JAR file:/Users/xlange/.ivy2/jars/org.spark-project.spark_unused-1.0.0.jar at spark://10.60.68.55:42969/jars/org.spark-project.spark_unused-1.0.0.jar with timestamp 1478556784391
16/11/07 22:13:04 INFO SparkContext: Added JAR file:/Users/xlange/.ivy2/jars/com.101tec_zkclient-0.8.jar at spark://10.60.68.55:42969/jars/com.101tec_zkclient-0.8.jar with timestamp 1478556784391
16/11/07 22:13:04 INFO SparkContext: Added JAR file:/Users/xlange/.ivy2/jars/org.slf4j_slf4j-log4j12-1.7.16.jar at spark://10.60.68.55:42969/jars/org.slf4j_slf4j-log4j12-1.7.16.jar with timestamp 1478556784392
16/11/07 22:13:04 INFO SparkContext: Added JAR file:/Users/xlange/.ivy2/jars/com.yammer.metrics_metrics-core-2.2.0.jar at spark://10.60.68.55:42969/jars/com.yammer.metrics_metrics-core-2.2.0.jar with timestamp 1478556784392
16/11/07 22:13:04 INFO SparkContext: Added JAR file:/Users/xlange/.ivy2/jars/org.scala-lang.modules_scala-parser-combinators_2.11-1.0.4.jar at spark://10.60.68.55:42969/jars/org.scala-lang.modules_scala-parser-combinators_2.11-1.0.4.jar with timestamp 1478556784392
16/11/07 22:13:04 INFO SparkContext: Added JAR file:/Users/xlange/.ivy2/jars/org.apache.kafka_kafka-clients-0.10.0.1.jar at spark://10.60.68.55:42969/jars/org.apache.kafka_kafka-clients-0.10.0.1.jar with timestamp 1478556784393
16/11/07 22:13:04 INFO SparkContext: Added JAR file:/Users/xlange/.ivy2/jars/org.slf4j_slf4j-api-1.7.16.jar at spark://10.60.68.55:42969/jars/org.slf4j_slf4j-api-1.7.16.jar with timestamp 1478556784393
16/11/07 22:13:04 INFO SparkContext: Added JAR file:/Users/xlange/.ivy2/jars/log4j_log4j-1.2.17.jar at spark://10.60.68.55:42969/jars/log4j_log4j-1.2.17.jar with timestamp 1478556784393
16/11/07 22:13:04 INFO SparkContext: Added JAR file:/Users/xlange/.ivy2/jars/net.jpountz.lz4_lz4-1.3.0.jar at spark://10.60.68.55:42969/jars/net.jpountz.lz4_lz4-1.3.0.jar with timestamp 1478556784393
16/11/07 22:13:04 INFO SparkContext: Added JAR file:/Users/xlange/.ivy2/jars/org.xerial.snappy_snappy-java-1.1.2.6.jar at spark://10.60.68.55:42969/jars/org.xerial.snappy_snappy-java-1.1.2.6.jar with timestamp 1478556784394
16/11/07 22:13:04 INFO SparkContext: Added JAR file:/Users/xlange/.ivy2/jars/org.scalatest_scalatest_2.11-2.2.6.jar at spark://10.60.68.55:42969/jars/org.scalatest_scalatest_2.11-2.2.6.jar with timestamp 1478556784394
16/11/07 22:13:04 INFO SparkContext: Added JAR file:/Users/xlange/.ivy2/jars/org.scala-lang_scala-reflect-2.11.8.jar at spark://10.60.68.55:42969/jars/org.scala-lang_scala-reflect-2.11.8.jar with timestamp 1478556784394
16/11/07 22:13:04 INFO SparkContext: Added JAR file:/Users/xlange/.ivy2/jars/org.scala-lang.modules_scala-xml_2.11-1.0.2.jar at spark://10.60.68.55:42969/jars/org.scala-lang.modules_scala-xml_2.11-1.0.2.jar with timestamp 1478556784395
16/11/07 22:13:04 INFO SparkContext: Added JAR https://storage.cloud:8080/v1/AUTH_keything/spark/target/scala-2.11/lalala-streaming_2.11-1.0.jar at https://storage.cloud:8080/v1/AUTH_keything/spark/target/scala-2.11/lalala-streaming_2.11-1.0.jar with timestamp 1478556784396
16/11/07 22:13:04 INFO StandaloneAppClient$ClientEndpoint: Connecting to master spark://10.60.68.55:7077...
16/11/07 22:13:04 INFO TransportClientFactory: Successfully created connection to /10.60.68.55:7077 after 2 ms (0 ms spent in bootstraps)
16/11/07 22:13:04 INFO StandaloneSchedulerBackend: Connected to Spark cluster with app ID app-20161107221304-0056
16/11/07 22:13:04 INFO StandaloneAppClient$ClientEndpoint: Executor added: app-20161107221304-0056/0 on worker-20161101181541-10.60.68.55-46791 (10.60.68.55:46791) with 7 cores
16/11/07 22:13:04 INFO StandaloneSchedulerBackend: Granted executor ID app-20161107221304-0056/0 on hostPort 10.60.68.55:46791 with 7 cores, 1024.0 MB RAM
16/11/07 22:13:04 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 37535.
16/11/07 22:13:04 INFO NettyBlockTransferService: Server created on 10.60.68.55:37535
16/11/07 22:13:04 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 10.60.68.55, 37535)
16/11/07 22:13:04 INFO StandaloneAppClient$ClientEndpoint: Executor updated: app-20161107221304-0056/0 is now RUNNING
16/11/07 22:13:04 INFO BlockManagerMasterEndpoint: Registering block manager 10.60.68.55:37535 with 366.3 MB RAM, BlockManagerId(driver, 10.60.68.55, 37535)
16/11/07 22:13:04 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 10.60.68.55, 37535)
16/11/07 22:13:04 INFO StandaloneSchedulerBackend: SchedulerBackend is ready for scheduling beginning after reached minRegisteredResourcesRatio: 0.0
Exception in thread "main" java.lang.reflect.InvocationTargetException
  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
  at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
  at java.lang.reflect.Method.invoke(Method.java:498)
  at org.apache.spark.deploy.worker.DriverWrapper$.main(DriverWrapper.scala:58)
  at org.apache.spark.deploy.worker.DriverWrapper.main(DriverWrapper.scala)
Caused by: java.lang.NoClassDefFoundError: org/apache/kafka/common/serialization/StringDeserializer
  at SimpleKafkaReader$.main(SimpleKafkaReader.scala:30)
  at SimpleKafkaReader.main(SimpleKafkaReader.scala)
  ... 6 more
Caused by: java.lang.ClassNotFoundException: org.apache.kafka.common.serialization.StringDeserializer
  at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
  at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
  at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
  ... 8 more
16/11/07 22:13:04 INFO SparkContext: Invoking stop() from shutdown hook
16/11/07 22:13:04 INFO SparkUI: Stopped Spark web UI at http://10.60.68.55:4040
16/11/07 22:13:04 INFO StandaloneSchedulerBackend: Shutting down all executors
16/11/07 22:13:04 INFO CoarseGrainedSchedulerBackend$DriverEndpoint: Asking each executor to shut down
16/11/07 22:13:04 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
16/11/07 22:13:04 INFO MemoryStore: MemoryStore cleared
16/11/07 22:13:04 INFO BlockManager: BlockManager stopped
16/11/07 22:13:04 INFO BlockManagerMaster: BlockManagerMaster stopped
16/11/07 22:13:04 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
16/11/07 22:13:04 INFO SparkContext: Successfully stopped SparkContext
16/11/07 22:13:04 INFO ShutdownHookManager: Shutdown hook called
16/11/07 22:13:04 INFO ShutdownHookManager: Deleting directory /tmp/spark-46220409-74ee-4d58-ade7-2383c64b9a5a
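
For reference, a quick way to see what actually got localized for the driver on the worker (the work directory and submission id are the ones from the launch command above):

# In my case only the primary jar (lalala-streaming_2.11-1.0.jar) is present;
# none of the --packages dependencies made it here.
ls -l /usr/share/spark-2.0.1-bin-hadoop2.7/work/driver-20161107221301-0011/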