Apache Spark error: jobserver.JobManagerActor [] - About to restart actor due to exception: java.lang.NullPointerException

I am getting the error described at this link: . I am trying to write a service that connects to both Cassandra and Redis, but I hit the error at the very first step, connecting to Cassandra:

   [2017-05-06 17:42:54,700] ERROR .jobserver.JobManagerActor [] [] - About to restart actor due to exception:
java.lang.NullPointerException
    at spark.jobserver.JobManagerActor$$anonfun$wrappedReceive$1.applyOrElse(JobManagerActor.scala:168)
    at scala.runtime.AbstractPartialFunction$mcVL$sp.apply$mcVL$sp(AbstractPartialFunction.scala:33)
    at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:33)
    at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:25)
    at spark.jobserver.common.akka.ActorStack$$anonfun$receive$1.applyOrElse(ActorStack.scala:33)
    at scala.runtime.AbstractPartialFunction$mcVL$sp.apply$mcVL$sp(AbstractPartialFunction.scala:33)
    at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:33)
    at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:25)
    at spark.jobserver.common.akka.Slf4jLogging$$anonfun$receive$1$$anonfun$applyOrElse$1.apply$mcV$sp(Slf4jLogging.scala:25)
    at spark.jobserver.common.akka.Slf4jLogging$class.spark$jobserver$common$akka$Slf4jLogging$$withAkkaSourceLogging(Slf4jLogging.scala:34)
    at spark.jobserver.common.akka.Slf4jLogging$$anonfun$receive$1.applyOrElse(Slf4jLogging.scala:24)
    at scala.runtime.AbstractPartialFunction$mcVL$sp.apply$mcVL$sp(AbstractPartialFunction.scala:33)
    at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:33)
    at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:25)
    at spark.jobserver.common.akka.ActorMetrics$$anonfun$receive$1.applyOrElse(ActorMetrics.scala:23)
    at akka.actor.Actor$class.aroundReceive(Actor.scala:467)
    at spark.jobserver.common.akka.InstrumentedActor.aroundReceive(InstrumentedActor.scala:8)
    at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516)
    at akka.actor.ActorCell.invoke(ActorCell.scala:487)
    at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:238)
    at akka.dispatch.Mailbox.run(Mailbox.scala:220)
    at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:397)
    at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
    at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
    at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
    at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
[2017-05-06 17:42:54,701] INFO  .jobserver.JobManagerActor [] [] - Shutting down SparkContext test-context
[2017-05-06 17:42:54,700] ERROR ka.actor.OneForOneStrategy [] [akka://JobServer/user/context-supervisor/test-context] -
java.lang.NullPointerException
    at spark.jobserver.JobManagerActor$$anonfun$wrappedReceive$1.applyOrElse(JobManagerActor.scala:168)
    at scala.runtime.AbstractPartialFunction$mcVL$sp.apply$mcVL$sp(AbstractPartialFunction.scala:33)
    at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:33)
    at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:25)
    at spark.jobserver.common.akka.ActorStack$$anonfun$receive$1.applyOrElse(ActorStack.scala:33)
    at scala.runtime.AbstractPartialFunction$mcVL$sp.apply$mcVL$sp(AbstractPartialFunction.scala:33)
    at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:33)
    at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:25)
    at spark.jobserver.common.akka.Slf4jLogging$$anonfun$receive$1$$anonfun$applyOrElse$1.apply$mcV$sp(Slf4jLogging.scala:25)
    at spark.jobserver.common.akka.Slf4jLogging$class.spark$jobserver$common$akka$Slf4jLogging$$withAkkaSourceLogging(Slf4jLogging.scala:34)
    at spark.jobserver.common.akka.Slf4jLogging$$anonfun$receive$1.applyOrElse(Slf4jLogging.scala:24)
    at scala.runtime.AbstractPartialFunction$mcVL$sp.apply$mcVL$sp(AbstractPartialFunction.scala:33)
    at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:33)
    at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:25)
    at spark.jobserver.common.akka.ActorMetrics$$anonfun$receive$1.applyOrElse(ActorMetrics.scala:23)
    at akka.actor.Actor$class.aroundReceive(Actor.scala:467)
    at spark.jobserver.common.akka.InstrumentedActor.aroundReceive(InstrumentedActor.scala:8)
    at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516)
    at akka.actor.ActorCell.invoke(ActorCell.scala:487)
    at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:238)
    at akka.dispatch.Mailbox.run(Mailbox.scala:220)
    at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:397)
    at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
    at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
    at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
    at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
[2017-05-06 17:42:54,701] INFO  .jobserver.JobManagerActor [] [akka://JobServer/user/context-supervisor/test-context] - Starting actor spark.jobserver.JobManagerActor
This is the log output I copied from job server/job-server-local.log. Do you have any suggestions about this problem? Thanks, everyone.

Here is my code. Example01.scala

package examples // must match the classPath used in the job-submission request below

import com.typesafe.config.Config
import org.apache.spark._
import spark.jobserver.api.{SparkJob => NewSparkJob, _}

import scala.util.Try
import com.datastax.spark.connector._
import org.scalactic._

object Example01 extends NewSparkJob {
  type JobData = Seq[String]
  type JobOutput = Long

  override def runJob(sc: SparkContext, runtime: JobEnvironment, data: JobData):
  JobOutput = {
    val rdd = sc.cassandraTable("test", "table_test") // create an RDD from the test.table_test table
    val num_row: Long = rdd.count() // count the number of rows in the rdd
    num_row // return the result
  }

  def validate(sc: SparkContext, runtime: JobEnvironment, config: Config):
  JobData Or Every[ValidationProblem] = {
    Try(config.getString("input.string").split(" ").toSeq)
      .map(words => Good(words))
      .getOrElse(Bad(One(SingleProblem("No input.string param"))))
  }
}
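
For comparison, a minimal standalone check of the connector outside SJS might look like the following sketch (it assumes a local Cassandra instance with the same test.table_test table used above; the object name is just illustrative):

import org.apache.spark.{SparkConf, SparkContext}
import com.datastax.spark.connector._

// Sketch: verify the Cassandra connector without going through SJS.
object CassandraSmokeTest {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setMaster("local[2]")
      .setAppName("cassandra-smoke-test")
      .set("spark.cassandra.connection.host", "localhost")
    val sc = new SparkContext(conf)
    try {
      // Same read as in Example01.runJob
      println(s"Row count: ${sc.cassandraTable("test", "table_test").count()}")
    } finally {
      sc.stop() // always release the context
    }
  }
}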
build.sbt

name := "spark_job_assembly"

version := "1.0"

scalaVersion := "2.11.8"

val akkaVersion = "2.4.4"
val phantomVersion = "2.0.2"
val sparkVersion = "1.6.2"
val sparkJobVersion = "0.7.0"
val sparkCassandraVersion = "1.6"
val cassandraVersion = "3.1.3"
val playJsonVersion = "2.5.14"
val scalaTestVersion = "2.2.6"

resolvers ++= Seq(
  "justwrote" at "http://repo.justwrote.it/releases/",
  "Fabricator" at "http://dl.bintray.com/biercoff/Fabricator",
  "Typesafe Releases" at "https://repo.typesafe.com/typesafe/releases/",
  "Artima Maven Repository" at "http://repo.artima.com/releases",
  "Spark Packages Repo" at "https://dl.bintray.com/spark-packages/maven",
  "JBoss" at "https://repository.jboss.org/",
  "Job Server Bintray" at "https://dl.bintray.com/spark-jobserver/maven",
  "Typesafe repository" at "http://repo.typesafe.com/typesafe/releases/"
)

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % sparkVersion % "provided",
  "org.apache.spark" %% "spark-sql" % sparkVersion,
  "com.datastax.spark" %% "spark-cassandra-connector" % sparkCassandraVersion,
  "spark.jobserver" %% "job-server-api" % sparkJobVersion
//  ,
//  "net.liftweb" %% "lift-json" % "2.5+"
)

assemblyMergeStrategy in assembly := {
  case PathList("META-INF", xs@_*) => MergeStrategy.discard
  case x => MergeStrategy.last
}
plugins.sbt

addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.0")
I am using SJS version 0.7.0.

And here are all the requests I send to SJS:

curl --data-binary @target/scala-2.11/spark_job_assembly-assembly-1.0.jar http://localhost:8090/jars/cassandra-demo

curl -d "" 'http://localhost:8090/contexts/cassandra-test?spark.cassandra.connection.host=localhost&spark.cassandra.connection.username=admin&spark.cassandra.connection.password=123456&num-cpu-cores=2'

curl -d "" 'http://localhost:8090/jobs?appName=cassandra-demo&classPath=examples.Example01&context=cassandra-test'

curl http://localhost:8090/jars
curl http://localhost:8090/contexts/
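
Note that validate in Example01 expects an input.string parameter, while the job above is submitted with an empty body, so validation should return Bad rather than ever reach runJob. A submission that actually supplies the parameter would look roughly like this (SJS reads the POST body as the job config, in Typesafe Config syntax):

curl -d "input.string = a b c" 'http://localhost:8090/jobs?appName=cassandra-demo&classPath=examples.Example01&context=cassandra-test'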

Which versions of SJS and Spark are you using?
I am using SJS version 0.7.0.
The error does not seem to be related to the Cassandra connector.
Yes, I know, because I tested the Cassandra connector on its own, but I don't understand why SJS crashes every time I try to connect to Cassandra.
Could you give simple steps to reproduce it?