Scala: unable to run Spark job locally with sbt, but it works in IntelliJ

I have written a few simple Spark jobs along with some tests for them. I did everything in IntelliJ and it works fine. Now I want to make sure my code also builds with sbt. It compiles fine, but I get strange errors while running it and during the tests.

I am using Scala version 2.11.8 and sbt version 0.13.8.

My build.sbt file looks like this:

name := "test"

version := "1.0"

scalaVersion := "2.11.7"

libraryDependencies += "org.apache.spark" %% "spark-core" % "2.0.0"
libraryDependencies += "javax.mail" % "javax.mail-api" % "1.5.6"
libraryDependencies += "com.sun.mail" % "javax.mail" % "1.5.6"
libraryDependencies += "commons-cli" % "commons-cli" % "1.3.1"
libraryDependencies += "org.scalatest" % "scalatest_2.11" % "3.0.0" % "test"
libraryDependencies += "com.holdenkarau" % "spark-testing-base_2.11" % "2.0.0_0.4.4" % "test" intransitive()

I tried to run the code with sbt "run main com.test.email.processor.bin.Runner", and here is the output:

[info] Loading project definition from /Users/max/workplace/test/project
[info] Set current project to test (in build file:/Users/max/workplace/test/)
[info] Running com.test.email.processor.bin.Runner -j recipientCount -e /Users/max/workplace/data/test/enron_with_categories/*/*.txt
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
16/08/23 18:46:55 INFO SparkContext: Running Spark version 2.0.0
16/08/23 18:46:55 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
16/08/23 18:46:55 INFO SecurityManager: Changing view acls to: max
16/08/23 18:46:55 INFO SecurityManager: Changing modify acls to: max
16/08/23 18:46:55 INFO SecurityManager: Changing view acls groups to: 
16/08/23 18:46:55 INFO SecurityManager: Changing modify acls groups to: 
16/08/23 18:46:55 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(max); groups with view permissions: Set(); users  with modify permissions: Set(max); groups with modify permissions: Set()
16/08/23 18:46:56 INFO Utils: Successfully started service 'sparkDriver' on port 61759.
16/08/23 18:46:56 INFO SparkEnv: Registering MapOutputTracker
16/08/23 18:46:56 INFO SparkEnv: Registering BlockManagerMaster
16/08/23 18:46:56 INFO DiskBlockManager: Created local directory at /private/var/folders/75/4dydy_6110v0gjv7bg265_g40000gn/T/blockmgr-9eb526c0-b7e5-444a-b186-d7f248c5dc62
16/08/23 18:46:56 INFO MemoryStore: MemoryStore started with capacity 408.9 MB
16/08/23 18:46:56 INFO SparkEnv: Registering OutputCommitCoordinator
16/08/23 18:46:56 INFO Utils: Successfully started service 'SparkUI' on port 4040.
16/08/23 18:46:56 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://192.168.1.11:4040
16/08/23 18:46:56 INFO Executor: Starting executor ID driver on host localhost
16/08/23 18:46:57 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 61760.
16/08/23 18:46:57 INFO NettyBlockTransferService: Server created on 192.168.1.11:61760
16/08/23 18:46:57 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 192.168.1.11, 61760)
16/08/23 18:46:57 INFO BlockManagerMasterEndpoint: Registering block manager 192.168.1.11:61760 with 408.9 MB RAM, BlockManagerId(driver, 192.168.1.11, 61760)
16/08/23 18:46:57 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 192.168.1.11, 61760)
16/08/23 18:46:57 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 128.0 KB, free 408.8 MB)
16/08/23 18:46:57 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 14.6 KB, free 408.8 MB)
16/08/23 18:46:57 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on 192.168.1.11:61760 (size: 14.6 KB, free: 408.9 MB)
16/08/23 18:46:57 INFO SparkContext: Created broadcast 0 from wholeTextFiles at RecipientCountJob.scala:22
16/08/23 18:46:58 WARN ClosureCleaner: Expected a closure; got com.test.email.processor.util.cleanEmail$
16/08/23 18:46:58 INFO FileInputFormat: Total input paths to process : 1702
16/08/23 18:46:58 INFO FileInputFormat: Total input paths to process : 1702
16/08/23 18:46:58 INFO CombineFileInputFormat: DEBUG: Terminated node allocation with : CompletedNodes: 1, size left: 0
16/08/23 18:46:58 INFO SparkContext: Starting job: take at RecipientCountJob.scala:35
16/08/23 18:46:58 WARN DAGScheduler: Creating new stage failed due to exception - job: 0
java.lang.ClassNotFoundException: scala.Function0
    at sbt.classpath.ClasspathFilter.loadClass(ClassLoaders.scala:63)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:348)
    at com.twitter.chill.KryoBase$$anonfun$1.apply(KryoBase.scala:41)
    at com.twitter.chill.KryoBase$$anonfun$1.apply(KryoBase.scala:41)
    at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:245)
    at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:245)
    at scala.collection.immutable.Range.foreach(Range.scala:166)
    at scala.collection.TraversableLike$class.map(TraversableLike.scala:245)
    at scala.collection.AbstractTraversable.map(Traversable.scala:104)
    at com.twitter.chill.KryoBase.<init>(KryoBase.scala:41)
    at com.twitter.chill.EmptyScalaKryoInstantiator.newKryo(ScalaKryoInstantiator.scala:57)
    at org.apache.spark.serializer.KryoSerializer.newKryo(KryoSerializer.scala:86)
    at org.apache.spark.serializer.KryoSerializerInstance.borrowKryo(KryoSerializer.scala:274)
    at org.apache.spark.serializer.KryoSerializerInstance.<init>(KryoSerializer.scala:259)
    at org.apache.spark.serializer.KryoSerializer.newInstance(KryoSerializer.scala:175)
    at org.apache.spark.serializer.KryoSerializer.supportsRelocationOfSerializedObjects$lzycompute(KryoSerializer.scala:182)
    at org.apache.spark.serializer.KryoSerializer.supportsRelocationOfSerializedObjects(KryoSerializer.scala:178)
    at org.apache.spark.shuffle.sort.SortShuffleManager$.canUseSerializedShuffle(SortShuffleManager.scala:187)
    at org.apache.spark.shuffle.sort.SortShuffleManager.registerShuffle(SortShuffleManager.scala:99)
    at org.apache.spark.ShuffleDependency.<init>(Dependency.scala:90)
    at org.apache.spark.rdd.ShuffledRDD.getDependencies(ShuffledRDD.scala:91)
    at org.apache.spark.rdd.RDD$$anonfun$dependencies$2.apply(RDD.scala:235)
    at org.apache.spark.rdd.RDD$$anonfun$dependencies$2.apply(RDD.scala:233)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.rdd.RDD.dependencies(RDD.scala:233)
    at org.apache.spark.scheduler.DAGScheduler.visit$2(DAGScheduler.scala:418)
    at org.apache.spark.scheduler.DAGScheduler.getAncestorShuffleDependencies(DAGScheduler.scala:433)
    at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$getShuffleMapStage(DAGScheduler.scala:288)
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$visit$1$1.apply(DAGScheduler.scala:394)
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$visit$1$1.apply(DAGScheduler.scala:391)
    at scala.collection.immutable.List.foreach(List.scala:381)
    at org.apache.spark.scheduler.DAGScheduler.visit$1(DAGScheduler.scala:391)
    at org.apache.spark.scheduler.DAGScheduler.getParentStages(DAGScheduler.scala:403)
    at org.apache.spark.scheduler.DAGScheduler.getParentStagesAndId(DAGScheduler.scala:304)
    at org.apache.spark.scheduler.DAGScheduler.newResultStage(DAGScheduler.scala:339)
    at org.apache.spark.scheduler.DAGScheduler.handleJobSubmitted(DAGScheduler.scala:849)
    at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1626)
    at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1618)
    at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1607)
    at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
16/08/23 18:46:58 INFO DAGScheduler: Job 0 failed: take at RecipientCountJob.scala:35, took 0.076653 s
[error] (run-main-0) java.lang.ClassNotFoundException: scala.Function0
java.lang.ClassNotFoundException: scala.Function0
[trace] Stack trace suppressed: run last compile:runMain for the full output.
16/08/23 18:46:58 ERROR ContextCleaner: Error in cleaning thread
java.lang.InterruptedException
    at java.lang.Object.wait(Native Method)
    at java.lang.ref.ReferenceQueue.remove(ReferenceQueue.java:143)
    at org.apache.spark.ContextCleaner$$anonfun$org$apache$spark$ContextCleaner$$keepCleaning$1.apply$mcV$sp(ContextCleaner.scala:175)
    at org.apache.spark.util.Utils$.tryOrStopSparkContext(Utils.scala:1229)
    at org.apache.spark.ContextCleaner.org$apache$spark$ContextCleaner$$keepCleaning(ContextCleaner.scala:172)
    at org.apache.spark.ContextCleaner$$anon$1.run(ContextCleaner.scala:67)
16/08/23 18:46:58 ERROR Utils: uncaught error in thread SparkListenerBus, stopping SparkContext
java.lang.InterruptedException
    at java.util.concurrent.locks.AbstractQueuedSynchronizer.doAcquireSharedInterruptibly(AbstractQueuedSynchronizer.java:998)
    at java.util.concurrent.locks.AbstractQueuedSynchronizer.acquireSharedInterruptibly(AbstractQueuedSynchronizer.java:1304)
    at java.util.concurrent.Semaphore.acquire(Semaphore.java:312)
    at org.apache.spark.scheduler.LiveListenerBus$$anon$1$$anonfun$run$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(LiveListenerBus.scala:67)
    at org.apache.spark.scheduler.LiveListenerBus$$anon$1$$anonfun$run$1$$anonfun$apply$mcV$sp$1.apply(LiveListenerBus.scala:66)
    at org.apache.spark.scheduler.LiveListenerBus$$anon$1$$anonfun$run$1$$anonfun$apply$mcV$sp$1.apply(LiveListenerBus.scala:66)
    at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
    at org.apache.spark.scheduler.LiveListenerBus$$anon$1$$anonfun$run$1.apply$mcV$sp(LiveListenerBus.scala:65)
    at org.apache.spark.util.Utils$.tryOrStopSparkContext(Utils.scala:1229)
    at org.apache.spark.scheduler.LiveListenerBus$$anon$1.run(LiveListenerBus.scala:64)
java.lang.RuntimeException: Nonzero exit code: 1
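
The stack trace shows sbt's own ClasspathFilter throwing the ClassNotFoundException while Kryo is being initialised, so the missing scala.Function0 looks like a classloader problem inside the sbt JVM rather than anything in the job itself. I suspect forking the run task into its own JVM might avoid it; a sketch of what I would add to build.sbt (an untested assumption on my part):

// Assumption: sbt's filtering classloader hides the Scala library from Kryo when the job
// runs in-process, so fork `run` into a separate JVM with a normal classpath.
fork in run := true

// Optional: extra heap for Spark in local mode (value is illustrative).
javaOptions in run ++= Seq("-Xmx2g")

(For what it's worth, the usual sbt invocation would be sbt "runMain com.test.email.processor.bin.Runner <args>" rather than run main, although that by itself would not explain the classloading failure.)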