Why does building a Spark RC release with Maven fail with "Could not initialize class sun.util.calendar.ZoneInfoFile"?


I am trying to build the Spark 2.2.0-rc2 release with mvn, but the build fails.

$ uname -a
Linux knoldus-Vostro-15-3568 4.4.0-46-generic #67-Ubuntu SMP Thu Oct 20 15:05:12 UTC 2016 x86_64 x86_64 x86_64 GNU/Linux

$ java -version
openjdk version "1.8.0_131"
Here is the error stack I get:

$ ./build/mvn -Phadoop-2.7,yarn,mesos,hive,hive-thriftserver -DskipTests clean install

...
[INFO] --- scala-maven-plugin:3.2.2:compile (scala-compile-first) @ spark-tags_2.11 ---
[INFO] Using zinc server for incremental compilation
java.lang.NoClassDefFoundError: Could not initialize class sun.util.calendar.ZoneInfoFile
    at sun.util.calendar.ZoneInfo.getTimeZone(ZoneInfo.java:589)
    at java.util.TimeZone.getTimeZone(TimeZone.java:560)
    at java.util.TimeZone.setDefaultZone(TimeZone.java:666)
    at java.util.TimeZone.getDefaultRef(TimeZone.java:636)
    at java.util.Date.<init>(Date.java:254)
    at java.util.zip.ZipUtils.dosToJavaTime(ZipUtils.java:71)
    at java.util.zip.ZipUtils.extendedDosToJavaTime(ZipUtils.java:88)
    at java.util.zip.ZipEntry.getTime(ZipEntry.java:194)
    at sbt.IO$.next$1(IO.scala:278)
    at sbt.IO$.sbt$IO$$extract(IO.scala:286)
    at sbt.IO$$anonfun$unzipStream$1.apply(IO.scala:255)
    at sbt.IO$$anonfun$unzipStream$1.apply(IO.scala:255)
    at sbt.Using.apply(Using.scala:24)
    at sbt.IO$.unzipStream(IO.scala:255)
    at sbt.IO$$anonfun$unzip$1.apply(IO.scala:249)
    at sbt.IO$$anonfun$unzip$1.apply(IO.scala:249)
    at sbt.Using.apply(Using.scala:24)
    at sbt.IO$.unzip(IO.scala:249)
    at sbt.compiler.AnalyzingCompiler$$anonfun$compileSources$1$$anonfun$5.apply(AnalyzingCompiler.scala:140)
    at sbt.compiler.AnalyzingCompiler$$anonfun$compileSources$1$$anonfun$5.apply(AnalyzingCompiler.scala:140)
    at scala.collection.LinearSeqOptimized$class.foldLeft(LinearSeqOptimized.scala:111)
    at scala.collection.immutable.List.foldLeft(List.scala:84)
    at scala.collection.TraversableOnce$class.$div$colon(TraversableOnce.scala:138)
    at scala.collection.AbstractTraversable.$div$colon(Traversable.scala:105)
    at sbt.compiler.AnalyzingCompiler$$anonfun$compileSources$1.apply(AnalyzingCompiler.scala:140)
    at sbt.compiler.AnalyzingCompiler$$anonfun$compileSources$1.apply(AnalyzingCompiler.scala:139)
    at sbt.IO$.withTemporaryDirectory(IO.scala:344)
    at sbt.compiler.AnalyzingCompiler$.compileSources(AnalyzingCompiler.scala:139)
    at sbt.compiler.IC$.compileInterfaceJar(IncrementalCompiler.scala:58)
    at com.typesafe.zinc.Compiler$.compilerInterface(Compiler.scala:148)
    at com.typesafe.zinc.Compiler$.create(Compiler.scala:53)
    at com.typesafe.zinc.Compiler$$anonfun$apply$1.apply(Compiler.scala:40)
    at com.typesafe.zinc.Compiler$$anonfun$apply$1.apply(Compiler.scala:40)
    at com.typesafe.zinc.Cache.get(Cache.scala:41)
    at com.typesafe.zinc.Compiler$.apply(Compiler.scala:40)
    at com.typesafe.zinc.Main$.run(Main.scala:96)
    at com.typesafe.zinc.Nailgun$.zinc(Nailgun.scala:93)
    at com.typesafe.zinc.Nailgun$.nailMain(Nailgun.scala:82)
    at com.typesafe.zinc.Nailgun.nailMain(Nailgun.scala)
    at sun.reflect.GeneratedMethodAccessor1.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at com.martiansoftware.nailgun.NGSession.run(NGSession.java:280)
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Spark Project Parent POM ........................... SUCCESS [  5.657 s]
[INFO] Spark Project Tags ................................. FAILURE [  0.371 s]
[INFO] Spark Project Sketch ............................... SKIPPED
[INFO] Spark Project Networking ........................... SKIPPED
[INFO] Spark Project Shuffle Streaming Service ............ SKIPPED
[INFO] Spark Project Unsafe ............................... SKIPPED
[INFO] Spark Project Launcher ............................. SKIPPED
[INFO] Spark Project Core ................................. SKIPPED
[INFO] Spark Project ML Local Library ..................... SKIPPED
[INFO] Spark Project GraphX ............................... SKIPPED
[INFO] Spark Project Streaming ............................ SKIPPED
[INFO] Spark Project Catalyst ............................. SKIPPED
[INFO] Spark Project SQL .................................. SKIPPED
[INFO] Spark Project ML Library ........................... SKIPPED
[INFO] Spark Project Tools ................................ SKIPPED
[INFO] Spark Project Hive ................................. SKIPPED
[INFO] Spark Project REPL ................................. SKIPPED
[INFO] Spark Project YARN Shuffle Service ................. SKIPPED
[INFO] Spark Project YARN ................................. SKIPPED
[INFO] Spark Project Mesos ................................ SKIPPED
[INFO] Spark Project Hive Thrift Server ................... SKIPPED
[INFO] Spark Project Assembly ............................. SKIPPED
[INFO] Spark Project External Flume Sink .................. SKIPPED
[INFO] Spark Project External Flume ....................... SKIPPED
[INFO] Spark Project External Flume Assembly .............. SKIPPED
[INFO] Spark Integration for Kafka 0.8 .................... SKIPPED
[INFO] Spark Project Examples ............................. SKIPPED
[INFO] Spark Project External Kafka Assembly .............. SKIPPED
[INFO] Spark Integration for Kafka 0.10 ................... SKIPPED
[INFO] Spark Integration for Kafka 0.10 Assembly .......... SKIPPED
[INFO] Kafka 0.10 Source for Structured Streaming ......... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 6.855 s
[INFO] Finished at: 2017-05-30T13:47:02+05:30
[INFO] Final Memory: 50M/605M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile (scala-compile-first) on project spark-tags_2.11: Execution scala-compile-first of goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile failed.: CompileFailed -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/PluginExecutionException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :spark-tags_2.11
The locale is:

LANG=en_IN
LANGUAGE=en_IN:en
LC_CTYPE="en_IN"
LC_NUMERIC="en_IN"
LC_TIME="en_IN"
LC_COLLATE="en_IN"
LC_MONETARY="en_IN"
LC_MESSAGES="en_IN"
LC_PAPER="en_IN"
LC_NAME="en_IN"
LC_ADDRESS="en_IN"
LC_TELEPHONE="en_IN"
LC_MEASUREMENT="en_IN"
LC_IDENTIFICATION="en_IN"
LC_ALL=
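Note that the stack trace fails the first time the JVM resolves a time zone (ZipEntry.getTime -> TimeZone.getDefault), not in Spark code. A minimal probe run with the same JDK can confirm whether timezone initialization itself is broken (the class name TzProbe is my own, not from the build):

```java
import java.util.TimeZone;

public class TzProbe {
    public static void main(String[] args) {
        // TimeZone.getDefault() triggers the static initializer of
        // sun.util.calendar.ZoneInfoFile, the same class that fails in the
        // build log above. If it throws here too, the JDK/locale setup is
        // at fault rather than the Spark build.
        TimeZone tz = TimeZone.getDefault();
        System.out.println("Default zone: " + tz.getID());
    }
}
```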

I think the problem is your timezone. Export LC_ALL=en_US.UTF-8 and start over. Make sure every entry reported by locale is en_US.UTF-8:

$ locale
LANG="en_US.UTF-8"
LC_COLLATE="en_US.UTF-8"
LC_CTYPE="en_US.UTF-8"
LC_MESSAGES="en_US.UTF-8"
LC_MONETARY="en_US.UTF-8"
LC_NUMERIC="en_US.UTF-8"
LC_TIME="en_US.UTF-8"
LC_ALL="en_US.UTF-8"
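Put together, a sketch of the fix. The zinc shutdown step is an assumption on my part: Spark's build keeps a zinc server running, and it caches the environment it was started with, so it should be restarted after changing the locale (the zinc path under build/ may differ by version):

```shell
# Set a clean UTF-8 locale for this session
export LANG=en_US.UTF-8
export LC_ALL=en_US.UTF-8

# Stop any running zinc server so it restarts with the new environment
# (path is an assumption; Spark downloads zinc under build/)
./build/zinc-*/bin/zinc -shutdown 2>/dev/null || true

# Re-run the original build command
./build/mvn -Phadoop-2.7,yarn,mesos,hive,hive-thriftserver -DskipTests clean install
```

To make the setting permanent, add the two export lines to ~/.bashrc.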

I had a similar problem, caused by missing timezone info files in the Java installation (after updating Java 8 to a newer version).

When I looked for the tzdb.dat file, there was only a link pointing to a missing target.

On RedHat I was able to fix it simply with:

yum update tzdata-java

This site was helpful here:

For other OSes the solution is probably similar (Oracle even provides a timezone updater tool).
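A quick way to check for this case (the paths below are assumptions: tzdb.dat lives under jre/lib on JDK 8 layouts and under lib/ on newer ones):

```shell
# Resolve the real JDK home of the java on PATH (follows symlinks)
JAVA_BIN=$(readlink -f "$(command -v java)")
JDK_HOME=$(dirname "$(dirname "$JAVA_BIN")")

# A dangling symlink passes -L but fails -e; report which case applies
for f in "$JDK_HOME/jre/lib/tzdb.dat" "$JDK_HOME/lib/tzdb.dat"; do
  if [ -e "$f" ]; then
    echo "found: $f"
  elif [ -L "$f" ]; then
    echo "dangling link: $f"   # the symptom described above
  fi
done
```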
