Java: How to avoid the broken String class error on macOS?

Tags: Java, Scala, Apache Spark

I have a Scala Spark project that I am trying to run on my Mac.

When I launch the project in sbt, I get the following output:

lpuggini-pro13:stackoverflow lpuggini$ sbt 
[residual] arg = '-sbt-create'
[process_args] java_version = '13'
[sbt_options] declare -a sbt_options='()'
[addMemory] arg = '1024'
[addJava] arg = '-Xms1024m'
[addJava] arg = '-Xmx1024m'
[addJava] arg = '-Xss4M'
[addJava] arg = '-XX:ReservedCodeCacheSize=128m'
[copyRt] java9_rt = '/Users/lpuggini/.sbt/0.13/java9-rt-ext-adoptopenjdk_13/rt.jar'
[addJava] arg = '-Dscala.ext.dirs=/Users/lpuggini/.sbt/0.13/java9-rt-ext-adoptopenjdk_13'
# Executing command line:
java
-Dfile.encoding=UTF-8
-Xms1024m
-Xmx1024m
-Xss4M
-XX:ReservedCodeCacheSize=128m
-Dscala.ext.dirs=/Users/lpuggini/.sbt/0.13/java9-rt-ext-adoptopenjdk_13
-jar
/usr/local/Cellar/sbt/1.3.1/libexec/bin/sbt-launch.jar

[info] Loading project definition from /Users/lpuggini/ProgrammingProjects/spark_coursera/stackoverflow/project
[info] Compiling 8 Scala sources to /Users/lpuggini/ProgrammingProjects/spark_coursera/stackoverflow/project/target/scala-2.10/sbt-0.13/classes...
[warn] /Users/lpuggini/ProgrammingProjects/spark_coursera/stackoverflow/project/CommonBuild.scala:3: trait Build in package sbt is deprecated: Use .sbt format instead
[warn] trait CommonBuild extends Build {
[warn]                           ^
[warn] one warning found
error: error while loading String, class file '/Library/Java/JavaVirtualMachines/adoptopenjdk-13.jdk/Contents/Home(java/lang/String.class)' is broken
(class java.lang.NullPointerException/null)
[info] Set current project to bigdata-stackoverflow (in build file:/Users/lpuggini/ProgrammingProjects/spark_coursera/stackoverflow/)
> 
with the error message

error: error while loading String, class file '/Library/Java/JavaVirtualMachines/adoptopenjdk-13.jdk/Contents/Home(java/lang/String.class)' is broken

which I think is what makes the code crash:

> console
[info] Compiling 2 Scala sources to /Users/lpuggini/ProgrammingProjects/spark_coursera/stackoverflow/target/scala-2.11/classes...
[info] Starting scala interpreter...
[info] 
Welcome to Scala 2.11.12 (OpenJDK 64-Bit Server VM, Java 13).
Type in expressions for evaluation. Or try :help.

scala> import org.apache.spark.SparkConf
import org.apache.spark.SparkConf

scala> import org.apache.spark.SparkContext
import org.apache.spark.SparkContext

scala> import org.apache.spark.SparkContext._
import org.apache.spark.SparkContext._

scala> import org.apache.spark.rdd.RDD
import org.apache.spark.rdd.RDD

scala> import annotation.tailrec
import annotation.tailrec

scala> import scala.reflect.ClassTag
import scala.reflect.ClassTag

scala> val conf: SparkConf = new SparkConf().setMaster("local").setAppName("StackOverflow")
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.apache.spark.unsafe.Platform (file:/Users/lpuggini/Library/Caches/Coursier/v1/https/repo1.maven.org/maven2/org/apache/spark/spark-unsafe_2.11/2.4.3/spark-unsafe_2.11-2.4.3.jar) to method java.nio.Bits.unaligned()
WARNING: Please consider reporting this to the maintainers of org.apache.spark.unsafe.Platform
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
conf: org.apache.spark.SparkConf = org.apache.spark.SparkConf@33f9678b

scala> val sc: SparkContext = new SparkContext(conf)
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
19/09/22 15:56:28 INFO SparkContext: Running Spark version 2.4.3
19/09/22 15:56:28 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
java.lang.StringIndexOutOfBoundsException: begin 0, end 3, length 2
  at java.base/java.lang.String.checkBoundsBeginEnd(String.java:3720)
  at java.base/java.lang.String.substring(String.java:1909)
  at org.apache.hadoop.util.Shell.<clinit>(Shell.java:50)
  at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:79)
  at org.apache.hadoop.security.Groups.parseStaticMapping(Groups.java:116)
  at org.apache.hadoop.security.Groups.<init>(Groups.java:93)
  at org.apache.hadoop.security.Groups.<init>(Groups.java:73)
  at org.apache.hadoop.security.Groups.getUserToGroupsMappingService(Groups.java:293)
  at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:283)
  at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:260)
  at org.apache.hadoop.security.UserGroupInformation.loginUserFromSubject(UserGroupInformation.java:789)
  at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:774)
  at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:647)
  at org.apache.spark.util.Utils$$anonfun$getCurrentUserName$1.apply(Utils.scala:2422)
  at org.apache.spark.util.Utils$$anonfun$getCurrentUserName$1.apply(Utils.scala:2422)
  at scala.Option.getOrElse(Option.scala:121)
  at org.apache.spark.util.Utils$.getCurrentUserName(Utils.scala:2422)
  at org.apache.spark.SparkContext.<init>(SparkContext.scala:293)
  ... 42 elided

scala> 
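The stack trace points at the static initializer of Hadoop's `Shell` class, which parses the `java.version` system property assuming the pre-JDK-9 `1.x.y` naming scheme. The sketch below is a hypothetical reconstruction of that parse (the helper name is made up), but it reproduces the exact exception from the trace: `substring(0, 3)` works on `"1.8.0_222"` and fails on `"13"`, whose length is 2.

```scala
// Hypothetical reconstruction of the version check that old Hadoop
// performs when the Shell class is first loaded. Under the legacy
// scheme java.version is e.g. "1.8.0_222", so substring(0, 3) yields
// "1.8". Under JDK 9+ the property is just "13", and substring(0, 3)
// throws StringIndexOutOfBoundsException: begin 0, end 3, length 2.
object VersionCheck {
  def majorPrefix(javaVersion: String): String =
    javaVersion.substring(0, 3)

  def main(args: Array[String]): Unit = {
    println(majorPrefix("1.8.0_222")) // prints "1.8"
    try {
      majorPrefix("13") // same failure as in the Spark stack trace
    } catch {
      case e: StringIndexOutOfBoundsException =>
        println(s"JDK 9+ version string breaks the parse: ${e.getMessage}")
    }
  }
}
```

This is why the project runs once sbt is pointed at a JDK 8 installation: there the version property again matches the `1.x` format that this Hadoop release expects.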

How can I fix it?

I think the problem is that you are using JDK 13; try JDK 8.

Thanks, do you know how to change it?

Just install JDK 8 and change the `JAVA_HOME` env variable; you can find plenty of information about this on Google.

So I installed Java 8 with `brew cask install adoptopenjdk8`, but now I don't know how to set `JAVA_HOME`. I tried `export JAVA_HOME=/Library/Java/JavaVirtualMachines/adoptopenjdk-8.jdk/Contents/Home/`, but `sbt` is still using Java 13.

You may need to edit the sbt launch script:
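A minimal sketch of the two usual approaches on macOS, assuming AdoptOpenJDK 8 was installed via Homebrew as described above (the exact JDK directory name may differ on your machine):

```shell
# macOS ships a helper that prints the home directory of an installed
# JDK matching the requested version, so you don't have to hardcode
# the path under /Library/Java/JavaVirtualMachines:
export JAVA_HOME="$(/usr/libexec/java_home -v 1.8)"
export PATH="$JAVA_HOME/bin:$PATH"

java -version   # should now report 1.8.x
sbt             # sbt picks up the java found on PATH / JAVA_HOME
```

Alternatively, the sbt runner script accepts a `-java-home` option, which overrides the JDK for a single invocation without touching your shell profile: `sbt -java-home "$(/usr/libexec/java_home -v 1.8)"`. If sbt still starts on Java 13 after exporting `JAVA_HOME`, check whether your sbt launch script hardcodes a different `java` binary.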