After installing Java, trying to open Spark fails with the error: Unable to find any JVMs matching version "1.8"


Description:

I installed Spark on my MacBook using Homebrew, following these instructions:

The step-by-step process was to install Java, then Scala, then Spark. Java and Scala installed successfully, and so did Spark.

When I tried to verify the Spark installation with the command below, I ran into an error.

Input command:
spark-shell

Expected behavior: Spark starts up in the terminal.

Actual behavior: I get the following error:

Unable to find any JVMs matching version "1.8".
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.apache.spark.unsafe.Platform (file:/usr/local/Cellar/apache-spark/2.4.5/libexec/jars/spark-unsafe_2.11-2.4.5.jar) to method java.nio.Bits.unaligned()
WARNING: Please consider reporting this to the maintainers of org.apache.spark.unsafe.Platform
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
Exception in thread "main" java.lang.ExceptionInInitializerError
    at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:80)
    at org.apache.hadoop.security.SecurityUtil.getAuthenticationMethod(SecurityUtil.java:611)
    at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:273)
    at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:261)
    at org.apache.hadoop.security.UserGroupInformation.loginUserFromSubject(UserGroupInformation.java:791)
    at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:761)
    at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:634)
    at org.apache.spark.util.Utils$$anonfun$getCurrentUserName$1.apply(Utils.scala:2422)
    at org.apache.spark.util.Utils$$anonfun$getCurrentUserName$1.apply(Utils.scala:2422)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.util.Utils$.getCurrentUserName(Utils.scala:2422)
    at org.apache.spark.SecurityManager.<init>(SecurityManager.scala:79)
    at org.apache.spark.deploy.SparkSubmit.secMgr$lzycompute$1(SparkSubmit.scala:348)
    at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$secMgr$1(SparkSubmit.scala:348)
    at org.apache.spark.deploy.SparkSubmit$$anonfun$prepareSubmitEnvironment$7.apply(SparkSubmit.scala:356)
    at org.apache.spark.deploy.SparkSubmit$$anonfun$prepareSubmitEnvironment$7.apply(SparkSubmit.scala:356)
    at scala.Option.map(Option.scala:146)
    at org.apache.spark.deploy.SparkSubmit.prepareSubmitEnvironment(SparkSubmit.scala:355)
    at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:774)
    at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:161)
    at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:184)
    at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
    at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:920)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:929)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.StringIndexOutOfBoundsException: begin 0, end 3, length 2
    at java.base/java.lang.String.checkBoundsBeginEnd(String.java:3756)
    at java.base/java.lang.String.substring(String.java:1902)
    at org.apache.hadoop.util.Shell.<clinit>(Shell.java:52)
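The first line of the error, Unable to find any JVMs matching version "1.8", is typically printed by macOS's /usr/libexec/java_home helper when it is asked for a Java 8 runtime that is not registered with the system (often triggered by a JAVA_HOME line in a shell profile or a launcher script). A minimal diagnostic sketch, not part of the original question, assuming a Mac with Homebrew:

```shell
# List every JVM registered with macOS (version, vendor, and install path)
/usr/libexec/java_home -V

# Ask specifically for a 1.8 JVM; when no JDK 8 is registered, this
# prints: Unable to find any JVMs matching version "1.8".
/usr/libexec/java_home -v 1.8

# If a JDK 8 is installed, point JAVA_HOME at it for this session and retry
export JAVA_HOME="$(/usr/libexec/java_home -v 1.8)"
spark-shell
```

These commands are macOS-specific environment setup, so the exact output will vary by machine; the point is to confirm which JVMs the system actually knows about before debugging Spark itself.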

JAVA_HOME previously pointed to /opt/anaconda3. I can see that JAVA_HOME has now been changed to /usr/local/opt/java, but I still get the error. Any answers or feedback are appreciated. Thanks.

I searched online and found instructions for installing pyspark.

I ran this command in the terminal:
pip install pyspark

After installing pyspark, both spark and pyspark run.

I'm not sure what happened, but I can run Spark now.
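One plausible explanation (an assumption, not confirmed in the thread) is that the pip package bundles its own Spark distribution and launcher scripts, so it can work even when the Homebrew install's Java lookup is broken. A sketch of installing and then checking which Spark answers:

```shell
# Install the self-contained pyspark package from PyPI
pip install pyspark

# The bundled distribution reports its own version without needing
# the Homebrew spark-shell wrapper
pyspark --version

# Confirm which launcher the shell now resolves; a pip/conda install
# may have put its scripts ahead of Homebrew's on PATH
which pyspark
which spark-shell
```

If both launchers exist, PATH order decides which one runs, which may be why things "just started working" after the pip install.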


Thanks to @Elliott for the interaction and guidance.

Did you install a newer version of Java? What is the result of
/usr/local/opt/java/bin/java -version
?

Yes, I installed a newer version of java as part of the spark installation. Strange. /usr/local/opt/java/bin/java -version tells me: openjdk version "13.0.2" 2020-01-14 OpenJDK Runtime Environment (build 13.0.2+8) OpenJDK 64-Bit Server VM (build 13.0.2+8, mixed mode, sharing). When I enter
java -version
at the default prompt, I get this: openjdk version "1.8.0_152-release" OpenJDK Runtime Environment (build 1.8.0_152-release-1056-b12) OpenJDK 64-Bit Server VM (build 25.152-b12, mixed mode)

OK. Now try
which java

When I try
which java
, I get this: /Users/skanda_work/opt/anaconda3/bin/java

export JAVA_HOME=/Users/skanda_work/opt/anaconda3

Agreed, pip install can also be done via conda install on a MacBook Pro.

export JAVA_HOME=/usr/local/opt/java
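The comment exchange above hinges on a detail worth spelling out: `which java` ignores JAVA_HOME entirely and simply scans PATH left to right, so with anaconda3/bin ahead of /usr/local/opt/java/bin the Anaconda `java` shadows the Homebrew one no matter what JAVA_HOME says. A small sketch of that lookup rule (a hypothetical helper written for illustration, not code from the thread):

```python
import os

def first_on_path(cmd, path_dirs):
    """Mimic the shell's `which`: return the first executable named
    `cmd` found while scanning path_dirs left to right, else None."""
    for d in path_dirs:
        candidate = os.path.join(d, cmd)
        if os.path.isfile(candidate) and os.access(candidate, os.X_OK):
            return candidate
    return None

# If PATH lists .../anaconda3/bin before /usr/local/opt/java/bin, the
# Anaconda java wins the lookup -- matching the `which java` output in
# the comments -- even though JAVA_HOME points elsewhere. The shell
# launches commands via PATH; JAVA_HOME is only read by tools (like
# Spark's scripts) that consult it explicitly.
```

This is why the two fixes in the thread differ: exporting JAVA_HOME changes what Spark's scripts use, while reordering PATH changes what the bare `java` command resolves to.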