Apache Spark error - getting an installation error
After installing Spark and running

C:\spark-2.3.1-bin-hadoop2.7\bin>spark-shell

I get the following error - any suggestions?
C:\spark-2.3.1-bin-hadoop2.7\bin>spark-shell
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.apache.hadoop.security.authentication.util.KerberosUtil (file:/C:/spark-2.3.1-bin-hadoop2.7/jars/hadoop-auth-2.7.3.jar) to method sun.security.krb5.Config.getInstance()
WARNING: Please consider reporting this to the maintainers of org.apache.hadoop.security.authentication.util.KerberosUtil
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
2018-08-05 01:29:36 WARN NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
Failed to initialize compiler: object java.lang.Object in compiler mirror not found.
** Note that as of 2.8 scala does not assume use of the java classpath.
** For the old behavior pass -usejavacp to scala, or if using a Settings
** object programmatically, settings.usejavacp.value = true.
Failed to initialize compiler: object java.lang.Object in compiler mirror not found.
** Note that as of 2.8 scala does not assume use of the java classpath.
** For the old behavior pass -usejavacp to scala, or if using a Settings
** object programmatically, settings.usejavacp.value = true.
Exception in thread "main" java.lang.NullPointerException
I think you don't have the correct Java or Scala version. Note that Spark 2.3.1 runs on

Java 8+,
Python 2.7+/3.4+ and
R 3.1+.

For the Scala API, Spark 2.3.1 uses Scala 2.11. You need to use a compatible Scala version (2.11.x).
Please check these two things:

1. Check the Java version installed on the machine from which you submit the Spark application:

sudo update-alternatives --config java
sudo update-alternatives --config javac

2. Check the Scala version:

scala -version
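As a quick sanity check, a small helper like the one below (hypothetical, not part of the original answer; assumes a POSIX shell) can extract the Java major version from a `java -version` line and flag anything other than the Java 8 that Spark 2.3.1 expects. Java 8 reports itself as "1.8.x", while Java 9 and later report "9", "10", and so on, which is what trips up the version check:

```shell
#!/bin/sh
# parse_java_major: given the first line of `java -version` output,
# print the major version number.
# Java 8 and earlier use the "1.X" scheme ("1.8.0_181" -> 8);
# Java 9+ lead with the major version ("10.0.2" -> 10).
parse_java_major() {
  # Pull the quoted version string out of the line.
  v=$(printf '%s\n' "$1" | sed 's/.*"\(.*\)".*/\1/')
  case "$v" in
    1.*) printf '%s\n' "${v#1.}" | cut -d. -f1 ;;
    *)   printf '%s\n' "${v%%.*}" ;;
  esac
}

major=$(parse_java_major "$(java -version 2>&1 | head -n 1)")
echo "Java major version: $major"
if [ "$major" != "8" ]; then
  echo "Spark 2.3.1 expects Java 8; found $major" >&2
fi
```

If this reports 9, 10, or higher, pointing `JAVA_HOME` (and `PATH`) at a JDK 8 installation before launching spark-shell should resolve the `Failed to initialize compiler` error.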
Thanks, yes - reinstalled Java/JDK 8 - it works fine now.