Apache Spark exception when running spark-shell


I have installed Apache Spark on a single node. When I run spark-shell, I get the exception below. Despite the exception, I can still create RDDs and run Scala snippets.

Here is the exception:

16/02/15 14:21:29 WARN ObjectStore: Failed to get database default, returning NoSuchObjectException
16/02/15 14:21:31 WARN : Your hostname, Rahul-PC resolves to a loopback/non-reachable address: fe80:0:0:0:c0c1:cd2e:990d:17ac%e
java.lang.RuntimeException: java.lang.NullPointerException
        at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522)
        at org.apache.spark.sql.hive.client.ClientWrapper.<init>(ClientWrapper.scala:171)
        at org.apache.spark.sql.hive.HiveContext.executionHive$lzycompute(HiveContext.scala:162)
        at org.apache.spark.sql.hive.HiveContext.executionHive(HiveContext.scala:160)
        at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:167)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:408)
        at org.apache.spark.repl.SparkILoop.createSQLContext(SparkILoop.scala:1028)
        at $iwC$$iwC.<init>(<console>:9)
        at $iwC.<init>(<console>:18)
        at <init>(<console>:20)
        at .<init>(<console>:24)
        at .<clinit>(<console>)
        at .<init>(<console>:7)
        at .<clinit>(<console>)
        at $print(<console>)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)

Is there anything else I need to do? Please advise.

I found the solution. Spark needs winutils.exe in order to initialize the Hive context. The C:\Windows\tmp folder that gets created when running spark-shell also needs sufficient permissions.
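Concretely, this usually means pointing HADOOP_HOME at a folder that contains winutils.exe and loosening the permissions on the temp folder before launching the shell. A rough sketch from the Windows command prompt (the C:\hadoop location is only an assumed example; use whatever directory your winutils.exe actually lives in, and adjust the folder path to the one Spark complains about on your machine):

    rem Assumed example layout: winutils.exe sits in C:\hadoop\bin
    set HADOOP_HOME=C:\hadoop
    set PATH=%HADOOP_HOME%\bin;%PATH%

    rem Grant full permissions on the temp folder Spark uses when setting up Hive
    winutils.exe chmod 777 C:\Windows\tmp

    rem Then launch the shell again
    spark-shell

After this the HiveContext should initialize without the NullPointerException.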




Looks similar to this question: when do you get this, on starting spark-shell or while executing some command?
JAVA_HOME = C:\Program Files\Java\jdk1.8.0