Apache Spark: Exception when starting the Spark History Server

Tags: apache-spark, apache-spark-sql

I have been trying to start the Spark History Server. I have the following settings in spark-defaults.conf:

spark.eventLog.enabled           true
spark.eventLog.dir              /home/user/Pictures
spark.history.fs.logDirectory   /home/user/Pictures
Whenever I run start-history-server.sh, I get the following exception:

20/03/06 12:03:11 INFO history.FsHistoryProvider: History server ui acls disabled; users with admin permissions: ; groups with admin permissions
Exception in thread "main" java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at org.apache.spark.deploy.history.HistoryServer$.main(HistoryServer.scala:296)
    at org.apache.spark.deploy.history.HistoryServer.main(HistoryServer.scala)
Caused by: java.io.FileNotFoundException: Log directory specified does not exist: /home/user/Pictures
    at org.apache.spark.deploy.history.FsHistoryProvider.org$apache$spark$deploy$history$FsHistoryProvider$$startPolling(FsHistoryProvider.scala:267)
    at org.apache.spark.deploy.history.FsHistoryProvider.initialize(FsHistoryProvider.scala:211)
    at org.apache.spark.deploy.history.FsHistoryProvider.<init>(FsHistoryProvider.scala:207)
    at org.apache.spark.deploy.history.FsHistoryProvider.<init>(FsHistoryProvider.scala:86)
    ... 6 more
Caused by: java.io.FileNotFoundException: File does not exist: /home/user/Pictures
    at org.apache.hadoop.hdfs.DistributedFileSystem$18.doCall(DistributedFileSystem.java:1122)
    at org.apache.hadoop.hdfs.DistributedFileSystem$18.doCall(DistributedFileSystem.java:1114)
    at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1114)
    at org.apache.spark.deploy.history.FsHistoryProvider.org$apache$spark$deploy$history$FsHistoryProvider$$startPolling(FsHistoryProvider.scala:257)
    ... 9 more

The directory /home/user/Pictures does exist. Spark always seems to look for the required directory on HDFS. I have also set up HDFS for Spark standalone mode, but I am currently running the Spark application in local[] mode.
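
A quick way to check which filesystem the path is being resolved against — a sketch assuming the hdfs CLI is on the PATH:

ls -ld /home/user/Pictures        # local filesystem: the directory exists here
hdfs dfs -ls /home/user/Pictures  # HDFS: this is where the history server is looking,
                                  # per the DistributedFileSystem frames in the stack trace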


Can anyone help me solve this issue?

If the directory is local, try prefixing the path with file://, i.e. file:///home/user/Pictures, in both the event log and the log directory settings. Hope this helps.
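
For example, a sketch of the corrected spark-defaults.conf, reusing the paths from the question:

spark.eventLog.enabled           true
spark.eventLog.dir               file:///home/user/Pictures
spark.history.fs.logDirectory    file:///home/user/Pictures

After saving the change, restart the history server (./sbin/start-history-server.sh from the Spark installation directory) so the new settings take effect.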

Can you tell me why I need to add file://? Previously (before I installed HDFS), I just entered the path without the prefix and it worked fine.

That is the syntax, and I can only suggest a solution based on the information you provide. This is likely why it cannot resolve the location. Please test it first, and come back if the problem persists.
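
To illustrate why the prefix matters once HDFS is installed: paths without a scheme are resolved against Hadoop's default filesystem (fs.defaultFS). A minimal sketch of a core-site.xml that would cause this behavior — the host and port here are assumptions:

<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value>
    <!-- With this set, an unqualified path such as /home/user/Pictures is resolved as
         hdfs://localhost:9000/home/user/Pictures rather than as a local directory;
         prefixing with file:/// forces the local filesystem instead. -->
  </property>
</configuration>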