Apache Spark | YARN FileNotFoundException

I am running a 3-node CDH cluster with 1 master and 2 worker nodes. I have a web application, written in Java, that submits Spark jobs to YARN, and it is now failing with the error below. The web application is deployed on Tomcat, which runs as a different OS user.

Application application_1502437323246_0010 failed 2 times due to AM Container for appattempt_1502437323246_0010_000002 exited with exitCode: -1000
For more detailed output, check application tracking page:, click on links to logs of each attempt.
Diagnostics: File file:/home/user/tomcat/apache-tomcat-8.0.38/temp/spark-1692c53f-313a-41c1-9581-e716c244b7c8/__spark_libs__4041232999285325500.zip does not exist
java.io.FileNotFoundException: File file:/home/user/tomcat/apache-tomcat-8.0.38/temp/spark-1692c53f-313a-41c1-9581-e716c244b7c8/__spark_libs__4041232999285325500.zip does not exist
at org.apache.hadoop.fs.RawLocalFileSystem.deprecatedGetFileStatus(RawLocalFileSystem.java:598)
at org.apache.hadoop.fs.RawLocalFileSystem.getFileLinkStatusInternal(RawLocalFileSystem.java:811)
at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:588)
at org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:425)
at org.apache.hadoop.yarn.util.FSDownload.copy(FSDownload.java:251)
at org.apache.hadoop.yarn.util.FSDownload.access$000(FSDownload.java:61)
at org.apache.hadoop.yarn.util.FSDownload$2.run(FSDownload.java:359)
at org.apache.hadoop.yarn.util.FSDownload$2.run(FSDownload.java:357)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1917)
at org.apache.hadoop.yarn.util.FSDownload.call(FSDownload.java:356)
at org.apache.hadoop.yarn.util.FSDownload.call(FSDownload.java:60)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
Failing this attempt. Failing the application.
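
For context, the job is kicked off from inside the web application, along the lines of the minimal sketch below. This assumes the driver is a SparkSession created directly in the Tomcat JVM (yarn master, client deploy mode); the app name and workload are placeholders, not my real code:

import org.apache.spark.sql.SparkSession;

public class SparkJobService {
    // Minimal sketch: run a Spark job on YARN from inside the Tomcat JVM.
    // Assumes HADOOP_CONF_DIR/YARN_CONF_DIR are visible to the Tomcat process.
    public void runJob() {
        SparkSession spark = SparkSession.builder()
                .appName("webapp-spark-job")        // placeholder app name
                .master("yarn")                     // deploy mode defaults to "client"
                .getOrCreate();
        try {
            long count = spark.range(1000).count(); // placeholder workload
            System.out.println("count = " + count);
        } finally {
            spark.stop();
        }
    }
}

If the driver runs this way, java.io.tmpdir inside Tomcat points at apache-tomcat-8.0.38/temp, which presumably is why the __spark_libs__ zip is being staged there.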

It looks like the worker nodes do not have access to the file location above; ideally these files should be created on HDFS so that the worker nodes can access them.
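
If so, I am guessing the fix is to upload the Spark runtime jars to HDFS once and point spark.yarn.jars (or spark.yarn.archive) at them, so that YARN containers fetch the libraries from HDFS rather than from a local temp copy. Something like the following, where the HDFS path is only an example:

# Upload the Spark runtime jars to HDFS once (example path):
hdfs dfs -mkdir -p /user/spark/share/jars
hdfs dfs -put $SPARK_HOME/jars/*.jar /user/spark/share/jars/

# Then in spark-defaults.conf (or via SparkSession .config()):
spark.yarn.jars hdfs:///user/spark/share/jars/*.jar

As I understand it, when neither spark.yarn.jars nor spark.yarn.archive is set, Spark zips the local jars into a __spark_libs__*.zip under the driver's temp directory for each application, which matches the path in the stack trace.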

Questions

1) What are these files, and why are they created under Tomcat's temp folder?
2) Is there a configuration that would create these files on HDFS instead and so resolve the above error?
3) Are there any other considerations when running in "client" deploy mode?

Any other information would be helpful, as I am new to Spark and HDFS. I am using the default configuration of CDH 5.12 along with the Spark 2.1.0 distribution.