Apache Spark: cannot run spark-submit or spark-shell

On my Hadoop cluster, whenever I run spark-submit, spark-shell, or pyspark, it just repeats the following indefinitely:

16/01/15 16:27:50 INFO Client: Application report for application_1452870745977_0005 (state: ACCEPTED)
16/01/15 16:27:51 INFO Client: Application report for application_1452870745977_0005 (state: ACCEPTED)
16/01/15 16:27:52 INFO Client: Application report for application_1452870745977_0005 (state: ACCEPTED)
16/01/15 16:27:53 INFO Client: Application report for application_1452870745977_0005 (state: ACCEPTED)
16/01/15 16:27:54 INFO Client: Application report for application_1452870745977_0005 (state: ACCEPTED)
16/01/15 16:27:55 INFO Client: Application report for application_1452870745977_0005 (state: ACCEPTED)
16/01/15 16:27:56 INFO Client: Application report for application_1452870745977_0005 (state: ACCEPTED)
16/01/15 16:27:57 INFO Client: Application report for application_1452870745977_0005 (state: ACCEPTED)
16/01/15 16:27:58 INFO Client: Application report for application_1452870745977_0005 (state: ACCEPTED)
16/01/15 16:27:59 INFO Client: Application report for application_1452870745977_0005 (state: ACCEPTED)
16/01/15 16:28:00 INFO Client: Application report for application_1452870745977_0005 (state: ACCEPTED)
16/01/15 16:28:01 INFO Client: Application report for application_1452870745977_0005 (state: ACCEPTED)
16/01/15 16:28:02 INFO Client: Application report for application_1452870745977_0005 (state: ACCEPTED)

I have waited a long time, but this message never goes away... Does anyone know what is wrong with spark-shell?
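
An application stuck in the ACCEPTED state means YARN has queued the job but cannot allocate a container for the ApplicationMaster. As a sketch (standard YARN and shell commands; adjust options and paths for your cluster), these checks help narrow down whether any NodeManager actually has capacity to offer:

# List NodeManagers in every state; UNHEALTHY nodes (for example,
# nodes whose local disks are nearly full) contribute no capacity.
yarn node -list -all

# See which applications are queued or running and competing for resources.
yarn application -list

# On each worker node, check local disk usage directly.
df -h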

This issue has been resolved. My administrator told me that many of the Hadoop nodes had run out of local disk space because of very large log files.

After the log files were trimmed and space was freed, Spark started working again.
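
For context on why full disks cause this symptom: the NodeManager's disk health checker marks a node UNHEALTHY once local disk utilization crosses the threshold set by yarn.nodemanager.disk-health-checker.max-disk-utilization-per-disk-percentage (90% by default), and unhealthy nodes offer no containers, so applications sit in ACCEPTED forever. A minimal cleanup sketch, assuming the logs live under /var/log/hadoop-yarn (an example path; use your cluster's actual log directory):

# Find the largest files under the log directory.
du -ah /var/log/hadoop-yarn | sort -rh | head -20

# Empty an oversized log in place; truncating keeps the file handle
# valid for a daemon that still has it open, unlike deleting the file.
truncate -s 0 /var/log/hadoop-yarn/yarn-nodemanager.log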

Make sure the cluster has enough resources (memory/CPU) to run the application. Which version of Spark are you using? I am on Spark 1.3.0 with Cloudera 5.4.1.
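
If resources are the suspect, submitting with explicit, modest requests makes it obvious whether the ask fits the queue's capacity. A sketch using standard spark-submit flags from the Spark 1.x era (the class and jar names are placeholders for illustration):

# com.example.MyApp and myapp.jar are placeholders, not real artifacts.
spark-submit \
  --master yarn-client \
  --num-executors 2 \
  --executor-memory 1g \
  --executor-cores 1 \
  --class com.example.MyApp \
  myapp.jar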