
Hadoop: Running Spark on Mesos with the Spark package on HDFS

Tags: hadoop, apache-spark, hdfs, mesos, mesosphere

I am trying to run pyspark on Mesos, and my Spark package has been uploaded to a locally accessible HDFS.

However, whenever I try to launch the shell, I always get the following error:

16/04/18 22:21:33 INFO CoarseMesosSchedulerBackend: Blacklisting Mesos slave a8509a2e-fad8-485d-850e-8243fd2ee449-S0 due to too many failures; is Spark installed on it?
I added the following to spark-env.sh:

export MESOS_NATIVE_JAVA_LIBRARY=/usr/local/lib/libmesos.so
export SPARK_EXECUTOR_URI=hdfs://ipaddress:9000/spark/spark-1.6.1-bin-hadoop2.6

But it still seems the slaves cannot find the Spark package to execute. How can I fix this?
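
For context, the shell is typically started against the Mesos master like this; the master address below is an assumption, not taken from the question:

# launch the PySpark shell against a Mesos master (address is illustrative)
pyspark --master mesos://<mesos-master-ip>:5050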

Is HADOOP_HOME configured on your Mesos slaves? You should have a tar.gz file such as spark-1.6.1-bin-hadoop2.6.tar.gz in HDFS.
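
A minimal sketch of what this answer describes, assuming Spark 1.6.1 built for Hadoop 2.6 and the same HDFS path as in the question; the Hadoop installation path and the "ipaddress" placeholder are illustrative:

# package the Spark distribution and upload the archive to HDFS
tar czf spark-1.6.1-bin-hadoop2.6.tar.gz spark-1.6.1-bin-hadoop2.6
hdfs dfs -mkdir -p /spark
hdfs dfs -put spark-1.6.1-bin-hadoop2.6.tar.gz /spark/

# in spark-env.sh on the driver host, point the executor URI at the archive, not the directory
export MESOS_NATIVE_JAVA_LIBRARY=/usr/local/lib/libmesos.so
export SPARK_EXECUTOR_URI=hdfs://ipaddress:9000/spark/spark-1.6.1-bin-hadoop2.6.tar.gz

# on each Mesos slave, make sure a Hadoop installation is visible so the fetcher can read from HDFS
export HADOOP_HOME=/path/to/hadoop

With the URI pointing at the tar.gz, the Mesos fetcher on each slave can download and unpack the Spark distribution before starting the executor, which is what the blacklisting error indicates was failing.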