Hadoop "Unable to load native-hadoop library for your platform" error on docker-spark?

Tags: Hadoop, Apache Spark, Docker

I am using . After starting the spark-shell, it outputs:

15/05/21 04:28:22 DEBUG NativeCodeLoader: Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError:no hadoop in java.library.path
15/05/21 04:28:22 DEBUG NativeCodeLoader: java.library.path=/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib
The environment variables in this Spark container are:

bash-4.1# export
declare -x BOOTSTRAP="/etc/bootstrap.sh"
declare -x HADOOP_COMMON_HOME="/usr/local/hadoop"
declare -x HADOOP_CONF_DIR="/usr/local/hadoop/etc/hadoop"
declare -x HADOOP_HDFS_HOME="/usr/local/hadoop"
declare -x HADOOP_MAPRED_HOME="/usr/local/hadoop"
declare -x HADOOP_PREFIX="/usr/local/hadoop"
declare -x HADOOP_YARN_HOME="/usr/local/hadoop"
declare -x HOME="/"
declare -x HOSTNAME="sandbox"
declare -x JAVA_HOME="/usr/java/default"
declare -x OLDPWD
declare -x PATH="/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/java/default/bin:/usr/local/spark/bin:/usr/local/hadoop/bin"
declare -x PWD="/"
declare -x SHLVL="3"
declare -x SPARK_HOME="/usr/local/spark"
declare -x SPARK_JAR="hdfs:///spark/spark-assembly-1.3.0-hadoop2.4.0.jar"
declare -x TERM="xterm"
declare -x YARN_CONF_DIR="/usr/local/hadoop/etc/hadoop"
After consulting , I did the following:

(1) Checked the hadoop library:

bash-4.1# file /usr/local/hadoop/lib/native/libhadoop.so.1.0.0
/usr/local/hadoop/lib/native/libhadoop.so.1.0.0: ELF 64-bit LSB shared object, x86-64, version 1 (SYSV), dynamically linked, not stripped
Yes, it is 64-bit.
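Since Hadoop 2.4.0 there is also a built-in check for exactly this; a minimal sketch, assuming the container's paths shown above (`/usr/local/hadoop`):

```shell
# Sketch: confirm the native hadoop library is present and 64-bit.
# Inside the container you could also run:
#   $HADOOP_PREFIX/bin/hadoop checknative -a   # available since Hadoop 2.4.0
NATIVE_DIR=/usr/local/hadoop/lib/native
if ls "$NATIVE_DIR"/libhadoop.so.* >/dev/null 2>&1; then
  file "$NATIVE_DIR"/libhadoop.so.*   # expect "ELF 64-bit LSB shared object"
else
  echo "no native hadoop library found under $NATIVE_DIR"
fi
```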

(2) Tried adding the HADOOP_OPTS environment variable:

export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=/usr/local/hadoop/lib/native"
It did not work and reported the same error.

(3) Tried adding the HADOOP_OPTS and HADOOP_COMMON_LIB_NATIVE_DIR environment variables:

export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib"
It still did not work and reported the same error.
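One possible reason the exports above change nothing is that spark-shell launches its own JVM and never reads HADOOP_OPTS, which only the hadoop launcher scripts honor. As an assumption-based alternative (not the accepted fix), the property could be handed to the driver JVM directly via Spark's `--driver-java-options` flag:

```shell
# Sketch: build the -D flag for the driver JVM; HADOOP_OPTS is only
# honored by the hadoop launcher scripts, not by spark-shell.
NATIVE_DIR=/usr/local/hadoop/lib/native
DRIVER_OPTS="-Djava.library.path=$NATIVE_DIR"
# The invocation would then be (not run here):
#   spark-shell --driver-java-options "$DRIVER_OPTS"
echo "$DRIVER_OPTS"
```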


Can anyone offer some clues about this problem?

Adding the Hadoop libraries to LD_LIBRARY_PATH fixed this problem for me:

export LD_LIBRARY_PATH="$HADOOP_HOME/lib/native/:$LD_LIBRARY_PATH"
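To keep this fix across shell sessions, it could also be persisted in `conf/spark-env.sh`, which Spark's launch scripts source on startup. A minimal sketch, assuming `SPARK_HOME=/usr/local/spark` as in the environment dump above (the `/tmp` fallback is only so the snippet runs anywhere):

```shell
# Sketch: append the LD_LIBRARY_PATH fix to spark-env.sh so every new
# spark-shell picks it up without a manual export.
SPARK_ENV="${SPARK_HOME:-/tmp/spark}/conf/spark-env.sh"
mkdir -p "$(dirname "$SPARK_ENV")"
cat >> "$SPARK_ENV" <<'EOF'
export LD_LIBRARY_PATH="/usr/local/hadoop/lib/native:$LD_LIBRARY_PATH"
EOF
tail -n 1 "$SPARK_ENV"
```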

Thanks, this solved my problem. I wonder how it happened, since I did not get this error after a clean Spark install… Strangely, it is not mentioned in the Spark documentation (), but it works! Thanks.