
Java Apache Spark: Loading Parquet files from HDFS causes SLF4J and BoneCP errors


I run PySpark like this:

IPYTHON=1 pyspark --master 'local[32]' --driver-class-path '/opt/spark_jars/*'
where /opt/spark_jars/ contains BoneCP, SLF4J, and the various other jars needed when using Spark. I can verify that they all successfully end up on the Spark driver's classpath.
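A minimal sketch of that verification (the jar file names below are assumptions for illustration, not taken from the original setup): scan a classpath string, such as the value of System.getProperty("java.class.path") printed from the driver JVM, for the jars that are expected to be present.

```python
# Sketch of a classpath check, assuming hypothetical jar names; the real
# check would inspect java.class.path as reported by the driver JVM.
def missing_jars(classpath, required):
    """Return the required jar-name fragments not found in any classpath entry."""
    entries = classpath.split(":")
    return [jar for jar in required
            if not any(jar in entry for entry in entries)]

# Hypothetical driver classpath after --driver-class-path '/opt/spark_jars/*'
cp = ("/opt/spark_jars/slf4j-api.jar:"
      "/opt/spark_jars/slf4j-log4j12.jar:"
      "/opt/spark_jars/bonecp.jar")
print(missing_jars(cp, ["slf4j-log4j12", "bonecp"]))  # → []
```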

spark-defaults.conf also contains these lines:

spark.executor.extraClassPath       /opt/spark_connector/*
spark.driver.extraClassPath         /opt/spark_connector/*
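For reference, the same two settings can also be passed per invocation with --conf instead of editing spark-defaults.conf:

```shell
# Equivalent to the spark-defaults.conf entries above, set per invocation
IPYTHON=1 pyspark --master 'local[32]' \
  --conf spark.driver.extraClassPath='/opt/spark_connector/*' \
  --conf spark.executor.extraClassPath='/opt/spark_connector/*'
```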
However, this is what happens when I load a Parquet file stored in HDFS:

In [1]: data = sqlCtx.load('/user/spark/parquet/data')
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
15/09/23 10:52:57 WARN Connection: BoneCP specified but not present in CLASSPATH (or one of dependencies)
15/09/23 10:52:58 WARN Connection: BoneCP specified but not present in CLASSPATH (or one of dependencies)
It is as if a separate Java process is running that does not share the driver or executor classpath, but I can't find any documentation about it. The problem occurs with both Spark 1.3.1 and Spark 1.5.0.