
Java: Running Hive locally with LZO's native libraries


I'm trying to run Hive locally on OSX Mountain Lion, following these instructions:

I've compiled the native OSX libraries and the jar, but I'm not sure how to start Hive locally so that Hive/Hadoop picks up the native libraries.

I've tried including it via the JAVA_LIBRARY_PATH environment variable, but I think that only applies to Hadoop:

export JAVA_LIBRARY_PATH="${SCRIPTS_DIR}/jars/native/Mac_OS_X-x86_64-64"
When I run Hive using the LzopCodec, for example:

SET mapred.output.compression.codec = com.hadoop.compression.lzo.LzopCodec;
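For the codec to actually be exercised, output compression also has to be switched on. A minimal session setup might look like the following; the individual properties are standard Hadoop 1.x / Hive settings, but this particular combination is my sketch, not taken from the original post:

```sql
-- Enable compressed output and select the LZOP codec
-- (Hadoop 1.x / Hive property names, matching the SET above).
SET hive.exec.compress.output=true;
SET mapred.output.compress=true;
SET mapred.output.compression.codec=com.hadoop.compression.lzo.LzopCodec;
```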
When I run a query that launches a map/reduce job, I get the following error:

SELECT COUNT(*) from test_table;


Job running in-process (local Hadoop)
org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: native-lzo library not available
        at org.apache.hadoop.hive.ql.io.HiveFileFormatUtils.getHiveRecordWriter(HiveFileFormatUtils.java:237)
        at org.apache.hadoop.hive.ql.exec.FileSinkOperator.createBucketFiles(FileSinkOperator.java:477)
        at org.apache.hadoop.hive.ql.exec.FileSinkOperator.processOp(FileSinkOperator.java:525)
        at org.apache.hadoop.hive.ql.exec.Operator.process(Operator.java:471)
        at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:762)
        at org.apache.hadoop.hive.ql.exec.SelectOperator.processOp(SelectOperator.java:84)
        at org.apache.hadoop.hive.ql.exec.Operator.process(Operator.java:471)
        at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:762)
        at org.apache.hadoop.hive.ql.exec.GroupByOperator.forward(GroupByOperator.java:959)
        at org.apache.hadoop.hive.ql.exec.GroupByOperator.closeOp(GroupByOperator.java:995)
        at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:557)
        at org.apache.hadoop.hive.ql.exec.ExecReducer.close(ExecReducer.java:303)
        at org.apache.hadoop.mapred.ReduceTask.runOldReducer(ReduceTask.java:530)
        at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:421)
        at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:262)
Caused by: java.lang.RuntimeException: native-lzo library not available
        at com.hadoop.compression.lzo.LzoCodec.getCompressorType(LzoCodec.java:155)
        at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:100)
        at com.hadoop.compression.lzo.LzopCodec.getCompressor(LzopCodec.java:135)
        at com.hadoop.compression.lzo.LzopCodec.createOutputStream(LzopCodec.java:70)
        at org.apache.hadoop.hive.ql.exec.Utilities.createCompressedStream(Utilities.java:868)
        at org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat.getHiveRecordWriter(HiveIgnoreKeyTextOutputFormat.java:80)
        at org.apache.hadoop.hive.ql.io.HiveFileFormatUtils.getRecordWriter(HiveFileFormatUtils.java:246)
        at org.apache.hadoop.hive.ql.io.HiveFileFormatUtils.getHiveRecordWriter(HiveFileFormatUtils.java:234)
        ... 14 more
I also tried setting an LD_LIBRARY_PATH via mapred.child.env in the Hive script (no luck):
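The original snippet for this attempt is missing; a hypothetical reconstruction is shown below (the path is a placeholder). `mapred.child.env` is a real Hadoop 1.x property, but it only affects the environment of spawned child task JVMs, which is likely why it has no effect under the in-process LocalJobRunner used in local mode:

```sql
-- Hypothetical reconstruction of the attempted setting; placeholder path.
SET mapred.child.env=LD_LIBRARY_PATH=/path/to/native/Mac_OS_X-x86_64-64;
```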


After reading the instructions again more carefully:

"How to configure Hadoop to use these classes"

Basically, I just needed to copy the built native libraries into the Hadoop installation:

ant compile-native tar
cp -r build/hadoop-lzo-0.4.17-SNAPSHOT/lib/native/Mac_OS_X-x86_64-64 /usr/local/Cellar/hadoop/1.1.2/libexec/lib/native/
# Copy the native library
tar -cBf - -C build/hadoop-gpl-compression-0.1.0-dev/lib/native . | tar -xBvf - -C /path/to/hadoop/dist/lib/native
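To see why the copy step matters: the `native-lzo library not available` error ultimately means a `System.loadLibrary` lookup failed inside the Hive/Hadoop JVM (hadoop-lzo loads a library named `gplcompression`). A small stand-alone sketch, everything in it illustrative, shows what the JVM actually searches for and where:

```java
// Demonstrates how the JVM resolves a native library request such as
// hadoop-lzo's System.loadLibrary("gplcompression").
public class NativeLibCheck {
    public static void main(String[] args) {
        // Platform-specific file name the JVM looks for,
        // e.g. libgplcompression.so on Linux.
        String fileName = System.mapLibraryName("gplcompression");
        System.out.println("JVM searches for: " + fileName);

        // Directories searched, in order. Hadoop's scripts build this
        // from JAVA_LIBRARY_PATH, which is why the compiled library
        // must end up in a directory on this path (e.g. lib/native/...).
        System.out.println("java.library.path = "
                + System.getProperty("java.library.path"));
    }
}
```

In short: either `java.library.path` must include the directory containing the compiled library, or the library must be copied to where Hadoop already looks for natives, which is what the `cp` above accomplishes.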