Hadoop Pig script fails in the Hue browser


I have the following Pig script:

data = LOAD '/user/test/text.txt' as (text:CHARARRAY) ;

DUMP data;
It works when I run it from the shell, but it fails when run from the Hue browser.

This is the output log:

2015-08-19 15:42:21,033 [JobControl] ERROR com.hadoop.compression.lzo.GPLNativeCodeLoader  - Could not load native gpl library
java.lang.UnsatisfiedLinkError: no gplcompression in java.library.path
    at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1886)
    at java.lang.Runtime.loadLibrary0(Runtime.java:849)
    at java.lang.System.loadLibrary(System.java:1088)
    at com.hadoop.compression.lzo.GPLNativeCodeLoader.<clinit>(GPLNativeCodeLoader.java:32)
    at com.hadoop.compression.lzo.LzoCodec.<clinit>(LzoCodec.java:71)
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:270)
    at org.apache.hadoop.conf.Configuration.getClassByNameOrNull(Configuration.java:2051)
    at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:2016)
    at org.apache.hadoop.io.compress.CompressionCodecFactory.getCodecClasses(CompressionCodecFactory.java:128)
    at org.apache.hadoop.io.compress.CompressionCodecFactory.<init>(CompressionCodecFactory.java:175)
    at org.apache.hadoop.mapreduce.lib.input.TextInputFormat.isSplitable(TextInputFormat.java:58)
    at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.getSplits(FileInputFormat.java:397)
    at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigInputFormat.getSplits(PigInputFormat.java:274)
    at org.apache.hadoop.mapreduce.JobSubmitter.writeNewSplits(JobSubmitter.java:597)
    at org.apache.hadoop.mapreduce.JobSubmitter.writeSplits(JobSubmitter.java:614)
    at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:492)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1306)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1303)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)
    at org.apache.hadoop.mapreduce.Job.submit(Job.java:1303)
    at org.apache.hadoop.mapreduce.lib.jobcontrol.ControlledJob.submit(ControlledJob.java:335)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.pig.backend.hadoop23.PigJobControl.submit(PigJobControl.java:128)
    at org.apache.pig.backend.hadoop23.PigJobControl.run(PigJobControl.java:191)
    at java.lang.Thread.run(Thread.java:745)
Has anyone run into this error before? Please help.


Thanks

Hue submits Pig scripts through Oozie, and Oozie runs them from a node in the cluster. Is LZO installed correctly there?

In my case all services are installed on a single server (DataNode + NameNode + Hue, ...). How can I check that?

I checked with the command rpm -qa | grep lzo; here is the result: hadoop-lzo-0.4.15+cdh5.4.5+0-1.cdh5.4.5.p0.8.el6.x86_64, lzo-devel-2.03-3.1.el6_5.1.x86_64, hadoop-lzo-mr1-0.4.15+cdh5.4.5+0-1.cdh5.4.5.p0.8.el6.x86_64, lzo-minilzo-2.03-3.1.el6_5.1.x86_64

I fixed it with: sudo -u oozie hadoop fs -put /usr/lib/hadoop/lib/native/* /user/oozie/share/lib/lib_20150730103317/pig/
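For reference, a minimal shell sketch of the diagnosis and fix described above. It assumes a CDH-style layout where the native library lives in /usr/lib/hadoop/lib/native, and that the Oozie server answers at the default http://localhost:11000/oozie; the timestamped sharelib directory will differ per installation, so take it from the ls output rather than copying the value here.

# Confirm the LZO packages and the native library are present on this node
rpm -qa | grep lzo
ls /usr/lib/hadoop/lib/native/ | grep gplcompression    # expect libgplcompression.so*

# Find the current Oozie sharelib directory (the timestamped name varies per install)
sudo -u oozie hadoop fs -ls /user/oozie/share/lib/

# Copy the native libraries into the Pig sharelib so Oozie-launched jobs can load them
sudo -u oozie hadoop fs -put /usr/lib/hadoop/lib/native/* /user/oozie/share/lib/lib_20150730103317/pig/

# Tell the Oozie server to pick up the updated sharelib (Oozie 4.x admin command)
oozie admin -oozie http://localhost:11000/oozie -sharelibupdate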