Faunus test on HDP 1.3 fails with com.hadoop.compression.lzo.LzoCodec not found

Tags: hadoop, lzo, titan

Hi, I installed Faunus 0.3.2 on HDP 1.3. When I followed the getting-started test case, I got the following error:

    gremlin> g = FaunusFactory.open('bin/faunus.properties')
    ==>faunusgraph[graphsoninputformat->graphsonoutputformat]
    gremlin> g.V.type.groupCount
    13/09/29 21:38:49 WARN mapreduce.FaunusCompiler: Using the distribution Faunus job jar: lib/faunus-0.3.2-job.jar
    13/09/29 21:38:49 INFO mapreduce.FaunusCompiler: Compiled to 1 MapReduce job(s)
    13/09/29 21:38:49 INFO mapreduce.FaunusCompiler: Executing job 1 out of 1:     MapSequence[com.thinkaurelius.faunus.mapreduce.transform.VerticesMap.Map,   com.thinkaurelius.faunus.mapreduce.sideeffect.ValueGroupCountMapReduce.Map, com.thinkaurelius.faunus.mapreduce.sideeffect.ValueGroupCountMapReduce.Reduce]
    13/09/29 21:38:49 INFO mapreduce.FaunusCompiler: Job data location: output/job-0
    13/09/29 21:38:49 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    13/09/29 21:38:50 INFO input.FileInputFormat: Total input paths to process : 1
    13/09/29 21:38:50 INFO mapred.JobClient: Cleaning up the staging area hdfs://hadoop121.ctd.com:8020/user/root/.staging/job_201309292136_0003
Compression codec com.hadoop.compression.lzo.LzoCodec not found.
Display stack trace? [yN] y
java.lang.RuntimeException: Compression codec com.hadoop.compression.lzo.LzoCodec not found.
        at com.thinkaurelius.faunus.tinkerpop.gremlin.ResultHookClosure.call(ResultHookClosure.java:54)
        at groovy.lang.Closure.call(Closure.java:428)
        at sun.reflect.GeneratedMethodAccessor22.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite$PogoCachedMethodSite.invoke(PogoMetaMethodSite.java:231)
        at org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite.call(PogoMetaMethodSite.java:64)
        at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:116)
        at org.codehaus.groovy.tools.shell.Groovysh.setLastResult(Groovysh.groovy:324)
        at org.codehaus.groovy.tools.shell.Groovysh.this$3$setLastResult(Groovysh.groovy)
        at sun.reflect.GeneratedMethodAccessor21.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:90)
        at groovy.lang.MetaMethod.doMethodInvoke(MetaMethod.java:233)
        at groovy.lang.MetaClassImpl.setProperty(MetaClassImpl.java:2416)
        at groovy.lang.MetaClassImpl.setProperty(MetaClassImpl.java:3347)
        at org.codehaus.groovy.tools.shell.Shell.setProperty(Shell.groovy)
        at org.codehaus.groovy.runtime.ScriptBytecodeAdapter.setGroovyObjectProperty(ScriptBytecodeAdapter.java:528)
        at org.codehaus.groovy.tools.shell.Groovysh.execute(Groovysh.groovy:152)
        at org.codehaus.groovy.tools.shell.Shell.leftShift(Shell.groovy:114)
        at org.codehaus.groovy.tools.shell.Shell$leftShift$0.call(Unknown Source)
        at org.codehaus.groovy.tools.shell.ShellRunner.work(ShellRunner.groovy:88)
        at org.codehaus.groovy.tools.shell.InteractiveShellRunner.super$2$work(InteractiveShellRunner.groovy)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:90)
        at groovy.lang.MetaMethod.doMethodInvoke(MetaMethod.java:233)
        at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:1079)
        at org.codehaus.groovy.runtime.ScriptBytecodeAdapter.invokeMethodOnSuperN(ScriptBytecodeAdapter.java:128)
        at org.codehaus.groovy.runtime.ScriptBytecodeAdapter.invokeMethodOnSuper0(ScriptBytecodeAdapter.java:148)
        at org.codehaus.groovy.tools.shell.InteractiveShellRunner.work(InteractiveShellRunner.groovy:100)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite$PogoCachedMethodSiteNoUnwrapNoCoerce.invoke(PogoMetaMethodSite.java:272)
        at org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite.callCurrent(PogoMetaMethodSite.java:52)
        at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callCurrent(AbstractCallSite.java:137)
        at org.codehaus.groovy.tools.shell.ShellRunner.run(ShellRunner.groovy:57)
        at org.codehaus.groovy.tools.shell.InteractiveShellRunner.super$2$run(InteractiveShellRunner.groovy)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:90)
        at groovy.lang.MetaMethod.doMethodInvoke(MetaMethod.java:233)
        at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:1079)
        at org.codehaus.groovy.runtime.ScriptBytecodeAdapter.invokeMethodOnSuperN(ScriptBytecodeAdapter.java:128)
        at org.codehaus.groovy.runtime.ScriptBytecodeAdapter.invokeMethodOnSuper0(ScriptBytecodeAdapter.java:148)
        at org.codehaus.groovy.tools.shell.InteractiveShellRunner.run(InteractiveShellRunner.groovy:66)
        at com.thinkaurelius.faunus.tinkerpop.gremlin.Console.<init>(Console.java:54)
        at com.thinkaurelius.faunus.tinkerpop.gremlin.Console.<init>(Console.java:61)
        at com.thinkaurelius.faunus.tinkerpop.gremlin.Console.main(Console.java:66)
Caused by: java.lang.IllegalArgumentException: Compression codec com.hadoop.compression.lzo.LzoCodec not found.
        at org.apache.hadoop.io.compress.CompressionCodecFactory.getCodecClasses(CompressionCodecFactory.java:96)
        at org.apache.hadoop.io.compress.CompressionCodecFactory.<init>(CompressionCodecFactory.java:134)
        at com.thinkaurelius.faunus.formats.graphson.GraphSONInputFormat.isSplitable(GraphSONInputFormat.java:33)
        at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.getSplits(FileInputFormat.java:258)
        at org.apache.hadoop.mapred.JobClient.writeNewSplits(JobClient.java:1024)
        at org.apache.hadoop.mapred.JobClient.writeSplits(JobClient.java:1041)
        at org.apache.hadoop.mapred.JobClient.access$700(JobClient.java:179)
        at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:959)
        at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:912)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1149)
        at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:912)
        at org.apache.hadoop.mapreduce.Job.submit(Job.java:500)
        at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:530)
        at com.thinkaurelius.faunus.mapreduce.FaunusCompiler.run(FaunusCompiler.java:322)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
        at com.thinkaurelius.faunus.FaunusPipeline.submit(FaunusPipeline.java:1075)
        at com.thinkaurelius.faunus.FaunusPipeline.submit(FaunusPipeline.java:1058)
        at com.thinkaurelius.faunus.tinkerpop.gremlin.ResultHookClosure.call(ResultHookClosure.java:38)
        ... 55 more
Caused by: java.lang.ClassNotFoundException: com.hadoop.compression.lzo.LzoCodec
        at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
        at java.lang.Class.forName0(Native Method)
        at java.lang.Class.forName(Class.java:247)
        at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:802)
        at org.apache.hadoop.io.compress.CompressionCodecFactory.getCodecClasses(CompressionCodecFactory.java:89)
        ... 75 more
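
The bottom of the trace shows where things go wrong: when the job is submitted, Hadoop's CompressionCodecFactory reads the codec list from io.compression.codecs in the cluster configuration and loads each class with Class.forName. On this HDP 1.3 cluster that list includes com.hadoop.compression.lzo.LzoCodec, so the JVM running the Gremlin console must also have the hadoop-lzo jar on its classpath, otherwise the ClassNotFoundException above is the result. A minimal sketch of that resolution step (the class name LzoCodecCheck and the other codec entries are illustrative; only the LzoCodec entry is confirmed by the trace):

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.io.compress.CompressionCodecFactory;

    public class LzoCodecCheck {
        public static void main(String[] args) {
            // Codec list as CompressionCodecFactory.getCodecClasses() would read it
            // from io.compression.codecs in core-site.xml; the LZO entry is the one
            // that fails on this cluster.
            Configuration conf = new Configuration();
            conf.set("io.compression.codecs",
                    "org.apache.hadoop.io.compress.DefaultCodec,"
                  + "org.apache.hadoop.io.compress.GzipCodec,"
                  + "com.hadoop.compression.lzo.LzoCodec");

            // With hadoop-lzo missing from the classpath, this constructor throws
            // IllegalArgumentException: Compression codec
            // com.hadoop.compression.lzo.LzoCodec not found.
            new CompressionCodecFactory(conf);
            System.out.println("all configured codecs resolved");
        }
    }

The LZO entry in the HDP 1.3 core-site.xml looks like this: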
    <property>
      <name>io.compression.codec.lzo.class</name>
      <value>com.hadoop.compression.lzo.LzoCodec</value>
    </property>
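
To confirm what the cluster actually asks for before launching Faunus, one can dump the codec-related settings from the client-side Hadoop configuration. A minimal sketch, assuming the usual HDP 1.3 config path /etc/hadoop/conf/core-site.xml (the class name PrintCodecConfig is made up for illustration):

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;

    public class PrintCodecConfig {
        public static void main(String[] args) {
            Configuration conf = new Configuration();
            // Assumed HDP 1.3 client config location; adjust to the cluster's layout.
            conf.addResource(new Path("/etc/hadoop/conf/core-site.xml"));
            System.out.println("io.compression.codecs          = " + conf.get("io.compression.codecs"));
            System.out.println("io.compression.codec.lzo.class = " + conf.get("io.compression.codec.lzo.class"));
        }
    }

Once the LZO classes resolve in the console's JVM (here, presumably after making the hadoop-lzo jar and its native libraries visible to the Gremlin/Faunus launcher), the same test submits cleanly and the native codec loads: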
gremlin> g = FaunusFactory.open('bin/faunus.properties')
==>faunusgraph[graphsoninputformat->graphsonoutputformat]
gremlin> g.V                                            
13/09/30 21:57:51 WARN mapreduce.FaunusCompiler: Using the distribution Faunus job jar: lib/faunus-0.3.2-job.jar
13/09/30 21:57:51 INFO mapreduce.FaunusCompiler: Compiled to 1 MapReduce job(s)
13/09/30 21:57:51 INFO mapreduce.FaunusCompiler: Executing job 1 out of 1: MapSequence[com.thinkaurelius.faunus.mapreduce.transform.VerticesMap.Map]
13/09/30 21:57:51 INFO mapreduce.FaunusCompiler: Job data location: output/job-0
13/09/30 21:57:51 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
13/09/30 21:57:52 INFO input.FileInputFormat: Total input paths to process : 1
13/09/30 21:57:52 INFO lzo.GPLNativeCodeLoader: Loaded native gpl library
13/09/30 21:57:52 INFO lzo.LzoCodec: Successfully loaded & initialized native-lzo library [hadoop-lzo rev cf4e7cbf8ed0f0622504d008101c2729dc0c9ff3]
13/09/30 21:57:52 WARN snappy.LoadSnappy: Snappy native library is available
13/09/30 21:57:52 INFO util.NativeCodeLoader: Loaded the native-hadoop library
13/09/30 21:57:52 INFO snappy.LoadSnappy: Snappy native library loaded
13/09/30 21:57:53 INFO mapred.JobClient: Running job: job_201309302049_0010
13/09/30 21:57:54 INFO mapred.JobClient:  map 0% reduce 0%