
HBase Snappy installation problem


I am running into the following problem while setting up Snappy on a Hadoop/HBase cluster. I have copied libsnappy.so and libhadoop.so to $HBASE_HOME_DIR/lib/native/Linux-amd64-64. Any idea what is going wrong here?

Exception in thread "main" java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.compress.snappy.SnappyCompressor.compressBytesDirect()I
        at org.apache.hadoop.io.compress.snappy.SnappyCompressor.compressBytesDirect(Native Method)
        at org.apache.hadoop.io.compress.snappy.SnappyCompressor.compress(SnappyCompressor.java:229)
        at org.apache.hadoop.io.compress.BlockCompressorStream.compress(BlockCompressorStream.java:141)
        at org.apache.hadoop.io.compress.BlockCompressorStream.finish(BlockCompressorStream.java:135)
        at org.apache.hadoop.hbase.io.hfile.HFileBlock$Writer.version20compression(HFileBlock.java:878)
        at org.apache.hadoop.hbase.io.hfile.HFileBlock$Writer.doCompressionAndChecksumming(HFileBlock.java:858)
        at org.apache.hadoop.hbase.io.hfile.HFileBlock$Writer.finishBlock(HFileBlock.java:845)
        at org.apache.hadoop.hbase.io.hfile.HFileBlock$Writer.ensureBlockReady(HFileBlock.java:823)
        at org.apache.hadoop.hbase.io.hfile.HFileBlock$Writer.writeHeaderAndData(HFileBlock.java:1060)
        at org.apache.hadoop.hbase.io.hfile.HFileBlock$Writer.writeHeaderAndData(HFileBlock.java:1047)
        at org.apache.hadoop.hbase.io.hfile.HFileWriterV2.finishBlock(HFileWriterV2.java:191)
        at org.apache.hadoop.hbase.io.hfile.HFileWriterV2.close(HFileWriterV2.java:372)
        at org.apache.hadoop.hbase.util.CompressionTest.doSmokeTest(CompressionTest.java:114)
        at org.apache.hadoop.hbase.util.CompressionTest.main(CompressionTest.java:138)
13/07/30 17:16:10 ERROR hdfs.DFSClient: Exception closing file /hbase/to/test.txt : org.apache.hadoop.ipc.RemoteException: org.apache.hadoop.hdfs.server.namenode.LeaseExpiredException: No lease on /hbase/to/test.txt File does not exist. Holder DFSClient_931166160 does not have any open files
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkLease(FSNamesystem.java:1999)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkLease(FSNamesystem.java:1990)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.completeFileInternal(FSNamesystem.java:2045)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.completeFile(FSNamesystem.java:2033)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.complete(NameNode.java:805)
        at sun.reflect.GeneratedMethodAccessor15.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:587)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1432)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1428)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1190)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1426)
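The stack trace above comes from HBase's bundled compression smoke test, `org.apache.hadoop.hbase.util.CompressionTest`. A minimal sketch of how it is typically invoked is below; the HDFS path is just an example target, and `/usr/local/lib` is an assumed install location for the native libraries, not something taken from the question.

```shell
# Run HBase's compression smoke test for Snappy (the utility that
# produced the UnsatisfiedLinkError above). The file path is only a
# scratch target; it does not need to exist beforehand.
hbase org.apache.hadoop.hbase.util.CompressionTest \
  hdfs:///hbase/to/test.txt snappy

# If the native libraries are not picked up from
# $HBASE_HOME_DIR/lib/native/Linux-amd64-64, you can point the JVM at
# them explicitly (assumed location shown):
export HBASE_LIBRARY_PATH=/usr/local/lib
```

A clean run prints `SUCCESS`; an `UnsatisfiedLinkError` like the one above usually means the JVM found libhadoop.so but could not resolve the native Snappy symbols it was compiled against.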

It looks like a Snappy version mismatch. Where did you get libsnappy.so and libhadoop.so from?

I built Snappy with make, and took libhadoop.so from the Hadoop 1.2 distribution.

Are you sure there is no other (possibly older) libsnappy.so elsewhere on your system?
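One way to check for a stale or conflicting libsnappy copy, as the last comment suggests, is to ask the dynamic linker directly. This is a sketch; the library paths searched are common defaults, not confirmed locations on the asker's cluster.

```shell
# List every libsnappy the dynamic linker's cache knows about; more
# than one entry, or an unexpected version, hints at a stale copy.
ldconfig -p | grep libsnappy

# See which Snappy the Hadoop native library actually resolves to at
# load time (path per the question's layout).
ldd "$HBASE_HOME_DIR/lib/native/Linux-amd64-64/libhadoop.so" | grep snappy

# Search common install prefixes for leftover builds of the library.
find /usr/lib /usr/local/lib -name 'libsnappy.so*' 2>/dev/null
```

If `ldd` reports `not found` for snappy, or resolves to a different file than the one copied into the HBase native directory, that mismatch would explain the `UnsatisfiedLinkError`.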