
HBase: Failed to identify the fs of dir hdfs://test/apps/hbase/data/lib, ignored java.io.IOException

Tags: java, maven, hadoop, jdbc, hbase

I am able to connect to HBase from Java code and insert rows without any errors, but after packaging the project with its Maven dependencies I get the following error:

org.apache.hadoop.hbase.util.DynamicClassLoader - Failed to identify the fs of dir hdfs://test/apps/hbase/data/lib, ignored java.io.IOException: No FileSystem for scheme: hdfs

I am using this jar, built with Maven, inside another Spring project.

Find the full log below:

09:16:56,920 [http-nio-8080-exec-4] WARN  org.apache.hadoop.util.NativeCodeLoader  - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
        09:16:56,982 [http-nio-8080-exec-4] ERROR org.apache.hadoop.util.Shell  - Failed to locate the winutils binary in the hadoop binary path
        java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop binaries.
                at org.apache.hadoop.util.Shell.getQualifiedBinPath(Shell.java:355)
                at org.apache.hadoop.util.Shell.getWinUtilsPath(Shell.java:370)
                at org.apache.hadoop.util.Shell.<clinit>(Shell.java:363)
                at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:78)
                at org.apache.hadoop.security.Groups.parseStaticMapping(Groups.java:93)
                at org.apache.hadoop.security.Groups.<init>(Groups.java:77)
                at org.apache.hadoop.security.Groups.getUserToGroupsMappingService(Groups.java:240)
                at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:257)
                at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:234)
                at org.apache.hadoop.security.UserGroupInformation.loginUserFromSubject(UserGroupInformation.java:749)
                at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:734)...........................
.....at java.lang.Thread.run(Thread.java:748)
        09:16:58,733 [http-nio-8080-exec-4] WARN  org.apache.hadoop.hbase.util.DynamicClassLoader  - Failed to identify the fs of dir hdfs://test/apps/hbase/data/lib, ignored
        java.io.IOException: No FileSystem for scheme: hdfs
                at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2579)
                at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2586)
                at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:89)
                at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2625)
                at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2607)
                at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:368)
                at org.apache.hadoop.fs.Path.getFileSystem(Path.java:296)
                at org.apache.hadoop.hbase.util.DynamicClassLoader.initTempDir(DynamicClassLoader.java:118)
                at org.apache.hadoop.hbase.util.DynamicClassLoader.<init>(DynamicClassLoader.java:98)
                at org.apache.hadoop.hbase.protobuf.ProtobufUtil.<clinit>(ProtobufUtil.java:241)
                at org.apache.hadoop.hbase.ClusterId.parseFrom(ClusterId.java:64)
                at org.apache.hadoop.hbase.zookeeper.ZKClusterId.readClusterIdZNode(ZKClusterId.java:75)
                at org.apache.hadoop.hbase.client.ZooKeeperRegistry.getClusterId(ZooKeeperRegistry.java:105)
                at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.retrieveClusterId(ConnectionManager.java:879)
                at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.<init>(ConnectionManager.java:635)
                at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
                at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
                at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
                at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
                at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:238)
                at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:218)
                at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:119)...................
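The first stack trace above (Could not locate executable null\bin\winutils.exe) is a separate, Windows-only issue: on Windows the Hadoop client looks for winutils.exe under %HADOOP_HOME%\bin. A minimal workaround sketch, assuming winutils.exe has been downloaded locally (the path below is hypothetical):

// Hypothetical path; the directory must contain bin\winutils.exe.
System.setProperty("hadoop.home.dir", "C:\\hadoop");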
I added the following configuration files:

core-site.xml
hadoop-env.sh
hbase-env.sh
hbase-policy.xml
hdfs-site.xml
hbase-site.xml

If I run the Java code above with all of these files in place, it works fine (no errors), but if I build it as a Maven jar it throws the exception mentioned above. Am I missing some configuration in Maven or in the resource files above?
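"No FileSystem for scheme: hdfs" usually means that no FileSystem implementation is registered for the hdfs:// scheme at runtime: either hadoop-hdfs is missing from the classpath, or the META-INF/services/org.apache.hadoop.fs.FileSystem service files from hadoop-common and hadoop-hdfs overwrote each other when the jar was assembled. A minimal in-code workaround sketch, assuming the hadoop-hdfs classes are on the classpath:

// Register the implementations explicitly, in case the ServiceLoader
// entries were lost during jar assembly.
conf.set("fs.hdfs.impl", org.apache.hadoop.hdfs.DistributedFileSystem.class.getName());
conf.set("fs.file.impl", org.apache.hadoop.fs.LocalFileSystem.class.getName());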

I added the Hadoop JARs and then the problem was solved.
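A minimal sketch of the Maven side, on the assumption that the missing pieces were the HDFS client classes (version numbers below are placeholders; match them to the cluster):

<!-- Placeholder versions; align with your Hadoop/HBase cluster. -->
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-client</artifactId>
    <version>2.7.3</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-hdfs</artifactId>
    <version>2.7.3</version>
</dependency>

If the jar is assembled with the maven-shade-plugin, the ServicesResourceTransformer should also be enabled so that the META-INF/services/org.apache.hadoop.fs.FileSystem entries from hadoop-common and hadoop-hdfs are merged instead of overwritten:

<transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>

With the dependencies in place, the original code below works unchanged: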

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.Table;

// Holds a single shared HBase connection, created once in a static initializer.
public class HbaseConnectionHolder {

    public static Connection connection = null;
    public static Configuration conf = null;
    public static Table table = null;

    static {
        System.out.println("------------HBaseConfiguration.create()");
        // Loads hbase-site.xml / hbase-default.xml from the classpath.
        conf = HBaseConfiguration.create();
        System.out.println("------------configuration");
        conf.set("hbase.zookeeper.quorum", "<test1.cloud>:2080,<test2.cloud>:2181,<test3.cloud>:2181");
        conf.set("hbase.zookeeper.property.clientPort", "2080");
        conf.set("hbase.cluster.distributed", "true");
        conf.set("zookeeper.znode.parent", "/hbase-unsecure");
        try {
            System.out.println("------------connection");
            connection = ConnectionFactory.createConnection(conf);
            System.out.println("------------table");
            table = connection.getTable(TableName.valueOf("test"));
        } catch (IOException e) {
            // NOTE: the exception is only printed, so connection/table stay null on failure.
            e.printStackTrace();
        }
    }

    public static Connection getHbaseConnection() {
        return connection;
    }

    public static Table getHbaseTableInstance() {
        return table;
    }
}
// Writes one row to HBase; DelegateExecution comes from the workflow engine
// (e.g. Activiti/Camunda) that invokes this delegate.
public void execute(DelegateExecution execution) {
    try {
        Put put = new Put(Bytes.toBytes("basic_id/123420"));
        put.addColumn(Bytes.toBytes("det"), Bytes.toBytes("name"), Bytes.toBytes(""));
        HbaseConnectionHolder.getHbaseTableInstance().put(put);
    } catch (IOException e) {
        e.printStackTrace();
    }
}