
Uploading a File to Hadoop Using the Java API


I am unable to connect to HDFS through Java:

import java.io.IOException;
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class App 
{
    public static void main( String[] args ) throws IOException
    {
        System.out.println( "Hello World!" );
        System.out.println("---143---");
        // Local file to upload and the destination directory in HDFS.
        String localPath = "/home/user1/Documents/hdfspract.txt";
        String uri = "hdfs://172.16.32.139:9000";
        String hdfsDir = "hdfs://172.16.32.139:9000/fifo_tbl";

        // Connect to the NameNode, then copy the local file up to HDFS.
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(URI.create(uri), conf);

        fs.copyFromLocalFile(new Path(localPath), new Path(hdfsDir));
    }
}

When I try to execute the above code, it gives me the following error:

WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Exception in thread "main" org.apache.hadoop.fs.UnsupportedFileSystemException: No FileSystem for scheme "hdfs"
    at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:3332)
    at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:3352)
    at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:124)
    at com.Jambo.App.main(App.java:21)
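This exception generally means that no FileSystem implementation is registered for the hdfs:// scheme on the classpath: hadoop-common by itself only provides the local file systems, while the HDFS client lives in the separate hadoop-hdfs artifact (more on the Maven side below). A workaround that is often suggested, sketched here under the assumption that the HDFS jars really are present at runtime (the class name HdfsSchemeCheck is made up for illustration), is to map the scheme onto its implementation class explicitly, which also compensates for a META-INF/services registration lost when building a shaded jar:

import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;

public class HdfsSchemeCheck {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Explicitly map the "hdfs" scheme to its implementation class.
        // This only helps if hadoop-hdfs is on the runtime classpath; it
        // substitutes for a missing or overwritten service-loader entry.
        conf.set("fs.hdfs.impl", "org.apache.hadoop.hdfs.DistributedFileSystem");
        try (FileSystem fs = FileSystem.get(URI.create("hdfs://172.16.32.139:9000"), conf)) {
            System.out.println("Connected to " + fs.getUri());
        }
    }
}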

Any other way of uploading a file into Hadoop with the Java API would also be appreciated. Thanks.
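For what it's worth, one alternative upload path once the connection works is to stream the bytes yourself through FileSystem.create() and IOUtils.copyBytes() instead of copyFromLocalFile(). The sketch below reuses the paths and URI from the question; the class name StreamUpload and the target file name inside /fifo_tbl are illustrative assumptions:

import java.io.BufferedInputStream;
import java.io.FileInputStream;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;

public class StreamUpload {
    public static void main(String[] args) throws Exception {
        String localPath = "/home/user1/Documents/hdfspract.txt";
        String uri = "hdfs://172.16.32.139:9000";

        Configuration conf = new Configuration();
        try (FileSystem fs = FileSystem.get(URI.create(uri), conf);
             InputStream in = new BufferedInputStream(new FileInputStream(localPath));
             OutputStream out = fs.create(new Path("/fifo_tbl/hdfspract.txt"))) {
            // Copy with a 4 KB buffer; try-with-resources closes everything.
            IOUtils.copyBytes(in, out, 4096, false);
        }
    }
}

Streaming through create() is also the natural shape when the source is not a local file at all (a socket, an HTTP response, generated data), which copyFromLocalFile() cannot handle.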

<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>3.1.0</version>
</dependency>

<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-yarn-common</artifactId>
    <version>2.9.0</version>
</dependency>

<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-mapreduce-client-common</artifactId>
    <version>2.9.0</version>
</dependency>

<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-mapreduce-client-core</artifactId>
    <version>2.9.0</version>
</dependency>

<dependency>
    <groupId>jdk.tools</groupId>
    <artifactId>jdk.tools</artifactId>
    <version>1.8.0_161</version>
    <scope>system</scope>
    <systemPath>/usr/local/jdk1.8.0_161/lib/tools.jar</systemPath>
</dependency>

<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-core</artifactId>
    <version>1.2.1</version>
</dependency>
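Note that none of the artifacts above contains the HDFS client itself, which is exactly what produces the "No FileSystem for scheme hdfs" error: hadoop-common registers only the local file systems. Assuming you stay on Hadoop 3.1.0, adding the matching hadoop-hdfs artifact should supply the missing implementation; it is also safer to keep every Hadoop artifact on the same version instead of mixing 3.1.0, 2.9.0, and the long-obsolete hadoop-core 1.2.1:

<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-hdfs</artifactId>
    <version>3.1.0</version>
</dependency>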


I strongly suggest you edit the question to include the dependencies inline rather than as a linked image. That makes them easier for people to read and makes your question more likely to be answered. It also improves on the JamesJones Maven dependencies included in the question above.