
Maven artifactId for hadoop-core with Hadoop 2.2.0


I am migrating my application from Hadoop 1.0.3 to Hadoop 2.2.0, and the Maven build has hadoop-core marked as a dependency. Since hadoop-core does not exist for Hadoop 2.2.0, I tried replacing it with hadoop-client and hadoop-common, but I am still getting this error for ant.filter. Can anybody suggest which artifact to use?

Previous config:
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-core</artifactId>
    <version>1.0.3</version>
</dependency>

New config:

<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-client</artifactId>
    <version>2.2.0</version>
</dependency>

Try these artifacts; they are working fine in my sample project:

<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>2.2.0</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-core</artifactId>
    <version>1.2.1</version>
</dependency>

Our application mainly depends on the HDFS APIs. When we migrated to Hadoop 2.X, we were surprised to see the changes in dependencies, so we started adding them one at a time. Today we depend on the following core libraries:

hadoop-annotations-2.2.0
hadoop-auth-2.2.0
hadoop-common-2.2.0
hadoop-hdfs-2.2.0
hadoop-mapreduce-client-core-2.2.0

Apart from these, we also depend on test libraries. Depending on your needs, you may want to include hadoop-hdfs and hadoop-mapreduce-client in the dependencies along with hadoop-common.
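Put together, a minimal dependency section along those lines for an HDFS-centric Hadoop 2.2.0 build might look like this (a sketch; trim it to the artifacts your code actually touches):

```xml
<!-- Sketch: minimal Hadoop 2.2.0 dependencies for an HDFS-centric application -->
<dependencies>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-common</artifactId>
        <version>2.2.0</version>
    </dependency>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-hdfs</artifactId>
        <version>2.2.0</version>
    </dependency>
    <!-- only needed if you also submit MapReduce jobs -->
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-mapreduce-client-core</artifactId>
        <version>2.2.0</version>
    </dependency>
</dependencies>
```

hadoop-annotations and hadoop-auth normally arrive transitively through hadoop-common, so they rarely need to be declared explicitly.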

The Maven dependencies are available from the link. As far as the hadoop-core dependency is concerned, hadoop-core was the Hadoop 1.X name, and simply bumping its version to 2.X will not help. Moreover, using a Hadoop 1.X dependency in a Hadoop 2.X project gives an error like:

Caused by: org.apache.hadoop.ipc.RemoteException: Server IPC version 9 cannot communicate with client version 4

(IPC version 9 is what Hadoop 2.X servers speak, while a 1.X client still speaks version 4, so the two cannot talk to each other.)

So it is advised not to use it. I have used the following dependencies in my Hadoop projects:

<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-client</artifactId>
    <version>2.7.1</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-hdfs</artifactId>
    <version>2.7.1</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-mapreduce-client-core</artifactId>
    <version>2.7.1</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-mapreduce-client-jobclient</artifactId>
    <version>2.7.1</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-mapreduce-client-common</artifactId>
    <version>2.7.1</version>
</dependency>
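If some other library in your build still drags the old hadoop-core 1.x jar in transitively, you can keep it off the classpath with a Maven exclusion. This is a sketch; the groupId/artifactId of the offending library below are placeholders:

```xml
<!-- Placeholder third-party library that was built against Hadoop 1.x -->
<dependency>
    <groupId>com.example</groupId>
    <artifactId>legacy-hadoop1-lib</artifactId>
    <version>1.0</version>
    <exclusions>
        <!-- prevent the Hadoop 1.x client jar from leaking into a 2.x project -->
        <exclusion>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-core</artifactId>
        </exclusion>
    </exclusions>
</dependency>
```

`mvn dependency:tree` will show whether hadoop-core still appears anywhere in the resolved tree.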

You can try these.

Thanks, but will version 1.2.1 of hadoop-core work? I am trying to migrate to 2.2.0. There was also an old artifact in the new-config section of my question, which I have corrected now. hadoop-core and hadoop-common do not follow the same version numbering.
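Since the individual Hadoop 2.x artifacts must all stay on the same version (while hadoop-core's 1.x numbering is unrelated), one way to keep them aligned is a shared version property in the POM (a sketch):

```xml
<properties>
    <hadoop.version>2.2.0</hadoop.version>
</properties>

<dependencies>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-common</artifactId>
        <version>${hadoop.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-hdfs</artifactId>
        <version>${hadoop.version}</version>
    </dependency>
</dependencies>
```

Upgrading later (e.g. to 2.7.1) then means changing one property instead of every `<version>` element.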