
MongoDB Hadoop connector cannot query a Mongo-backed Hive table


I am using the MongoDB Hadoop connector to query MongoDB through a Hive table in Hadoop.

I can execute

select * from mongoDBTestHiveTable;
but when I try to execute the following query

select id from mongoDBTestHiveTable;
it throws the exception below.

The relevant connector classes are present in the Hive lib folder.

Exception stack trace:

    Diagnostic Messages for this Task:
Error: java.io.IOException: Cannot create an instance of InputSplit class = com.mongodb.hadoop.hive.input.HiveMongoInputFormat$MongoHiveInputSplit:Class com.mongodb.hadoop.hive.input.HiveMongoInputFormat$MongoHiveInputSplit not found
    at org.apache.hadoop.hive.ql.io.HiveInputFormat$HiveInputSplit.readFields(HiveInputFormat.java:147)
    at org.apache.hadoop.io.serializer.WritableSerialization$WritableDeserializer.deserialize(WritableSerialization.java:71)
    at org.apache.hadoop.io.serializer.WritableSerialization$WritableDeserializer.deserialize(WritableSerialization.java:42)
    at org.apache.hadoop.mapred.MapTask.getSplitDetails(MapTask.java:370)
    at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:402)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:162)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:157)
Caused by: java.lang.ClassNotFoundException: Class com.mongodb.hadoop.hive.input.HiveMongoInputFormat$MongoHiveInputSplit not found
    at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:1626)
    at org.apache.hadoop.hive.ql.io.HiveInputFormat$HiveInputSplit.readFields(HiveInputFormat.java:144)
    ... 10 more

Container killed by the ApplicationMaster.

Please advise.
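For context, a Mongo-backed Hive table of this kind is typically declared with the connector's `MongoStorageHandler`. The asker's actual DDL is not shown, so the table name, columns, column mapping, and Mongo URI below are illustrative:

```sql
-- Hypothetical definition of the table being queried; names and URI are assumptions.
CREATE EXTERNAL TABLE mongoDBTestHiveTable (
  id   STRING,
  name STRING
)
STORED BY 'com.mongodb.hadoop.hive.MongoStorageHandler'
WITH SERDEPROPERTIES (
  'mongo.columns.mapping' = '{"id":"_id","name":"name"}'
)
TBLPROPERTIES (
  'mongo.uri' = 'mongodb://localhost:27017/test.testCollection'
);
```

A `select *` on such a table can succeed without launching a MapReduce job (Hive reads the splits directly), whereas `select id` compiles to a MapReduce job, which is why the missing class only surfaces on the second query.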

You also need to add the mongo-hadoop-* and MongoDB driver JARs to the MR1/MR2 classpath on all the workers.
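A minimal sketch of that fix, assuming a mongo-hadoop 1.3.0 build and default `$HIVE_HOME`/`$HADOOP_HOME` layouts (all paths and versions below are illustrative and should be adjusted to your cluster):

```shell
# 1) Make the connector and driver JARs visible to Hive itself:
cp mongo-hadoop-core-1.3.0.jar \
   mongo-hadoop-hive-1.3.0.jar \
   mongo-java-driver-2.12.2.jar \
   "$HIVE_HOME/lib/"

# 2) Make the same JARs visible to the MapReduce tasks on EVERY worker
#    node, e.g. by copying them into the Hadoop common lib directory:
cp mongo-hadoop-core-1.3.0.jar \
   mongo-hadoop-hive-1.3.0.jar \
   mongo-java-driver-2.12.2.jar \
   "$HADOOP_HOME/share/hadoop/common/lib/"
```

Alternatively, the JARs can be registered per session from the Hive CLI with `ADD JAR /path/to/mongo-hadoop-hive-1.3.0.jar;`, which ships them to the task nodes for that job. The ClassNotFoundException in the trace occurs on the worker side, so placing the JARs only on the Hive client machine is not enough.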

Yes, those JARs are present in the Hadoop installation directory and on the Hive classpath. I used the 1.3.0 JARs with Hadoop 2.2 and it worked fine. Did you ever find the cause of this issue?