Hadoop Hive: NullPointerException when selecting a column from a table


My table lives in an S3 bucket and has six columns. When I simply select a single column, Hive throws a NullPointerException:

Select user from network_log limit 5;
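Before digging into the job failure, a couple of quick checks can narrow things down (a sketch, assuming the same `network_log` table): `DESCRIBE` only consults the metastore, and an unfiltered `SELECT *` with a `LIMIT` can often be served by a local fetch task, so if both succeed while the single-column query fails, the problem is in the MapReduce read path rather than in the table definition.

```sql
-- Diagnostic sketch: neither statement should launch the failing MapReduce job.
DESCRIBE network_log;                  -- metastore only: confirms the schema is intact
SELECT * FROM network_log LIMIT 5;     -- may run as a local fetch task, bypassing map tasks
```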
Running that query gives the following error:

Error during job, obtaining debugging information...
Job Tracking URL: http://ip-10-0-10-16.ap-southeast-1.compute.internal:50030/jobdetails.jsp?jobid=job_201407140633_22716
Examining task ID: task_201407140633_22716_m_000094 (and more) from job job_201407140633_22716
Examining task ID: task_201407140633_22716_m_000050 (and more) from job job_201407140633_22716
Examining task ID: task_201407140633_22716_m_000034 (and more) from job job_201407140633_22716
Examining task ID: task_201407140633_22716_m_000000 (and more) from job job_201407140633_22716
Examining task ID: task_201407140633_22716_m_000012 (and more) from job job_201407140633_22716
Examining task ID: task_201407140633_22716_m_000023 (and more) from job job_201407140633_22716
Examining task ID: task_201407140633_22716_m_000038 (and more) from job job_201407140633_22716
Examining task ID: task_201407140633_22716_m_000047 (and more) from job job_201407140633_22716
Examining task ID: task_201407140633_22716_m_000052 (and more) from job job_201407140633_22716
Examining task ID: task_201407140633_22716_m_000063 (and more) from job job_201407140633_22716
Task with the most failures(4):
Task ID:
  task_201407140633_22716_m_000034

URL:

Diagnostic Messages for this Task:

java.lang.NullPointerException
        at org.apache.hadoop.fs.s3native.NativeS3FileSystem$NativeS3FsInputStream.close(NativeS3FileSystem.java:147)
        at java.io.BufferedInputStream.close(BufferedInputStream.java:451)
        at java.io.FilterInputStream.close(FilterInputStream.java:155)
        at org.apache.hadoop.io.IOUtils.cleanup(IOUtils.java:237)
        at org.apache.hadoop.io.IOUtils.closeStream(IOUtils.java:254)
        at org.apache.hadoop.hive.ql.io.RCFile$Reader.close(RCFile.java:1754)
        at org.apache.hadoop.hive.ql.io.RCFileRecordReader.close(RCFileRecordReader.java:145)
        at org.apache.hadoop.hive.ql.io.CombineHiveRecordReader.doClose(CombineHiveRecordReader.java:72)
        at org.apache.hadoop.hive.ql.io.HiveContextAwareRecordReader.close(HiveContextAwareRecordReader.java:96)
        at org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileRecordReader.initNextRecordReader(HadoopShimsSecure.java:344)
        at org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileRecordReader.next(HadoopShimsSecure.java:251)
        at org.apache.hadoop.mapred.MapTas

FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.MapRedTask
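The trace shows the NullPointerException is thrown in `NativeS3FsInputStream.close()` while `CombineHiveRecordReader` tears down an RCFile split, so the failure is in the S3 read path, not in the query itself. One hedged workaround sketch (the property and class below are standard Hive settings, but whether this avoids the bug on this particular cluster is an assumption) is to disable split combining so that close path is not exercised:

```sql
-- Workaround sketch: fall back to plain HiveInputFormat so Hive does not use
-- CombineFileRecordReader, whose initNextRecordReader/close path appears in
-- the stack trace above. Effectiveness here is an assumption, not a confirmed fix.
SET hive.input.format=org.apache.hadoop.hive.ql.io.HiveInputFormat;
SELECT user FROM network_log LIMIT 5;
```

If the query then succeeds, the failure is specific to closing combined splits over the S3 native filesystem; copying the data into HDFS, or updating the Hadoop S3 client, would be reasonable next steps to try.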