Hive 1.1.0 shows an error when inserting data into a created table (using Hadoop 2.5.1)

I successfully created the table with the following commands:

CREATE TABLE movie_example (
    title STRING, id BIGINT, director STRING,
    year BIGINT, genres ARRAY<STRING>)
ROW FORMAT DELIMITED
    FIELDS TERMINATED BY ','
    COLLECTION ITEMS TERMINATED BY '$'
    MAP KEYS TERMINATED BY '#'
    LINES TERMINATED BY '\n'
STORED AS TEXTFILE;

LOAD DATA LOCAL INPATH '/<path>/hiveExample.txt'
   OVERWRITE INTO TABLE movie_example;
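For context, each row of `hiveExample.txt` has to match the delimiters declared in the DDL: fields separated by `,`, `ARRAY<STRING>` items separated by `$`, and rows separated by `\n`. A minimal sketch (with made-up movie data, purely for illustration) that writes such a file:

```python
# Sketch: write a sample hiveExample.txt matching the table's delimiters.
# The movie rows below are hypothetical examples, not from the question.
rows = [
    ("The Shining", 1, "Stanley Kubrick", 1980, ["Horror", "Drama"]),
    ("Alien", 2, "Ridley Scott", 1979, ["Horror", "Sci-Fi"]),
]

with open("hiveExample.txt", "w") as f:
    for title, movie_id, director, year, genres in rows:
        # FIELDS TERMINATED BY ',', COLLECTION ITEMS TERMINATED BY '$'
        f.write(",".join([title, str(movie_id), director, str(year),
                          "$".join(genres)]) + "\n")
```

Any line whose field count or delimiters do not match the DDL will load as NULLs rather than fail, so it is worth checking the file format before blaming `LOAD DATA`.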
When I try to insert data into the table with the following command:

LOAD DATA LOCAL INPATH '/<path>/hiveExample.txt' 
   OVERWRITE INTO TABLE movie_example;
it throws the following error:

 java.lang.NoSuchMethodError: org.apache.hadoop.hdfs.DFSClient.getKeyProvider()Lorg/apache/hadoop/crypto/key/KeyProvider;
    at org.apache.hadoop.hive.shims.Hadoop23Shims$HdfsEncryptionShim.<init>(Hadoop23Shims.java:1152)
    at org.apache.hadoop.hive.shims.Hadoop23Shims.createHdfsEncryptionShim(Hadoop23Shims.java:1279)
    at org.apache.hadoop.hive.ql.session.SessionState.getHdfsEncryptionShim(SessionState.java:392)
    at org.apache.hadoop.hive.ql.metadata.Hive.moveFile(Hive.java:2418)
    at org.apache.hadoop.hive.ql.metadata.Hive.replaceFiles(Hive.java:2747)
    at org.apache.hadoop.hive.ql.metadata.Table.replaceFiles(Table.java:640)
    at org.apache.hadoop.hive.ql.metadata.Hive.loadTable(Hive.java:1582)
    at org.apache.hadoop.hive.ql.exec.MoveTask.execute(MoveTask.java:297)
    at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:160)

When I searched online, this appears to be a known bug.


Is there a way to resolve this error?

Upgrading Hadoop to 2.6 should be a workaround. Follow the JIRA for details.
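The root cause is a version mismatch: `DFSClient.getKeyProvider()` arrived with HDFS transparent encryption in Hadoop 2.6, but Hive 1.1.0's `Hadoop23Shims` calls it when setting up its `HdfsEncryptionShim`, so running against Hadoop 2.5.1 raises the `NoSuchMethodError`. A hypothetical helper (not part of Hive; just sketching the compatibility rule) that checks the first line of `hadoop version` output:

```python
import re

def hadoop_version(version_line):
    """Parse the first line of `hadoop version` output, e.g. 'Hadoop 2.5.1'."""
    m = re.match(r"Hadoop (\d+)\.(\d+)", version_line)
    if m is None:
        raise ValueError("unrecognized version line: %r" % version_line)
    return int(m.group(1)), int(m.group(2))

def has_hdfs_encryption_api(version_line):
    """DFSClient.getKeyProvider() shipped with HDFS transparent encryption
    in Hadoop 2.6, which Hive 1.1.0's shims depend on."""
    return hadoop_version(version_line) >= (2, 6)
```

By this rule, Hadoop 2.5.1 fails the check and Hadoop 2.6.0 or later passes, which is consistent with the upgrade-to-2.6 workaround above.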