Java NoSuchMethodError: JobConf.getCredentials()

I am trying to import a table from MySQL into Hive. I have verified that my connection to MySQL works, and all of the table data is packaged into the jar; after that I get the error below. MySQL runs on Windows 8, while Hadoop and Hive run inside VirtualBox as a Hortonworks Sandbox.

NoSuchMethodError for org.apache.hadoop.mapred.JobConf.getCredentials()

Error:

  INFO: Initializing JVM Metrics with processName=JobTracker, sessionId=
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.hadoop.mapred.JobConf.getCredentials()Lorg/apache/hadoop/security/Credentials;
    at org.apache.sqoop.mapreduce.db.DBConfiguration.setPassword(DBConfiguration.java:158)
    at org.apache.sqoop.mapreduce.db.DBConfiguration.configureDB(DBConfiguration.java:144)
    at org.apache.sqoop.mapreduce.DataDrivenImportJob.configureInputFormat(DataDrivenImportJob.java:171)
    at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:231)
    at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:600)
    at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:413)
    at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:502)
    at SqoopJavaInterface.importToHive(SqoopJavaInterface.java:66)
    at SqoopJavaInterface.main(SqoopJavaInterface.java:32)
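The NoSuchMethodError means that the JobConf class actually loaded at runtime was compiled without getCredentials(). A quick way to confirm which jar JobConf comes from and whether it has that method is a small probe like the one below (this is not part of the original post; the class name JobConfProbe is just for illustration):

import org.apache.hadoop.mapred.JobConf;

public class JobConfProbe {
    public static void main(String[] args) {
        // Print the jar (or directory) the JVM actually loaded JobConf from.
        System.out.println("JobConf loaded from: "
                + JobConf.class.getProtectionDomain().getCodeSource().getLocation());
        try {
            // Sqoop's DBConfiguration.setPassword() expects this method to exist;
            // if it is missing, the jar on the classpath predates the Credentials API.
            JobConf.class.getMethod("getCredentials");
            System.out.println("getCredentials() is present");
        } catch (NoSuchMethodException e) {
            System.out.println("getCredentials() is missing - old hadoop-core jar on the classpath");
        }
    }
}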
Here is my source code:

import java.io.IOException;    
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.sqoop.tool.ImportTool;

import com.cloudera.sqoop.SqoopOptions;

public class SqoopJavaInterface {
    private static final String JOB_NAME = "Sqoop Hive Job";
    private static final String MAPREDUCE_JOB = "Hive Map Reduce Job";
    private static final String DBURL = "jdbc:mysql://localhost:3316/db";
    private static final String DRIVER = "com.mysql.jdbc.Driver";
    private static final String USERNAME = "user";
    private static final String PASSWORD = "password";
    private static final String HADOOP_HOME = "/home/master/apps/hadoop-1.0.4";
    private static final String JAR_OUTPUT_DIR = "/home/master/data";
    private static final String HIVE_HOME = "/home/master/apps/hive-0.10.0";
    private static final String HIVE_DIR = "/user/hive/warehouse/";
    private static final String WAREHOUSE_DIR = "hdfs://localhost:9000/user/hive/warehouse/student";
    private static final String SUCCESS = "SUCCESS !!!";
    private static final String FAIL = "FAIL !!!";

    /**
     * @param table
     * @throws IOException
     */
    public static void main(String args[]) throws IOException {
        importToHive("some_table");
    }

    public static void importToHive(String table) throws IOException {
        System.out.println("SqoopOptions loading .....");
        Configuration config = new Configuration();
        // Hive connection parameters
        config.addResource(new Path(HADOOP_HOME + "/conf/core-site.xml"));
        config.addResource(new Path(HADOOP_HOME + "/conf/hdfs-site.xml"));
        config.addResource(new Path(HIVE_HOME + "/conf/hive-site.xml"));
        FileSystem dfs = FileSystem.get(config);
        /* MySQL connection parameters */
        SqoopOptions options = new SqoopOptions(config);
        options.setConnectString(DBURL);
        options.setTableName(table);
        options.setDriverClassName(DRIVER);
        options.setUsername(USERNAME);
        options.setPassword(PASSWORD);
        options.setHadoopMapRedHome(HADOOP_HOME);
        options.setHiveHome(HIVE_HOME);
        options.setHiveImport(true);
        options.setHiveTableName(table);
        options.setOverwriteHiveTable(true);
        options.setFailIfHiveTableExists(false);
        options.setFieldsTerminatedBy(',');
        options.setOverwriteHiveTable(true);
        options.setDirectMode(true);
        options.setNumMappers(1); // No. of mappers to be launched for the job
        options.setWarehouseDir(WAREHOUSE_DIR);
        options.setJobName(JOB_NAME);
        options.setMapreduceJobName(MAPREDUCE_JOB);
        options.setTableName(table);
        options.setJarOutputDir(JAR_OUTPUT_DIR);
        System.out.println("Import Tool running ....");
        ImportTool it = new ImportTool();
        int retVal = it.run(options);
        if (retVal == 0) {
            System.out.println(SUCCESS);
        } else {
            System.out.println(FAIL);
        }
    }
}

Luckily, I found the answer to my own problem. I had to include a different jar file in the build path instead of hadoop-0.20.2-core.jar. It looks like a modified version of the same file, one whose JobConf class does contain the getCredentials() method.

The problem is solved, but I am still confused about the two versions. Does anyone know the actual difference?
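As far as I know, getCredentials() was added to JobConf as part of the Hadoop security work (the 0.20.203+/1.x line), so the plain hadoop-0.20.2-core.jar would not have it. One way to compare two hadoop-core jars directly, without touching the build path, is to load each one in an isolated class loader and check for the method. This is only a sketch of mine, not from the original thread; the class name JarProbe and the command-line argument are illustrative, and it assumes the types referenced by JobConf live in the probed jar or the JDK (which holds for hadoop-core):

import java.io.File;
import java.net.URL;
import java.net.URLClassLoader;

public class JarProbe {
    public static void main(String[] args) throws Exception {
        // Pass the path of a hadoop-core jar as the first argument, e.g.
        //   java JarProbe /path/to/hadoop-core-1.0.4.jar
        File jar = new File(args[0]);
        // Parent class loader is null so the class is resolved only from this jar,
        // not from whatever hadoop-core happens to be on the application classpath.
        URLClassLoader loader =
                new URLClassLoader(new URL[] { jar.toURI().toURL() }, null);
        try {
            Class<?> jobConf = loader.loadClass("org.apache.hadoop.mapred.JobConf");
            try {
                jobConf.getMethod("getCredentials");
                System.out.println(jar.getName() + ": getCredentials() present");
            } catch (NoSuchMethodException e) {
                System.out.println(jar.getName() + ": getCredentials() missing");
            }
        } finally {
            loader.close();
        }
    }
}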