My Hadoop 2.6.5 HA: java.lang.StackOverflowError when importing data from MySQL


In my Hadoop 2.6.5 HA cluster, when I import data from MySQL with Sqoop 1.4.6, I get the following error.

OS: CentOS 6.5

Has anyone faced the same problem? Did I miss something somewhere?

[hadoop@dns app]$ sqoop import --connect jdbc:mysql://localhost:3306/hive --username hive --password hive --table DBS --m 1 --target-dir /user/test3
Warning: /home/hadoop/app/sqoop/../hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /home/hadoop/app/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
17/09/01 16:54:11 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6
17/09/01 16:54:11 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
17/09/01 16:54:11 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
17/09/01 16:54:11 INFO tool.CodeGenTool: Beginning code generation
17/09/01 16:54:11 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `DBS` AS t LIMIT 1
17/09/01 16:54:11 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `DBS` AS t LIMIT 1
17/09/01 16:54:11 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /home/hadoop/app/hadoop
Note: /tmp/sqoop-hadoop/compile/53650552d9b1969139bf57841c0c9aa1/DBS.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
17/09/01 16:54:14 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-hadoop/compile/53650552d9b1969139bf57841c0c9aa1/DBS.jar
17/09/01 16:54:14 WARN manager.MySQLManager: It looks like you are importing from mysql.
17/09/01 16:54:14 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
17/09/01 16:54:14 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
17/09/01 16:54:14 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
17/09/01 16:54:14 INFO mapreduce.ImportJobBase: Beginning import of DBS
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/hadoop/app/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/hadoop/app/hbase/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
17/09/01 16:54:14 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/09/01 16:54:14 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
17/09/01 16:54:14 WARN fs.FileSystem: "cluster1" is a deprecated filesystem name. Use "hdfs://cluster1/" instead.
17/09/01 16:54:20 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
17/09/01 16:54:20 WARN fs.FileSystem: "cluster1" is a deprecated filesystem name. Use "hdfs://cluster1/" instead.
Exception in thread "main" java.lang.StackOverflowError
        at org.apache.commons.collections.map.AbstractMapDecorator.containsKey(AbstractMapDecorator.java:83)
        at org.apache.hadoop.conf.Configuration.isDeprecated(Configuration.java:558)
        at org.apache.hadoop.conf.Configuration.handleDeprecation(Configuration.java:605)
        at org.apache.hadoop.conf.Configuration.get(Configuration.java:1185)
        at org.apache.hadoop.fs.FileContext.getFileContext(FileContext.java:470)
        at org.apache.hadoop.fs.FileContext.getFileContext(FileContext.java:444)
        at org.apache.hadoop.fs.FileContext.getFileContext(FileContext.java:470)
        at org.apache.hadoop.fs.FileContext.getFileContext(FileContext.java:444)
        at org.apache.hadoop.fs.FileContext.getFileContext(FileContext.java:470)
        at org.apache.hadoop.fs.FileContext.getFileContext(FileContext.java:444)
        at org.apache.hadoop.fs.FileContext.getFileContext(FileContext.java:470)
        at org.apache.hadoop.fs.FileContext.getFileContext(FileContext.java:444)
        ... (these two getFileContext frames, FileContext.java:470 and FileContext.java:444, keep repeating until the stack overflows) ...
[hadoop@dns app]$ 

The StackOverflowError occurs because the Java call stack keeps growing until the running JVM has no more stack space to hold further call frames.

Look carefully at the Java stack trace and note that there are these two methods:

public static FileContext getFileContext(final URI defaultFsUri, final Configuration aConf)
public static FileContext getFileContext(final Configuration aConf)
Under certain conditions the first getFileContext calls the second, and the second calls the first again. In other words, it is a recursive call that never terminates.
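
To make the loop in the trace concrete, here is a minimal, self-contained Java sketch of the same call pattern. It is not the actual Hadoop source, only an illustration of two overloads that delegate to each other whenever the configured default filesystem URI has no scheme:

import java.net.URI;
import java.util.HashMap;
import java.util.Map;

// Simplified sketch (not the real org.apache.hadoop.fs.FileContext) showing how
// two getFileContext overloads can recurse forever when fs.defaultFS has no
// scheme, e.g. "cluster1" instead of "hdfs://cluster1".
public class FileContextRecursionSketch {

    // Stand-in for org.apache.hadoop.conf.Configuration.
    static Map<String, String> conf = new HashMap<>();

    // Mirrors getFileContext(Configuration): read fs.defaultFS and delegate
    // to the URI-based overload.
    static void getFileContext() {
        URI defaultFsUri = URI.create(conf.get("fs.defaultFS"));
        getFileContext(defaultFsUri);
    }

    // Mirrors getFileContext(URI, Configuration): a URI without a scheme falls
    // back to the configuration-based overload, which builds the same
    // scheme-less URI again, so the two methods call each other until the
    // stack overflows.
    static void getFileContext(URI defaultFsUri) {
        if (defaultFsUri.getScheme() == null) {
            getFileContext();                          // infinite mutual recursion
            return;
        }
        System.out.println("Resolved filesystem: " + defaultFsUri);
    }

    public static void main(String[] args) {
        conf.put("fs.defaultFS", "cluster1");          // broken: no scheme
        // conf.put("fs.defaultFS", "hdfs://cluster1"); // fixed: recursion ends
        getFileContext();                              // throws java.lang.StackOverflowError
    }
}

Run with the scheme-less value, it throws java.lang.StackOverflowError just like the Sqoop job above; with hdfs://cluster1 it terminates normally.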


In your case, core-site.xml must currently contain the property fs.defaultFS with the value cluster1. It should instead be hdfs://cluster1. Once fs.defaultFS is set to the correct URL hdfs://cluster1, the recursive calls disappear.
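
For reference, the corrected entry in core-site.xml would look like the snippet below (the nameservice name cluster1 is taken from your log; adjust it to your own HA nameservice):

<!-- core-site.xml: fs.defaultFS must be a full URI, not a bare nameservice id -->
<property>
    <name>fs.defaultFS</name>
    <value>hdfs://cluster1</value>
</property>

After changing the value, rerun the sqoop import command and the recursion in FileContext.getFileContext should no longer occur.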

Thanks, after I changed fs.defaultFS to the value hdfs://cluster1, the problem was solved.