Sqoop error when Hadoop tries to connect

I am trying to run the following Sqoop command:

sqoop import --connect jdbc:mysql://localhost:3306/sunil_sqoop --table sqoop_emp --username root  --password 225dvrdlr)
However, I am getting the following error:

17/02/04 00:04:53 WARN security.UserGroupInformation: PriviledgedActionException as:avinash (auth:SIMPLE) cause:java.io.FileNotFoundException: File does not exist: hdfs://localhost:9000/home/avinash/sqoop-1.4.6.bin__hadoop-2.0.4-alpha/lib/slf4j-api-1.6.1.jar
17/02/04 00:04:53 ERROR tool.ImportTool: Encountered IOException running import job: java.io.FileNotFoundException: File does not exist: hdfs://localhost:9000/home/avinash/sqoop-1.4.6.bin__hadoop-2.0.4-alpha/lib/slf4j-api-1.6.1.jar
    at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1093)
    at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1085)
    at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1085)
    at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.getFileStatus(ClientDistributedCacheManager.java:288)
    at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.getFileStatus(ClientDistributedCacheManager.java:224)
    at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.determineTimestamps(ClientDistributedCacheManager.java:93)
    at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.determineTimestampsAndCacheVisibilities(ClientDistributedCacheManager.java:57)
    at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:267)
    at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:388)
    at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:481)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1295)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1292)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1642)
    at org.apache.hadoop.mapreduce.Job.submit(Job.java:1292)
    at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1313)
    at org.apache.sqoop.mapreduce.ImportJobBase.doSubmitJob(ImportJobBase.java:196)
    at org.apache.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:169)
    at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:266)
    at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:673)
    at org.apache.sqoop.manager.MySQLManager.importTable(MySQLManager.java:118)
    at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:497)
    at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
    at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
    at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
    at org.apache.sqoop.Sqoop.main(Sqoop.java:236)

What should I do?

The error is:

File does not exist: hdfs://localhost:9000/home/avinash/sqoop-1.4.6.bin__hadoop-2.0.4-alpha/lib/slf4j-api-1.6.1.jar 
You should copy the file slf4j-api-1.6.1.jar into this directory in HDFS:

/home/avinash/sqoop-1.4.6.bin__hadoop-2.0.4-alpha/lib/
Alternatively, you can copy the jar into the Oozie sharelib.
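The copy can be sketched with the HDFS shell. This is a hedged sketch, not a definitive fix: it assumes a running HDFS at hdfs://localhost:9000 and that the local Sqoop install lives at the same path as in the error message; adjust both to your machine.

```shell
# Path taken from the FileNotFoundException in the question;
# the same path is assumed to exist on the local filesystem.
SQOOP_LIB=/home/avinash/sqoop-1.4.6.bin__hadoop-2.0.4-alpha/lib

# Create the matching directory in HDFS (no-op if it already exists)
hdfs dfs -mkdir -p "$SQOOP_LIB"

# Copy the missing jar from the local Sqoop lib directory into HDFS
hdfs dfs -put "$SQOOP_LIB/slf4j-api-1.6.1.jar" "$SQOOP_LIB/"

# Verify the jar is now visible to the MapReduce job
hdfs dfs -ls "$SQOOP_LIB"
```

After the jar is in place, re-run the same sqoop import command; the distributed-cache lookup that threw the FileNotFoundException should then succeed.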