Hadoop: HCatalog with MapReduce
I get the following error when running a MapReduce program. I have already placed all the JARs in the hadoop/lib directory and also listed them with -libjars. This is the command I am running:
$HADOOP_HOME/bin/hadoop --config $HADOOP_HOME/conf jar /home/shash/distinct.jar HwordCount -libjars $LIB_JARS WordCount HWordCount2
java.lang.RuntimeException: java.lang.ClassNotFoundException: org.apache.hcatalog.mapreduce.HCatOutputFormat
    at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:996)
    at org.apache.hadoop.mapreduce.JobContext.getOutputFormatClass(JobContext.java:248)
    at org.apache.hadoop.mapred.Task.initialize(Task.java:501)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:306)
    at org.apache.hadoop.mapred.Child$4.run(Child.java:270)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1127)
    at org.apache.hadoop.mapred.Child.main(Child.java:264)
Caused by: java.lang.ClassNotFoundException: org.apache.hcatalog.mapreduce.HCatOutputFormat
    at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:423)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:356)
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:264)
    at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:943)
    at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:994)
    ... 8 more
Make sure LIB_JARS is a comma-separated list (not colon-separated like CLASSPATH).

Applies to: CDH 5.0.x, 5.1.x, 5.2.x, 5.3.x (Sqoop)

Cause: Sqoop cannot pick up the HCatalog libraries because Cloudera Manager does not set the Hive home environment variable; it has to be set manually. The issue is tracked in an upstream JIRA, and the fix for it is included in CDH as of release 5.4.0.

Workaround (CDH releases below 5.4.0): run one of the following commands in the shell before invoking Sqoop, or add it to /etc/sqoop/conf/sqoop-env.sh (create the file if it does not exist):
export HIVE_HOME=/opt/cloudera/parcels/CDH/lib/hive (for parcel installation)
export HIVE_HOME=/usr/lib/hive (for package installation)
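The comma-vs-colon point above is the part that is easiest to get wrong, since most Java tooling uses colon-separated paths. As a minimal sketch (the JAR paths here are illustrative, not taken from the original post), a CLASSPATH-style colon-separated string can be converted into the comma-separated form that -libjars expects with tr:

```shell
# -libjars expects a COMMA-separated list of JARs; CLASSPATH uses colons.
# The paths below are example values, not the poster's actual library paths.
CLASSPATH_STYLE="/usr/lib/hive/lib/hive-exec.jar:/usr/lib/hcatalog/share/hcatalog/hcatalog-core.jar"

# Convert every ':' to ',' to build a value suitable for -libjars.
LIB_JARS=$(echo "$CLASSPATH_STYLE" | tr ':' ',')

echo "$LIB_JARS"
```

The resulting $LIB_JARS can then be passed directly after -libjars in the hadoop jar command shown in the question.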