
Java: Running queries from AMPLab Shark to Cassandra on HDFS

Tags: java, hadoop, cassandra, apache-spark, shark-sql

Please help with AMPLab Shark querying Cassandra on HDFS. The query fails with:

Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/mapreduce/task/JobContextImpl
I can successfully run:

  • use database
  • show tables
  • etc.
but I cannot run any select statement, i.e.:

select * from table

I get the following error:

shark> select * from call limit 1;
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/mapreduce/task/JobContextImpl
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:270)
    at org.apache.hadoop.hive.ql.metadata.Table.getInputFormatClass(Table.java:302)
    at org.apache.hadoop.hive.ql.metadata.Table.<init>(Table.java:99)
    at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:989)
    at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:892)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:1083)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:1059)
    at shark.parse.SharkSemanticAnalyzer.analyzeInternal(SharkSemanticAnalyzer.scala:139)
    at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:279)
    at shark.SharkDriver.compile(SharkDriver.scala:210)
    at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:337)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:909)
    at shark.SharkCliDriver.processCmd(SharkCliDriver.scala:328)
    at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:413)
    at shark.SharkCliDriver$.main(SharkCliDriver.scala:233)
    at shark.SharkCliDriver.main(SharkCliDriver.scala)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.mapreduce.task.JobContextImpl
    at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
    ... 17 more

Thanks

I would need to see your full configuration and how you set up the table to give a better answer. Just looking at the error, it seems some Hadoop JARs are missing from your classpath.
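
To confirm that, here is a minimal diagnostic sketch of my own (not from the original post): run it with the same classpath Shark starts with, and it reports whether the class from the stack trace is visible and which jar it was loaded from. The class name comes from the error above; the hint about hadoop-mapreduce-client-core is an assumption, since org.apache.hadoop.mapreduce.task.JobContextImpl normally ships with the Hadoop 2.x MapReduce client jars.

// Hypothetical diagnostic, not part of the original question: checks whether the
// class Shark fails to load is visible on the current classpath, and if so from
// which jar it was loaded.
public class ClasspathCheck {
    public static void main(String[] args) {
        String missing = "org.apache.hadoop.mapreduce.task.JobContextImpl";
        try {
            Class<?> cls = Class.forName(missing);
            // CodeSource reveals the jar the class came from (may be null for
            // classes loaded by the bootstrap class loader).
            java.security.CodeSource src = cls.getProtectionDomain().getCodeSource();
            System.out.println(missing + " found in "
                    + (src == null ? "<unknown location>" : src.getLocation()));
        } catch (ClassNotFoundException e) {
            // Same condition behind Shark's NoClassDefFoundError: the Hadoop 2.x
            // MapReduce jars (e.g. hadoop-mapreduce-client-core) are not on the classpath.
            System.out.println(missing + " is NOT on the classpath");
        }
    }
}

If the class cannot be found, adding the Hadoop 2.x MapReduce client jar (and whatever Cassandra input-format jar your table's storage handler uses) to Shark's classpath, for example via the classpath settings in shark-env.sh, would be the first thing to try; the exact jar names depend on your Hadoop and Cassandra versions, so treat those names as placeholders.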