
Java: dump does not work with PigRunner


Here is the code with which I run PigRunner and PigStats:

    import java.util.Iterator;

    import org.apache.pig.PigRunner;
    import org.apache.pig.data.Tuple;
    import org.apache.pig.tools.pigstats.OutputStats;
    import org.apache.pig.tools.pigstats.PigStats;

    String[] args = {"abc.pig"};
    PigStats stats = PigRunner.run(args, null);

    System.out.println("Stats : " + stats.getReturnCode());

    // Statistics for the output of alias B
    OutputStats os = stats.result("B");

    Iterator<Tuple> it = os.iterator();

    while (it.hasNext()) {
        Tuple t = it.next();
        System.out.println(t.getAll());
    }
I get the correct output, but it is then followed by this exception stack trace, with root cause:

org.apache.hadoop.mapreduce.lib.input.InvalidInputException: Input path does not exist: hdfs://localhost:54310/tmp/temp-221133443/tmp1478461116
  at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.listStatus(FileInputFormat.java:235)
  at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigFileInputFormat.listStatus(PigFileInputFormat.java:37)
  at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.getSplits(FileInputFormat.java:252)
  at org.apache.pig.impl.io.ReadToEndLoader.init(ReadToEndLoader.java:154)
  at org.apache.pig.impl.io.ReadToEndLoader.<init>(ReadToEndLoader.java:116)
  at org.apache.pig.tools.pigstats.OutputStats.iterator(OutputStats.java:148)
  at org.apache.jsp.result_jsp._jspService(result_jsp.java:86)
  at org.apache.jasper.runtime.HttpJspBase.service(HttpJspBase.java:70)
  at javax.servlet.http.HttpServlet.service(HttpServlet.java:722)
  at org.apache.jasper.servlet.JspServletWrapper.service(JspServletWrapper.java:419)
  at org.apache.jasper.servlet.JspServlet.serviceJspFile(JspServlet.java:391)
  at org.apache.jasper.servlet.JspServlet.service(JspServlet.java:334)
  at javax.servlet.http.HttpServlet.service(HttpServlet.java:722)
  at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:304)
  at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:210)
  at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:240)
  at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:164)
  at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:462)
  at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:164)
  at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:100)
  at org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:562)
  at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:118)
  at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:395)
  at org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:250)
  at org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler.process(Http11Protocol.java:188)
  at org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler.process(Http11Protocol.java:166)
  at org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:302)
  at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:895)
  at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:918)
  at java.lang.Thread.run(Thread.java:662)
Now, if I replace the DUMP with a STORE, the same code works fine.
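For illustration, a rough sketch of the STORE variant (the output path b_out below is just a placeholder name, not something from my actual script):

    // abc.pig now ends with "STORE B INTO 'b_out';" instead of "DUMP B;"
    String[] args = {"abc.pig"};
    PigStats stats = PigRunner.run(args, null);

    System.out.println("Stats : " + stats.getReturnCode());

    // With STORE the data sits at a permanent HDFS location ('b_out'),
    // so the iterator can still open it after the job has finished.
    OutputStats os = stats.result("B");
    Iterator<Tuple> it = os.iterator();
    while (it.hasNext()) {
        System.out.println(it.next().getAll());
    }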

Can someone explain what is going on?

Thanks,
Ravi

In case of a DUMP, Pig stores the output in a temporary location, e.g.: hdfs://localhost/tmp/temp797130848/tmp1101984728 (have a look at pig.map.output.dirs in the job's config.xml).

At some point during this process it iterates over the result tuples and prints them to the console by calling:

Iterator<Tuple> result = mPigServer.openIterator(alias);
while (result.hasNext())
{
  Tuple t = result.next();
  System.out.println(TupleFormat.format(t));
}
After this, but before returning, it also makes a call that deletes this temporary directory.

Now you want to get back the result of alias B. The OutputStats iterator tries to open the temporary file again in order to loop through the tuples, just as was done before. But the problem is that this file no longer exists, hence the exception you get.


Therefore, I suggest you remove the code after System.out.println("Stats : " + stats.getReturnCode());, since the dump has already been printed.
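In code, the suggestion boils down to something like this (a sketch of the trimmed driver, using nothing beyond what your question already shows):

    String[] args = {"abc.pig"};          // abc.pig contains the DUMP
    PigStats stats = PigRunner.run(args, null);

    // The DUMP output has already been printed to the console by the Pig job
    // itself; only the return code is still meaningful at this point.
    System.out.println("Stats : " + stats.getReturnCode());

    // Do not call stats.result("B").iterator() after a DUMP: the temporary
    // output directory is gone by the time PigRunner.run() returns.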

Thanks for the great explanation. In my case I solved the problem by passing a PigProgressNotificationListener to PigRunner: when the job completes I get an OutputStats object that contains the output.
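A rough sketch of that approach (the class name is mine; depending on the Pig version the listener interface may declare additional callbacks, e.g. initialPlanNotification, which would also need empty implementations):

    import org.apache.pig.PigRunner;
    import org.apache.pig.tools.pigstats.JobStats;
    import org.apache.pig.tools.pigstats.OutputStats;
    import org.apache.pig.tools.pigstats.PigProgressNotificationListener;

    public class OutputCapturingListener implements PigProgressNotificationListener {

        // Called for each output once it is available; per the comment above,
        // the OutputStats received here still has access to the data.
        @Override
        public void outputCompletedNotification(String scriptId, OutputStats outputStats) {
            System.out.println("Output ready for alias: " + outputStats.getAlias());
            // e.g. consume outputStats.iterator() here while the data still exists
        }

        // The remaining callbacks are not needed for this use case.
        @Override public void launchStartedNotification(String scriptId, int numJobsToLaunch) {}
        @Override public void jobsSubmittedNotification(String scriptId, int numJobsSubmitted) {}
        @Override public void jobStartedNotification(String scriptId, String assignedJobId) {}
        @Override public void jobFinishedNotification(String scriptId, JobStats jobStats) {}
        @Override public void jobFailedNotification(String scriptId, JobStats jobStats) {}
        @Override public void progressUpdatedNotification(String scriptId, int progress) {}
        @Override public void launchCompletedNotification(String scriptId, int numJobsSucceeded) {}
    }

    // Usage:
    // PigStats stats = PigRunner.run(new String[]{"abc.pig"}, new OutputCapturingListener());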