
Hadoop Streaming "GC overhead limit exceeded"

Tags: hadoop, out-of-memory, hadoop-streaming

I am running the following command:

hadoop jar hadoop-streaming.jar -D stream.tmpdir=/tmp -input "<input dir>"  -output "<output dir>" -mapper "grep 20151026" -reducer "wc -l"
where <input dir> is a directory containing many avro files,

and I get this error:

Exception in thread "main" java.lang.OutOfMemoryError: GC overhead limit exceeded
    at org.apache.hadoop.hdfs.protocol.DatanodeID.updateXferAddrAndInvalidateHashCode(DatanodeID.java:287)
    at org.apache.hadoop.hdfs.protocol.DatanodeID.<init>(DatanodeID.java:91)
    at org.apache.hadoop.hdfs.protocol.DatanodeInfo.<init>(DatanodeInfo.java:136)
    at org.apache.hadoop.hdfs.protocol.DatanodeInfo.<init>(DatanodeInfo.java:122)
    at org.apache.hadoop.hdfs.protocolPB.PBHelper.convert(PBHelper.java:633)
    at org.apache.hadoop.hdfs.protocolPB.PBHelper.convert(PBHelper.java:793)
    at org.apache.hadoop.hdfs.protocolPB.PBHelper.convertLocatedBlock(PBHelper.java:1252)
    at org.apache.hadoop.hdfs.protocolPB.PBHelper.convert(PBHelper.java:1270)
    at org.apache.hadoop.hdfs.protocolPB.PBHelper.convert(PBHelper.java:1413)
    at org.apache.hadoop.hdfs.protocolPB.PBHelper.convert(PBHelper.java:1524)
    at org.apache.hadoop.hdfs.protocolPB.PBHelper.convert(PBHelper.java:1533)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getListing(ClientNamenodeProtocolTranslatorPB.java:557)
    at sun.reflect.GeneratedMethodAccessor3.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:601)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
    at com.sun.proxy.$Proxy15.getListing(Unknown Source)
    at org.apache.hadoop.hdfs.DFSClient.listPaths(DFSClient.java:1969)
    at org.apache.hadoop.hdfs.DistributedFileSystem$DirListingIterator.hasNextNoFilter(DistributedFileSystem.java:888)
    at org.apache.hadoop.hdfs.DistributedFileSystem$DirListingIterator.hasNext(DistributedFileSystem.java:863)
    at org.apache.hadoop.mapred.FileInputFormat.singleThreadedListStatus(FileInputFormat.java:267)
    at org.apache.hadoop.mapred.FileInputFormat.listStatus(FileInputFormat.java:228)
    at org.apache.hadoop.mapred.FileInputFormat.getSplits(FileInputFormat.java:313)
    at org.apache.hadoop.mapreduce.JobSubmitter.writeOldSplits(JobSubmitter.java:624)
    at org.apache.hadoop.mapreduce.JobSubmitter.writeSplits(JobSubmitter.java:616)
    at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:492)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1296)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1293)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
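
Reading the trace from the bottom up, the error is thrown while FileInputFormat.listStatus and getSplits enumerate the input directory during job submission, so it is the local client JVM that runs out of heap, not the map or reduce tasks. One way to confirm that the input listing is large enough to matter (a sketch using the standard hdfs dfs -count command; "<input dir>" is the same placeholder as in the command above):

# Prints: DIR_COUNT  FILE_COUNT  CONTENT_SIZE  PATHNAME
# A very large FILE_COUNT would explain an OOM while computing input splits.
hdfs dfs -count "<input dir>"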


How can this be resolved?

It took a while, but I found the solution.

Prepending HADOOP_CLIENT_OPTS="-Xmx1024M" to the command solves the problem.

The final command line is:

HADOOP_CLIENT_OPTS="-Xmx1024M" hadoop jar hadoop-streaming.jar -D stream.tmpdir=/tmp -input "<input dir>"  -output "<output dir>" -mapper "grep 20151026" -reducer "wc -l"
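
This works because, as the stack trace shows, the OutOfMemoryError occurs in the client JVM while it lists the input files to compute splits, and the hadoop launcher script reads HADOOP_CLIENT_OPTS and appends it to the JVM options of client-side commands, so -Xmx1024M raises exactly the heap that was exhausted. If the job is run repeatedly, the variable can be exported once per shell session instead of being prepended every time (a minimal sketch of the same command):

# Give all subsequent client-side hadoop invocations a 1 GB heap
export HADOOP_CLIENT_OPTS="-Xmx1024M"

hadoop jar hadoop-streaming.jar -D stream.tmpdir=/tmp \
    -input "<input dir>" -output "<output dir>" \
    -mapper "grep 20151026" -reducer "wc -l"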