Hadoop: writing data to Alluxio with CACHE_THROUGH fails

I am trying to write data to Alluxio using MapReduce. I have about 11 GB of data on HDFS that I am writing to Alluxio. It works fine with the MUST_CACHE write type (the default value of alluxio.user.file.writetype.default).
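This is roughly how the write type is switched for the job (a sketch only; my-job.jar and MyDriver are placeholders, and it assumes that alluxio.* properties passed with -D reach the Alluxio client through the Hadoop configuration when the client jar is on the classpath):

# Hypothetical invocation; my-job.jar and MyDriver are placeholders.
# -D options are parsed by GenericOptionsParser (ToolRunner-based drivers)
# and merged into the job's Hadoop Configuration.
hadoop jar my-job.jar MyDriver \
  -Dalluxio.user.file.writetype.default=CACHE_THROUGH \
  <hdfs_input_path> alluxio://<master_hostname>:19998/<alluxio_output_path>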

But when I try to write with CACHE_THROUGH, it fails with the following exception:

   Error: alluxio.exception.status.UnavailableException: Channel to <hostname of one of the  worker>:29999: <underfs path to file> (No such file or directory)
            at alluxio.client.block.stream.NettyPacketWriter.close(NettyPacketWriter.java:263)
            at com.google.common.io.Closer.close(Closer.java:206)
            at alluxio.client.block.stream.BlockOutStream.close(BlockOutStream.java:166)
            at alluxio.client.file.FileOutStream.close(FileOutStream.java:137)
            at org.apache.hadoop.fs.FSDataOutputStream$PositionCache.close(FSDataOutputStream.java:72)
            at org.apache.hadoop.fs.FSDataOutputStream.close(FSDataOutputStream.java:106)
            at org.apache.hadoop.mapreduce.lib.output.TextOutputFormat$LineRecordWriter.close(TextOutputFormat.java:111)
            at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.close(MapTask.java:679)
            at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:802)
            at org.apache.hadoop.mapred.MapTask.run(MapTask.java:346)
            at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163)
            at java.security.AccessController.doPrivileged(Native Method)
            at javax.security.auth.Subject.doAs(Subject.java:422)
            at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1595)
            at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
    Caused by: alluxio.exception.status.NotFoundException: Channel to <hostname of one of the  worker>29999: <underfs path to file> (No such file or directory)
            at alluxio.exception.status.AlluxioStatusException.from(AlluxioStatusException.java:153)
            at alluxio.util.CommonUtils.unwrapResponseFrom(CommonUtils.java:548)
            at alluxio.client.block.stream.NettyPacketWriter$PacketWriteHandler.channelRead(NettyPacketWriter.java:367)
            at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:333)
            at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:319)
            at io.netty.channel.ChannelInboundHandlerAdapter.channelRead(ChannelInboundHandlerAdapter.java:86)
            at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:333)
            at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:319)
            at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:254)
            at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:333)
            at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:319)
            at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103)
            at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:333)
            at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:319)
            at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:163)
            at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:333)
            at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:319)
            at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:787)
            at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:130)
            at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:511)
            at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468)
            at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:382)
            at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354)
            at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:116)
            at java.lang.Thread.run(Thread.java:748)
I also tried the following command and got the same error:

./alluxio fs -Dalluxio.user.file.writetype.default=CACHE_THROUGH copyFromLocal <hdfs_input_path> <alluxio_output_path>

Any help/suggestions would be appreciated. Thanks.

The copyFromLocal shell command can only copy files that are available on the local filesystem. To copy a file from HDFS into Alluxio, you can first copy it to your local machine and then write it to Alluxio:

hdfs dfs -get <hdfs_input_path> /tmp/tmp_file
alluxio fs copyFromLocal /tmp/tmp_file <alluxio_output_path>
Alternatively, to write from your MapReduce job to Alluxio directly, use -libjars /path/to/client to add the Alluxio client jar to the application classpath and write to an alluxio://master_hostname:19998/alluxio_output_path URI. For more details, see the Alluxio documentation; the relevant Hadoop configuration properties are:

<property>
  <name>fs.alluxio.impl</name>
  <value>alluxio.hadoop.FileSystem</value>
  <description>The Alluxio FileSystem (Hadoop 1.x and 2.x)</description>
</property>
<property>
  <name>fs.AbstractFileSystem.alluxio.impl</name>
  <value>alluxio.hadoop.AlluxioFileSystem</value>
  <description>The Alluxio AbstractFileSystem (Hadoop 2.x)</description>
</property>
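As an illustration of the -libjars approach, here is a rough sketch of copying the HDFS data into Alluxio with distcp; the jar path, hostnames, and paths are placeholders, and it assumes the properties above are present in the client's core-site.xml:

# Hypothetical distcp run; paths and hostnames are placeholders.
# -libjars puts the Alluxio client jar on the task classpath; the -D write
# type is only needed if CACHE_THROUGH is not already the configured default.
hadoop distcp \
  -libjars /path/to/alluxio-client.jar \
  -Dalluxio.user.file.writetype.default=CACHE_THROUGH \
  hdfs://<namenode>/<hdfs_input_path> \
  alluxio://<master_hostname>:19998/<alluxio_output_path>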