How do I force-copy a file from HDFS to the Linux filesystem?


The
-copyFromLocal
command has an
-f
option that forces the copy of data from the local filesystem to HDFS. I tried the
-f
option with
-copyToLocal
in the same way, but it did not work. Could anyone guide me on this?

Thanks,

Karthik

Regarding
copyToLocal
, compare the two commands in the usage output below: -copyFromLocal accepts [-f], but -copyToLocal does not.

$ hadoop fs -help
Usage: hadoop fs [generic options]
    [-appendToFile <localsrc> ... <dst>]
    [-cat [-ignoreCrc] <src> ...]
    [-checksum <src> ...]
    [-chgrp [-R] GROUP PATH...]
    [-chmod [-R] <MODE[,MODE]... | OCTALMODE> PATH...]
    [-chown [-R] [OWNER][:[GROUP]] PATH...]
    [-copyFromLocal [-f] [-p] <localsrc> ... <dst>]
    [-copyToLocal [-p] [-ignoreCrc] [-crc] <src> ... <localdst>]
    [-count [-q] <path> ...]
    [-cp [-f] [-p | -p[topax]] <src> ... <dst>]
    [-createSnapshot <snapshotDir> [<snapshotName>]]
    [-deleteSnapshot <snapshotDir> <snapshotName>]
    [-df [-h] [<path> ...]]
    [-du [-s] [-h] <path> ...]
    [-expunge]
    [-get [-p] [-ignoreCrc] [-crc] <src> ... <localdst>]
    [-getfacl [-R] <path>]
    [-getfattr [-R] {-n name | -d} [-e en] <path>]
    [-getmerge [-nl] <src> <localdst>]
    [-help [cmd ...]]
    [-ls [-d] [-h] [-R] [<path> ...]]
    [-mkdir [-p] <path> ...]
    [-moveFromLocal <localsrc> ... <dst>]
    [-moveToLocal <src> <localdst>]
    [-mv <src> ... <dst>]
    [-put [-f] [-p] <localsrc> ... <dst>]
    [-renameSnapshot <snapshotDir> <oldName> <newName>]
    [-rm [-f] [-r|-R] [-skipTrash] <src> ...]
    [-rmdir [--ignore-fail-on-non-empty] <dir> ...]
    [-setfacl [-R] [{-b|-k} {-m|-x <acl_spec>} <path>]|[--set <acl_spec> <path>]]
    [-setfattr {-n name [-v value] | -x name} <path>]
    [-setrep [-R] [-w] <rep> <path> ...]
    [-stat [format] <path> ...]
    [-tail [-f] <file>]
    [-test -[defsz] <path>]
    [-text [-ignoreCrc] <src> ...]
    [-touchz <path> ...]

Before running
-copyToLocal
, perhaps just delete the file on the local filesystem first?
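The workaround above can be sketched as follows. The paths /data/output.txt (in HDFS) and /tmp/output.txt (local) are hypothetical placeholders; this assumes a Hadoop version whose -copyToLocal has no -f flag, as in the usage output shown earlier.

```shell
# Delete any existing local copy first; rm -f stays silent
# if the file does not exist, so the command is safe to repeat.
rm -f /tmp/output.txt
hadoop fs -copyToLocal /data/output.txt /tmp/output.txt

# Alternative: stream the HDFS file with -cat and let shell
# redirection overwrite the local file in a single step.
hadoop fs -cat /data/output.txt > /tmp/output.txt
```

The -cat variant avoids the separate delete step, but it bypasses the CRC checking that -copyToLocal performs, so the two-step rm/copy form is the more conservative choice.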