How can I get the total directory size in the same function that gets all the file sizes? (Python)

I use the following function to collect the size of every file in the system from the target directory down:
import datetime
import os

def get_files(target):
    # Get file size and modified time for all files from the target directory and down.
    # Initialize files list
    filelist = []
    # Walk the directory structure
    for root, dirs, files in os.walk(target):
        # Do not walk into directories that are mount points
        dirs[:] = [d for d in dirs if not os.path.ismount(os.path.join(root, d))]
        for name in files:
            # Construct absolute path for files
            filename = os.path.join(root, name)
            # Test the path to account for broken symlinks
            if os.path.exists(filename):
                # File size information in bytes
                size = float(os.path.getsize(filename))
                # Get the modified time of the file
                mtime = os.path.getmtime(filename)
                # Create a tuple of filename, size, and modified time
                construct = filename, size, str(datetime.datetime.fromtimestamp(mtime))
                # Add the tuple to the master filelist
                filelist.append(construct)
    return filelist
How can I modify it to also build a second collection containing each directory and its total size? I would like to do this in a single function, in the hope that it is more efficient than performing a second walk in a separate function just to gather the directory information and sizes.

The goal is to report a sorted list of the twenty largest files and a second sorted list of the ten largest directories.
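The top-N reporting described above can be sketched as a small helper. This assumes a `get_files()`-style function that returns a list of `(filename, size, mtime)` tuples and a dict of per-directory sizes; the function name and parameters here are illustrative, not part of the original code.

```python
def report_top(filelist, dirdict, n_files=20, n_dirs=10):
    # Sort files by size (second tuple element), largest first
    top_files = sorted(filelist, key=lambda t: t[1], reverse=True)[:n_files]
    # Sort directories by accumulated size, largest first
    top_dirs = sorted(dirdict.items(), key=lambda kv: kv[1], reverse=True)[:n_dirs]
    return top_files, top_dirs
```

Sorting once and slicing keeps the report code separate from the filesystem walk, so the walk itself stays a single pass.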
Thanks for the suggestions. I ended up outputting the directories to a dictionary rather than a list; see what you think:
import datetime
import os

def get_files(target):
    # Get file size and modified time for all files from the target directory and down.
    # Initialize the files list and the per-directory size dictionary
    filelist = []
    dirdict = {}
    # Walk the directory structure
    for root, dirs, files in os.walk(target):
        # Do not walk into directories that are mount points
        dirs[:] = [d for d in dirs if not os.path.ismount(os.path.join(root, d))]
        for name in files:
            # Construct absolute path for files
            filename = os.path.join(root, name)
            # Test the path to account for broken symlinks
            if os.path.exists(filename):
                # File size information in bytes
                size = float(os.path.getsize(filename))
                # Get the modified time of the file
                mtime = os.path.getmtime(filename)
                # Create a tuple of filename, size, and modified time
                construct = filename, size, str(datetime.datetime.fromtimestamp(mtime))
                # Add the tuple to the master filelist
                filelist.append(construct)
                # Accumulate the file's size into its containing directory's total
                if root in dirdict:
                    dirdict[root] += size
                else:
                    dirdict[root] = size
    return filelist, dirdict
If you would rather have dirdict as a list of tuples, just use:
list(dirdict.items())
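One caveat with the dirdict built above: each file's size is charged only to its immediate parent directory, not to the ancestors above it. A minimal sketch for rolling the per-directory totals up to every ancestor beneath the walked root (pure path-string manipulation, so the directories need not exist when this runs; the function name is illustrative):

```python
import os

def cumulative_sizes(dirdict, target):
    # Roll per-directory sizes up to every ancestor, stopping at `target`
    # (the root that was passed to os.walk).
    target = os.path.abspath(target)
    totals = {}
    for path, size in dirdict.items():
        current = os.path.abspath(path)
        while True:
            totals[current] = totals.get(current, 0) + size
            if current == target:
                break
            parent = os.path.dirname(current)
            if parent == current:  # reached the filesystem root; stop
                break
            current = parent
    return totals
```

With this, `totals[target]` is the grand total for the whole tree, and each subdirectory's entry includes everything below it.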
I have a lot of scripts that do this kind of thing, and I just uploaded "bigfiles.py" to GitHub. It does not compute cumulative directory totals, but it could be modified to do so without much trouble. I also have other code that sums total directory sizes at a given depth, like:
In [7]: t = build_tree_from_directory('/scratch/stu/')
In [8]: pprint.pprint(walk_tree(t,depth=0))
{'name': 'ROOT', 'size': 6539880514}
In [9]: pprint.pprint(walk_tree(t,depth=0))
{'name': 'ROOT', 'size': 6539880514}
In [10]: pprint.pprint(walk_tree(t,depth=1))
{'children': [{'name': 'apache2-gzip', 'size': 112112512},
{'name': 'gitnotes', 'size': 897104422},
{'name': 'finder', 'size': 3810736368},
{'name': 'apache2', 'size': 1719919406}],
'name': 'ROOT'}
In [12]: pprint.pprint(walk_tree(t,depth=2))
{'children': [{'children': [{'name': 'vhost', 'size': 103489662}],
'name': 'apache2-gzip'},
{'children': [{'name': '2', 'size': 533145458},
{'name': 'notes.git', 'size': 363958964}],
'name': 'gitnotes'},
{'children': [{'name': 'gzipped', 'size': 3810736368},
{'name': 'output.txt', 'size': 0}],
'name': 'finder'},
{'children': [{'name': 'sente_combined.log', 'size': 0},
{'name': 'lisp_ssl.log', 'size': 0},
{'name': 'vhost', 'size': 1378778576},
{'name': 'other_vhosts_access.log', 'size': 0},
{'name': 'ssl_error.log', 'size': 0},
{'name': 'ssl_access.log', 'size': 0},
{'name': 'sente_test.log', 'size': 0}],
'name': 'apache2'}],
'name': 'ROOT'}
The filesystem is only crawled once, but after the tree is built it must be traversed to obtain the full sizes. If you start at the leaf nodes and work toward the root, you can compute each directory's total size most efficiently.

Thanks for the elegant inline solution.
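The leaf-to-root idea can be sketched directly with `os.walk(topdown=False)`: each directory is visited after its children, so every child's cumulative total is already known when the parent is summed. This is a sketch under the same assumptions as the code above (skip broken symlinks, sizes in bytes); the function name is illustrative.

```python
import os

def dir_sizes_bottom_up(target):
    # Cumulative size of every directory under target, computed bottom-up
    totals = {}
    for root, dirs, files in os.walk(target, topdown=False):
        size = 0
        for name in files:
            path = os.path.join(root, name)
            if os.path.exists(path):  # skip broken symlinks
                size += os.path.getsize(path)
        # Children were visited already, so their cumulative totals exist
        for d in dirs:
            size += totals.get(os.path.join(root, d), 0)
        totals[root] = size
    return totals
```

Each directory's files are read exactly once and each child total is added exactly once, so the whole computation is a single pass over the tree.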