Python: why do I consume more memory than the size of my DataFrame in Google Colab?
I'm having trouble understanding why, in Google Colab, I consume far more memory than the actual usage reported by .info(memory_usage="deep").
I have a utility function that prints some information about RAM:
# module utils.py
import psutil

def print_available_ram():
    MB = 1024 * 1024
    ram_available = psutil.virtual_memory().available
    ram_tot = psutil.virtual_memory().total
    print("Ram available:", ram_available / MB)
    print("Ram tot :", ram_tot / MB)
    print(" ", (ram_available / ram_tot) * 100, "%")
I run the following code in Google Colab:
# ... some other code
utils.print_available_ram()
train_dataset_raw = panda.read_feather(train_dataset_path)
utils.print_available_ram()
log(train_dataset_raw.info(memory_usage="deep"))
return
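Incidentally, DataFrame.info() writes its report to stdout and returns None, so log(train_dataset_raw.info(...)) will log None. A minimal sketch (using a toy frame, since the original data isn't available here) of capturing the report as a string via info()'s buf parameter:

```python
import io

import pandas as pd

# Toy frame standing in for train_dataset_raw.
df = pd.DataFrame({"a": [1.0, 2.0], "b": [3.0, 4.0]})

# info() prints to stdout and returns None; pass a buffer to capture the text.
buf = io.StringIO()
df.info(memory_usage="deep", buf=buf)
info_text = buf.getvalue()

print(info_text)  # now a string that can be passed to log(...)
```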
Output:
# intended as MB
Ram available: 12059.6640625
Ram tot : 13021.0625
92.61658994801691 %
Ram available: 2723.29296875
Ram tot : 13021.0625
20.91452190441448 %
<class 'pandas.core.frame.DataFrame'>
RangeIndex: 2164804 entries, 0 to 2164803
Columns: 281 entries, 0-60 to target
dtypes: float64(281)
memory usage: 4.5 GB
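As a cross-check (this arithmetic is mine, not from the post), the 4.5 GB that info() reports follows directly from the shape and dtypes shown above:

```python
# 2,164,804 rows x 281 float64 columns, 8 bytes per float64
rows, cols, itemsize = 2_164_804, 281, 8
total_bytes = rows * cols * itemsize

print(f"{total_bytes / 2**30:.2f} GiB")  # ~4.53 GiB, i.e. the "4.5 GB" info() shows
```

So the DataFrame itself accounts for about 4.5 GiB; the question is where the rest of the ~9.3 GB drop in available RAM went.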
- Is the information shown by print_available_ram unreliable?
- Is read_feather consuming, and not releasing, some extra memory?
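One way to probe the first question (a sketch under my own assumptions, not from the post): allocate a buffer of known size and check that psutil's "available" figure drops by roughly that amount.

```python
import numpy as np
import psutil

MB = 1024 * 1024

def available_mb() -> float:
    return psutil.virtual_memory().available / MB

before = available_mb()
# Allocate and fill 256 MB so the pages are actually committed.
buf = np.ones(256 * MB // 8, dtype=np.float64)
after = available_mb()

print(f"allocated {buf.nbytes // MB} MB; available dropped ~{before - after:.0f} MB")
```

The drop will not match exactly (other processes keep allocating and freeing memory), but it should be in the right ballpark if psutil is trustworthy.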