Deep learning: PyTorch global pruning does not reduce the model size


I am trying to prune my deep learning model with global pruning. The original, unpruned model is about 77.5 MB. However, after pruning, when I save the model, its size is the same as the original. Can anyone help me with this?

Here is the pruning code:

import torch
import torch.nn.utils.prune as prune

parameters_to_prune = (
    (model.encoder[0], 'weight'),
    (model.up_conv1[0], 'weight'),
    (model.up_conv2[0], 'weight'),
    (model.up_conv3[0], 'weight'),
)
print(parameters_to_prune)

prune.global_unstructured(
    parameters_to_prune,
    pruning_method=prune.L1Unstructured,
    amount=0.2,
)

print(
    "Sparsity in Encoder.weight: {:.2f}%".format(
        100. * float(torch.sum(model.encoder[0].weight == 0))
        / float(model.encoder[0].weight.nelement())
    )
)
print(
    "Sparsity in up_conv1.weight: {:.2f}%".format(
        100. * float(torch.sum(model.up_conv1[0].weight == 0))
        / float(model.up_conv1[0].weight.nelement())
    )
)
print(
    "Sparsity in up_conv2.weight: {:.2f}%".format(
        100. * float(torch.sum(model.up_conv2[0].weight == 0))
        / float(model.up_conv2[0].weight.nelement())
    )
)
print(
    "Sparsity in up_conv3.weight: {:.2f}%".format(
        100. * float(torch.sum(model.up_conv3[0].weight == 0))
        / float(model.up_conv3[0].weight.nelement())
    )
)

print(
    "Global sparsity: {:.2f}%".format(
        100. * float(
            torch.sum(model.encoder[0].weight == 0)
            + torch.sum(model.up_conv1[0].weight == 0)
            + torch.sum(model.up_conv2[0].weight == 0)
            + torch.sum(model.up_conv3[0].weight == 0)
        )
        / float(
            model.encoder[0].weight.nelement()
            + model.up_conv1[0].weight.nelement()
            + model.up_conv2[0].weight.nelement()
            + model.up_conv3[0].weight.nelement()
        )
    )
)

**Setting Pruning to Permanent**
prune.remove(model.encoder[0], "weight")
prune.remove(model.up_conv1[0], "weight")
prune.remove(model.up_conv2[0], "weight")
prune.remove(model.up_conv3[0], "weight")

**Saving the model**
PATH = r"C:\PrunedNet.pt"  # raw string so the backslash is not treated as an escape
torch.save(model.state_dict(), PATH)

Pruning applied this way does not change the model size.

If you have a tensor, say:

[1., 2., 3., 4., 5., 6., 7., 8.]

and you prune away 50% of its values, e.g.:

[1., 2., 0., 4., 0., 6., 0., 0.]

you still have 8 float values, so the stored size stays the same.
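A minimal, self-contained sketch (independent of the model above) that makes this concrete: zeroing half of a tensor's entries changes neither its in-memory footprint nor, in any meaningful way, its serialized size.

import io
import torch

dense  = torch.tensor([1., 2., 3., 4., 5., 6., 7., 8.])
pruned = torch.tensor([1., 2., 0., 4., 0., 6., 0., 0.])  # 50% of values zeroed

# Both tensors still hold 8 float32 values -> 32 bytes of data each.
print(dense.nelement() * dense.element_size())    # 32
print(pruned.nelement() * pruned.element_size())  # 32

def saved_size(t):
    # Size of the tensor once serialized with torch.save.
    buf = io.BytesIO()
    torch.save(t, buf)
    return buf.getbuffer().nbytes

print(saved_size(dense), saved_size(pruned))  # essentially identical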

When does pruning reduce the model size?
  • when we save the weights in a sparse format, which only pays off at high sparsity (e.g. ~10% non-zero elements)
  • when we actually remove something (e.g. a kernel/channel in a Conv2d layer) because its weights are zero or negligible
Otherwise it won't. Have a look at related projects that let you do this without coding it yourself, for example.
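As a hedged illustration of the first option (a sketch with made-up shapes, not the asker's model): converting an already-pruned weight to a sparse layout before saving is what actually shrinks the file, and only when sparsity is high enough to outweigh the cost of storing indices.

import io
import torch

def saved_size(obj):
    # Serialize to an in-memory buffer and report the byte count.
    buf = io.BytesIO()
    torch.save(obj, buf)
    return buf.getbuffer().nbytes

# Hypothetical 2D weight with ~95% of its entries pruned to zero.
weight = torch.randn(512, 512)
weight[torch.rand_like(weight) < 0.95] = 0.

print(saved_size(weight))              # dense: ~1 MB no matter how many zeros it holds
print(saved_size(weight.to_sparse()))  # COO (indices + non-zero values): much smaller here

The second option, structurally removing whole kernels or channels, requires rebuilding the layers with smaller shapes, which is what the projects mentioned above automate.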