Deep learning: PyTorch not updating the 'mask' tensors?


I am writing a LeNet for the MNIST dataset in PyTorch; I added tensors self.mask_fc1/2/3 to mask some of the connections in the fully connected layers. The code looks like this:

import torch
import torchvision
import torchvision.transforms as transforms
import matplotlib.pyplot as plt
import numpy as np
import torch.nn as nn
import torch.nn.functional as F
import torch.optim as optim

def loadMNIST():
    transform = transforms.Compose([transforms.ToTensor()])

    trainset = torchvision.datasets.MNIST(root='./data', train=True,
                                          download=True, transform=transform)
    trainloader = torch.utils.data.DataLoader(trainset, batch_size=4,
                                              shuffle=True, num_workers=2)

    testset = torchvision.datasets.MNIST(root='./data', train=False,
                                         download=True, transform=transform)
    testloader = torch.utils.data.DataLoader(testset, batch_size=4,
                                             shuffle=False, num_workers=2)
    return trainloader, testloader

class Net(nn.Module):
    def __init__(self):
        super(Net, self).__init__()
        self.conv1 = nn.Conv2d(1, 6, 5, 1, 2)
        self.pool = nn.MaxPool2d(2, 2)
        self.conv2 = nn.Conv2d(6, 16, 5)
        self.fc1 = nn.Linear(16 * 5 * 5, 120)
        self.fc2 = nn.Linear(120, 84)
        self.fc3 = nn.Linear(84, 10)

        self.mask_fc1 = torch.ones(16 * 5 * 5, 120, requires_grad=True)
        self.mask_fc2 = torch.ones(120, 84, requires_grad=True)
        self.mask_fc3 = torch.ones(84, 10, requires_grad=True)

    def forward(self, x):
        x = self.pool(F.relu(self.conv1(x)))
        x = self.pool(F.relu(self.conv2(x)))
        x = x.view(-1, self.num_flat_features(x))

        # first layer
        x = x.matmul(self.fc1.weight.t() * self.mask_fc1)
        if self.fc1.bias is not None:
            x += torch.jit._unwrap_optional(self.fc1.bias)
        x = F.relu(x)

        # second layer
        x = x.matmul(self.fc2.weight.t() * self.mask_fc2)
        if self.fc2.bias is not None:
            x += torch.jit._unwrap_optional(self.fc2.bias)
        x = F.relu(x)

        # third layer
        x = x.matmul(self.fc3.weight.t() * self.mask_fc3)
        if self.fc3.bias is not None:
            x += torch.jit._unwrap_optional(self.fc3.bias)

        return x

    def num_flat_features(self, x):
        size = x.size()[1:]
        num_features = 1
        for s in size:
            num_features *= s
        return num_features


if __name__ == '__main__':
    trainloader, testloader = loadMNIST()

    net = Net()

    # train
    criterion = nn.CrossEntropyLoss()
    optimizer = optim.SGD(net.parameters(), lr=0.001, momentum=0.9)

    for epoch in range(2):
        running_loss = 0.0
        for i, data in enumerate(trainloader, 0):
            inputs, labels = data

            optimizer.zero_grad()

            outputs = net(inputs)
            loss = criterion(outputs, labels)
            loss.backward()
            optimizer.step()

            # print
            running_loss += loss.item()
            if i % 2000 == 1999:
                # mean loss
                print('[%d, %5d] loss: %.3f' % (epoch + 1, i + 1, running_loss / 2000))
                running_loss = 0.0

    print('Finished Training')

    # print the mask
    print(net.mask_fc1)

I implemented the masking in the forward function and wrote the linear layers by hand instead of calling x = F.relu(self.fc1(x)), and the model trains normally (in terms of final loss and accuracy).
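
For reference, the hand-written masked layer is equivalent to masking the weight matrix and calling F.linear; a minimal sketch for the first fully connected layer, with shapes matching the code above:

import torch
import torch.nn.functional as F

x = torch.randn(4, 16 * 5 * 5)           # a batch of flattened feature maps
weight = torch.randn(120, 16 * 5 * 5)    # like self.fc1.weight, shape (out, in)
bias = torch.randn(120)                  # like self.fc1.bias
mask = torch.ones(16 * 5 * 5, 120)       # like self.mask_fc1, shape (in, out)

manual = x.matmul(weight.t() * mask) + bias          # what forward() above computes
with_linear = F.linear(x, weight * mask.t(), bias)   # same result via F.linear
print(torch.allclose(manual, with_linear))           # True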


However, when I print self.mask_fc1/2/3, the tensors do not change during training. Since the tensors are created with requires_grad=True in __init__, I cannot understand why they are not changing. Could it be because of the tensor multiplication?
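
One way to narrow this down is to run a single forward/backward pass and inspect the mask directly; a small diagnostic sketch, assuming the Net class and imports defined above:

net = Net()
optimizer = optim.SGD(net.parameters(), lr=0.001, momentum=0.9)

inputs = torch.randn(4, 1, 28, 28)        # dummy MNIST-sized batch
labels = torch.tensor([0, 1, 2, 3])
loss = nn.CrossEntropyLoss()(net(inputs), labels)
loss.backward()

print(net.mask_fc1.grad is not None)      # True: requires_grad=True does produce a gradient
before = net.mask_fc1.clone()
optimizer.step()
print(torch.equal(before, net.mask_fc1))  # True: SGD never touches the mask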

For training, you need to register mask_fc1/2/3 as module parameters:

self.mask_fc1 = nn.Parameter(torch.ones(16 * 5 * 5, 120))

You can print net.parameters() afterwards to confirm.
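
With the three masks declared as nn.Parameter in __init__, a quick confirmation sketch (assuming the Net class above has been changed accordingly):

net = Net()
for name, p in net.named_parameters():
    print(name, tuple(p.shape), p.requires_grad)
# This should now list mask_fc1, mask_fc2 and mask_fc3 alongside the conv and fc
# weights and biases, so optim.SGD(net.parameters(), ...) will update them too.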

For now, I think it is because of the line optimizer = optim.SGD(net.parameters(), lr=0.001, momentum=0.9), which restricts the updates to the net's parameters only. But which parameters belong to net.parameters()? I also declare the masking tensors in the __init__ function.
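
With the original Net (masks created as plain tensors), net.parameters() only contains the parameters of the registered sub-modules; plain tensor attributes assigned in __init__ are not picked up, even with requires_grad=True. A minimal check:

net = Net()
print([name for name, _ in net.named_parameters()])
# ['conv1.weight', 'conv1.bias', 'conv2.weight', 'conv2.bias',
#  'fc1.weight', 'fc1.bias', 'fc2.weight', 'fc2.bias', 'fc3.weight', 'fc3.bias']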