Computer vision: how to load the weights of a weight-standardized model?


Using this link: from the pytorch classification folder, I am using ResNet50 for a classification task.

Here is what I did. I copied the code from here:

and did model = l_resnet50(). Now we have a ResNet50 model that uses both GN and WS. Then I tried to load the ResNet50 weight file provided by the author of that repo,

but I got this error:

RuntimeError: Error(s) in loading state_dict for ResNet:
    Missing key(s) in state_dict: "layer3.6.conv1.weight", "layer3.6.bn1.weight", "layer3.6.bn1.bias", "layer3.6.conv2.weight", "layer3.6.bn2.weight", "layer3.6.bn2.bias", "layer3.6.conv3.weight", "layer3.6.bn3.weight", "layer3.6.bn3.bias", "layer3.7.conv1.weight", "layer3.7.bn1.weight", "layer3.7.bn1.bias", "layer3.7.conv2.weight", "layer3.7.bn2.weight", "layer3.7.bn2.bias", "layer3.7.conv3.weight", "layer3.7.bn3.weight", "layer3.7.bn3.bias", "layer3.8.conv1.weight", "layer3.8.bn1.weight", "layer3.8.bn1.bias", "layer3.8.conv2.weight", "layer3.8.bn2.weight", "layer3.8.bn2.bias", "layer3.8.conv3.weight", "layer3.8.bn3.weight", "layer3.8.bn3.bias", "layer3.9.conv1.weight", "layer3.9.bn1.weight", "layer3.9.bn1.bias", "layer3.9.conv2.weight", "layer3.9.bn2.weight", "layer3.9.bn2.bias", "layer3.9.conv3.weight", "layer3.9.bn3.weight", "layer3.9.bn3.bias", "layer3.10.conv1.weight", "layer3.10.bn1.weight", "layer3.10.bn1.bias", "layer3.10.conv2.weight", "layer3.10.bn2.weight", "layer3.10.bn2.bias", "layer3.10.conv3.weight", "layer3.10.bn3.weight", "layer3.10.bn3.bias", "layer3.11.conv1.weight", "layer3.11.bn1.weight", "layer3.11.bn1.bias", "layer3.11.conv2.weight", "layer3.11.bn2.weight", "layer3.11.bn2.bias", "layer3.11.conv3.weight", "layer3.11.bn3.weight", "layer3.11.bn3.bias", "layer3.12.conv1.weight", "layer3.12.bn1.weight", "layer3.12.bn1.bias", "layer3.12.conv2.weight", "layer3.12.bn2.weight", "layer3.12.bn2.bias", "layer3.12.conv3.weight", "layer3.12.bn3.weight", "layer3.12.bn3.bias", "layer3.13.conv1.weight", "layer3.13.bn1.weight", "layer3.13.bn1.bias", "layer3.13.conv2.weight", "layer3.13.bn2.weight", "layer3.13.bn2.bias", "layer3.13.conv3.weight", "layer3.13.bn3.weight", "layer3.13.bn3.bias", "layer3.14.conv1.weight", "layer3.14.bn1.weight", "layer3.14.bn1.bias", "layer3.14.conv2.weight", "layer3.14.bn2.weight", "layer3.14.bn2.bias", "layer3.14.conv3.weight", "layer3.14.bn3.weight", "layer3.14.bn3.bias", "layer3.15.conv1.weight", "layer3.15.bn1.weight", 
"layer3.15.bn1.bias", "layer3.15.conv2.weight", "layer3.15.bn2.weight", "layer3.15.bn2.bias", "layer3.15.conv3.weight", "layer3.15.bn3.weight", "layer3.15.bn3.bias", "layer3.16.conv1.weight", "layer3.16.bn1.weight", "layer3.16.bn1.bias", "layer3.16.conv2.weight", "layer3.16.bn2.weight", "layer3.16.bn2.bias", "layer3.16.conv3.weight", "layer3.16.bn3.weight", "layer3.16.bn3.bias", "layer3.17.conv1.weight", "layer3.17.bn1.weight", "layer3.17.bn1.bias", "layer3.17.conv2.weight", "layer3.17.bn2.weight", "layer3.17.bn2.bias", "layer3.17.conv3.weight", "layer3.17.bn3.weight", "layer3.17.bn3.bias", "layer3.18.conv1.weight", "layer3.18.bn1.weight", "layer3.18.bn1.bias", "layer3.18.conv2.weight", "layer3.18.bn2.weight", "layer3.18.bn2.bias", "layer3.18.conv3.weight", "layer3.18.bn3.weight", "layer3.18.bn3.bias", "layer3.19.conv1.weight", "layer3.19.bn1.weight", "layer3.19.bn1.bias", "layer3.19.conv2.weight", "layer3.19.bn2.weight", "layer3.19.bn2.bias", "layer3.19.conv3.weight", "layer3.19.bn3.weight", "layer3.19.bn3.bias", "layer3.20.conv1.weight", "layer3.20.bn1.weight", "layer3.20.bn1.bias", "layer3.20.conv2.weight", "layer3.20.bn2.weight", "layer3.20.bn2.bias", "layer3.20.conv3.weight", "layer3.20.bn3.weight", "layer3.20.bn3.bias", "layer3.21.conv1.weight", "layer3.21.bn1.weight", "layer3.21.bn1.bias", "layer3.21.conv2.weight", "layer3.21.bn2.weight", "layer3.21.bn2.bias", "layer3.21.conv3.weight", "layer3.21.bn3.weight", "layer3.21.bn3.bias", "layer3.22.conv1.weight", "layer3.22.bn1.weight", "layer3.22.bn1.bias", "layer3.22.conv2.weight", "layer3.22.bn2.weight", "layer3.22.bn2.bias", "layer3.22.conv3.weight", "layer3.22.bn3.weight", "layer3.22.bn3.bias". 
    Unexpected key(s) in state_dict: "bn1.running_mean", "bn1.running_var", "layer1.0.bn1.running_mean", "layer1.0.bn1.running_var", "layer1.0.bn2.running_mean", "layer1.0.bn2.running_var", "layer1.0.bn3.running_mean", "layer1.0.bn3.running_var", "layer1.0.downsample.1.running_mean", "layer1.0.downsample.1.running_var", "layer1.1.bn1.running_mean", "layer1.1.bn1.running_var", "layer1.1.bn2.running_mean", "layer1.1.bn2.running_var", "layer1.1.bn3.running_mean", "layer1.1.bn3.running_var", "layer1.2.bn1.running_mean", "layer1.2.bn1.running_var", "layer1.2.bn2.running_mean", "layer1.2.bn2.running_var", "layer1.2.bn3.running_mean", "layer1.2.bn3.running_var", "layer2.0.bn1.running_mean", "layer2.0.bn1.running_var", "layer2.0.bn2.running_mean", "layer2.0.bn2.running_var", "layer2.0.bn3.running_mean", "layer2.0.bn3.running_var", "layer2.0.downsample.1.running_mean", "layer2.0.downsample.1.running_var", "layer2.1.bn1.running_mean", "layer2.1.bn1.running_var", "layer2.1.bn2.running_mean", "layer2.1.bn2.running_var", "layer2.1.bn3.running_mean", "layer2.1.bn3.running_var", "layer2.2.bn1.running_mean", "layer2.2.bn1.running_var", "layer2.2.bn2.running_mean", "layer2.2.bn2.running_var", "layer2.2.bn3.running_mean", "layer2.2.bn3.running_var", "layer2.3.bn1.running_mean", "layer2.3.bn1.running_var", "layer2.3.bn2.running_mean", "layer2.3.bn2.running_var", "layer2.3.bn3.running_mean", "layer2.3.bn3.running_var", "layer3.0.bn1.running_mean", "layer3.0.bn1.running_var", "layer3.0.bn2.running_mean", "layer3.0.bn2.running_var", "layer3.0.bn3.running_mean", "layer3.0.bn3.running_var", "layer3.0.downsample.1.running_mean", "layer3.0.downsample.1.running_var", "layer3.1.bn1.running_mean", "layer3.1.bn1.running_var", "layer3.1.bn2.running_mean", "layer3.1.bn2.running_var", "layer3.1.bn3.running_mean", "layer3.1.bn3.running_var", "layer3.2.bn1.running_mean", "layer3.2.bn1.running_var", "layer3.2.bn2.running_mean", "layer3.2.bn2.running_var", "layer3.2.bn3.running_mean", 
"layer3.2.bn3.running_var", "layer3.3.bn1.running_mean", "layer3.3.bn1.running_var", "layer3.3.bn2.running_mean", "layer3.3.bn2.running_var", "layer3.3.bn3.running_mean", "layer3.3.bn3.running_var", "layer3.4.bn1.running_mean", "layer3.4.bn1.running_var", "layer3.4.bn2.running_mean", "layer3.4.bn2.running_var", "layer3.4.bn3.running_mean", "layer3.4.bn3.running_var", "layer3.5.bn1.running_mean", "layer3.5.bn1.running_var", "layer3.5.bn2.running_mean", "layer3.5.bn2.running_var", "layer3.5.bn3.running_mean", "layer3.5.bn3.running_var", "layer4.0.bn1.running_mean", "layer4.0.bn1.running_var", "layer4.0.bn2.running_mean", "layer4.0.bn2.running_var", "layer4.0.bn3.running_mean", "layer4.0.bn3.running_var", "layer4.0.downsample.1.running_mean", "layer4.0.downsample.1.running_var", "layer4.1.bn1.running_mean", "layer4.1.bn1.running_var", "layer4.1.bn2.running_mean", "layer4.1.bn2.running_var", "layer4.1.bn3.running_mean", "layer4.1.bn3.running_var", "layer4.2.bn1.running_mean", "layer4.2.bn1.running_var", "layer4.2.bn2.running_mean", "layer4.2.bn2.running_var", "layer4.2.bn3.running_mean", "layer4.2.bn3.running_var". 

Then I tried loading that weight file into torchvision's ResNet50 model, and it loaded fine there, so clearly the error comes from the ResNet50 modified with GN and WS. But how do I use the author's pretrained weights? The author shared pretrained-weight links in his repo, and I cannot load them into the GN/WS ResNet50 he designed. I can only use his GN/WS ResNet50, and his provided ResNet50 weight file does not load into it, which means I cannot use the weights he shared. Where am I going wrong?
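A quick way to narrow this down is to diff the key sets of the model's state_dict and the checkpoint. The dicts below are hypothetical stand-ins that mirror a few keys from the error message above, just to show the comparison:

```python
# Minimal sketch: compare the key sets of the model's state_dict and the
# checkpoint. These small sets stand in for the real state_dicts
# (hypothetical keys, mirroring the error message).
model_keys = {
    "conv1.weight",
    "bn1.weight", "bn1.bias",          # GN layer: weight/bias only
    "layer3.6.conv1.weight",           # extra block the checkpoint lacks
}
ckpt_keys = {
    "conv1.weight",
    "bn1.weight", "bn1.bias",
    "bn1.running_mean", "bn1.running_var",  # BN buffers a GN model lacks
}

missing = sorted(model_keys - ckpt_keys)     # reported as "Missing key(s)"
unexpected = sorted(ckpt_keys - model_keys)  # reported as "Unexpected key(s)"
print(missing)      # ['layer3.6.conv1.weight']
print(unexpected)   # ['bn1.running_mean', 'bn1.running_var']
```

Incidentally, the missing keys in the error above run from layer3.6 through layer3.22, i.e. the instantiated model has 23 blocks in layer3, which is the block count of a 101-layer ResNet, while the checkpoint has only the 6 blocks of a ResNet50. It may be worth checking whether l_resnet50() actually builds the depth you expect.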


I collected the ResNet50 weights from here:

There are two problems in this error:

Missing keys: these keys (the names of each layer) are specified in the model's __init__() but do not exist in the saved file.

Unexpected keys: keys found in the saved file do not exist in your model. Make sure the model's layers are named the same as they were when the weights were saved.

A simple example:

import torch.nn as nn

class Model(nn.Module):
    def __init__(self, **kwargs):
        super(Model, self).__init__()
        self.linear1 = nn.Linear(3, 3)
        self.linear2 = nn.Linear(3, 1)

In the state dict, the layers are saved as linear1.weight, linear2.weight, linear1.bias, linear2.bias. If you change the names of the layers in __init__(), you will run into this error.
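If a layer was merely renamed between saving and loading, the tensors themselves are still compatible, and the keys can be remapped before loading. A sketch, assuming a hypothetical layer that was called linear1 when saved and fc1 now:

```python
import torch
import torch.nn as nn

class Renamed(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(3, 3)  # was called linear1 when saved

# Stand-in for the old checkpoint, keyed by the old layer name.
old_sd = {"linear1.weight": torch.randn(3, 3),
          "linear1.bias": torch.randn(3)}

# Remap old names to new names before loading.
new_sd = {k.replace("linear1", "fc1"): v for k, v in old_sd.items()}

model = Renamed()
model.load_state_dict(new_sd)  # succeeds now that the names match
print(sorted(new_sd))  # ['fc1.bias', 'fc1.weight']
```

The same pattern (a dict comprehension over the checkpoint's items) works for any systematic renaming, such as stripping a "module." prefix left over from DataParallel.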

I know; my question is: I used the model code from here: and the model weights from here: those weights are for those models, right? Both are in the same repo! I did not change any code; I just used the code from that repo and tried to use the weight file from that repo, and I got the error. I do not understand where my mistake is. You can see the shared weight files for every model the author trained with the GN+WS approach here: and the model code for those weights is here: Am I wrong? Then why do I get an error when loading these weights? That is my question. Thank you.

I can't help you there. In that case, I suggest you open an issue in the repository? Maybe the authors can shed some light on this.