Python required positional argument: 'num_features' error

python, deep-learning, pytorch

When I try to run the ResNet architecture, I get an error saying the num_features argument is missing. I can't figure out where the problem is:

import torch
import torch.nn as nn
from numpy.random import normal
from numpy.linalg import svd
from math import sqrt
import torch.nn.init
from .common import *

class ResidualSequential(nn.Sequential):
    def __init__(self, *args):
        super(ResidualSequential, self).__init__(*args)

    def forward(self, x):
        out = super(ResidualSequential, self).forward(x)
        # print(x.size(), out.size())
        x_ = None
        if out.size(2) != x.size(2) or out.size(3) != x.size(3):
            diff2 = x.size(2) - out.size(2)
            diff3 = x.size(3) - out.size(3)
            # print(1)
            x_ = x[:, :, diff2 /2:out.size(2) + diff2 / 2, diff3 / 2:out.size(3) + diff3 / 2]
        else:
            x_ = x
        return out + x_

    def eval(self):
        print(2)
        for m in self.modules():
            m.eval()
        exit()


def get_block(num_channels, norm_layer, act_fun):
    layers = [
        nn.Conv2d(num_channels, num_channels, 3, 1, 1, bias=False),
        norm_layer(num_channels, affine=True),
        act(act_fun),
        nn.Conv2d(num_channels, num_channels, 3, 1, 1, bias=False),
        norm_layer(num_channels, affine=True),
    ]
    return layers


class ResNet(nn.Module):
    def __init__(self, num_input_channels, num_output_channels, num_blocks, num_channels, need_residual=True, act_fun='LeakyReLU', need_sigmoid=True, norm_layer=nn.BatchNorm2d, pad='reflection'):
        '''
            pad = 'start|zero|replication'
        '''
        super(ResNet, self).__init__()

        if need_residual:
            s = ResidualSequential
        else:
            s = nn.Sequential

        stride = 1
        # First layers
        layers = [
            # nn.ReplicationPad2d(num_blocks * 2 * stride + 3),
            conv(num_input_channels, num_channels, 3, stride=1, bias=True, pad=pad),
            act(act_fun)
        ]
        # Residual blocks
        # layers_residual = []
        for i in range(num_blocks):
            layers += [s(*get_block(num_channels, norm_layer, act_fun))]
       
        layers += [
            nn.Conv2d(num_channels, num_channels, 3, 1, 1),
            norm_layer(num_channels, affine=True)
        ]

        # if need_residual:
        #     layers += [ResidualSequential(*layers_residual)]
        # else:
        #     layers += [Sequential(*layers_residual)]

        # if factor >= 2: 
        #     # Do upsampling if needed
        #     layers += [
        #         nn.Conv2d(num_channels, num_channels *
        #                   factor ** 2, 3, 1),
        #         nn.PixelShuffle(factor),
        #         act(act_fun)
        #     ]
        layers += [
            conv(num_channels, num_output_channels, 3, 1, bias=True, pad=pad),
            nn.Sigmoid()
        ]
        self.model = nn.Sequential(*layers)

    def forward(self, input):
        return self.model(input)

    def eval(self):
        self.model.eval()

The error:

---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
in ()
     18     net = get_net(input_depth, 'ResNet', pad,
     19                   num_scales=5,
---> 20                   upsample_mode='bilinear').type(dtype)
     21
     22 else:

2 frames
/content/models/__init__.py in get_net(input_depth, NET_TYPE, pad, upsample_mode, n_channels, act_fun, skip_n33d, skip_n33u, skip_n11, num_scales, downsample_mode)
      9     if NET_TYPE == 'ResNet':
     10         # TODO
---> 11         net = ResNet(input_depth, 3, 10, 16, 1, nn.BatchNorm2d, False)
     12     elif NET_TYPE == 'skip':
     13         net = skip(input_depth, n_channels, num_channels_down = [skip_n33d]*num_scales if isinstance(skip_n33d, int) else skip_n33d,

/content/models/resnet.py in __init__(self, num_input_channels, num_output_channels, num_blocks, num_channels, need_residual, act_fun, need_sigmoid, norm_layer, pad)
     59             # nn.ReplicationPad2d(num_blocks * 2 * stride + 3),
     60             conv(num_input_channels, num_channels, 3, stride=1, bias=True, pad=pad),
---> 61             act(act_fun)
     62         ]
     63         # Residual blocks

/content/models/common.py in act(act_fun)
     90             assert False
     91     else:
---> 92         return act_fun()
     93
     94

TypeError: __init__() missing 1 required positional argument: 'num_features'
OK, looking at this, you are passing a callable (a constructor of some sort) rather than a string. Specifically, you are passing nn.BatchNorm2d. In other words, your arguments are misaligned in the ResNet constructor (a common problem when you have this many positional arguments: use keyword arguments!).

The problem is this call:

net = ResNet(input_depth, 3, 10, 16, 1, nn.BatchNorm2d, False)

You are passing nn.BatchNorm2d as the sixth positional argument, but the sixth parameter of the ResNet constructor (ignoring self) is act_fun.

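To see why that misalignment produces this exact message, here is a minimal sketch. The only assumptions are the nn.BatchNorm2d signature and the act helper behaving as the traceback shows, where it simply calls act_fun():

import torch.nn as nn

# With the arguments misaligned, nn.BatchNorm2d ends up bound to act_fun, and
# the `return act_fun()` line in common.py calls the constructor with no
# arguments. BatchNorm2d requires num_features as its first argument.
act_fun = nn.BatchNorm2d
try:
    act_fun()
except TypeError as e:
    print(e)  # __init__() missing 1 required positional argument: 'num_features'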
It looks like a better call would be:

net = ResNet(input_depth, 3, 10, 16, 1, False, norm_layer=nn.BatchNorm2d)

This still has one extra positional argument, and I'm not sure where input_depth is supposed to go. The ResNet constructor expects four integer positional arguments (num_input_channels, num_output_channels, num_blocks, num_channels), followed by keyword arguments such as need_residual. Even with norm_layer fixed, you are still passing five integer arguments before the boolean.
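If it helps, a quick way to see how the positional arguments line up is to print the constructor's signature. This is only a small sketch; the import path is an assumption based on the /content/models/resnet.py path in the traceback:

import inspect
from models.resnet import ResNet  # assumed layout, per /content/models/resnet.py in the traceback

# Prints every parameter of ResNet.__init__ in declaration order, which shows
# exactly which slot each positional argument of the failing call lands in.
print(inspect.signature(ResNet.__init__))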

It seems that whatever act_fun refers to is a constructor that takes a required positional argument you are not passing. It's hard to say more, because act_fun lives in common.py, which you use but have not shared.

@AndrewEckart Thank you. However, act_fun='LeakyReLU' is passed in the ResNet class. This code is taken from this repository, if you want to take a look. When I run the code with the change you mentioned, I get the error TypeError: 'bool' object is not callable. You can see the notebook I'm trying to reproduce; everything works fine for the skip model at get_net, but when I try to change the model to resnet I get the error. @AndrewEckart

Like I said, there is still a mismatched argument. You are passing five integers: input_depth, 3, 10, 16, 1. The constructor expects four: num_input_channels, num_output_channels, num_blocks, num_channels. I can't tell you how to fix it, because I don't know which one is extraneous. To be fair, the author of this repository has marked this part as TODO, so I'm not sure it works: maybe try removing input_depth, keeping the other four, and see if that works.
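For what it's worth, that last suggestion taken literally would look something like the sketch below. It is untested: the values are simply the remaining four integers from the original call, the mapping is only a guess, and the repository marks this code path as TODO:

import torch.nn as nn
from models.resnet import ResNet  # same assumed layout as above

# Untested sketch of the suggestion: drop input_depth so the remaining four
# integers line up with num_input_channels, num_output_channels, num_blocks
# and num_channels, and pass the normalisation layer by keyword.
net = ResNet(3, 10, 16, 1, norm_layer=nn.BatchNorm2d)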