
Python: Extending the nn.Sequential class


I'm quite new to OOP in Python, and generally rusty. I want to extend PyTorch's `nn.Sequential` object so that it can be passed a tuple containing the number of nodes in each layer and automatically build the `OrderedDict` from it. A working example of the functionality:

from collections import OrderedDict

from torch import nn

layers = (784, 392, 196, 98, 10)
n_layers = len(layers)
modules = OrderedDict()

# Layer definitions for inner layers:
for i in range(n_layers - 2):
    modules[f'fc{i}']   = nn.Linear(layers[i], layers[i+1])
    modules[f'relu{i}'] = nn.ReLU()

# Definition for output layer:
modules['fc_out'] = nn.Linear(layers[-2], layers[-1])
modules['smax_out'] = nn.LogSoftmax(dim=1)

# Define model and check attributes:
model = nn.Sequential(modules)
So instead of passing an `OrderedDict` object when initializing `nn.Sequential`, I'd like my class to take the tuple:

class Network(nn.Sequential):
    def __init__(self, n_nodes):
        super().__init__()

        # *** INSERT LOGIC FROM LAST SNIPPET ***
This doesn't seem to work, because when my `Network` class calls `super().__init__()`, it expects to see the dictionary of layers and activations. How can I write my class so that it gets around this but still has all the functionality of PyTorch's `Sequential` object?

My idea was something like this:

class Network(nn.Sequential):
    def __init__(self, layers):
        super().__init__(self.init_modules(layers))


    def init_modules(self, layers):
        n_layers = len(layers)
        modules = OrderedDict()

        # Layer definitions for inner layers:
        for i in range(n_layers - 2):
            modules[f'fc{i}']   = nn.Linear(layers[i], layers[i+1])
            modules[f'relu{i}'] = nn.ReLU()

        # Definition for output layer:
        modules['fc_out'] = nn.Linear(layers[-2], layers[-1])
        modules['smax_out'] = nn.LogSoftmax(dim=1)

        return modules

I'm not sure whether this is allowed and/or good practice in Python.
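For reference, here is a quick sanity check of the approach sketched above as a self-contained script (assuming `torch` is installed; the input sizes and batch size are just illustrative):

```python
from collections import OrderedDict

import torch
from torch import nn


class Network(nn.Sequential):
    """nn.Sequential subclass that builds its modules from a tuple of layer sizes."""

    def __init__(self, layers):
        # init_modules runs first (it only builds a plain OrderedDict),
        # then nn.Sequential.__init__ registers the modules as usual.
        super().__init__(self.init_modules(layers))

    def init_modules(self, layers):
        n_layers = len(layers)
        modules = OrderedDict()
        # Inner layers: Linear followed by ReLU
        for i in range(n_layers - 2):
            modules[f'fc{i}'] = nn.Linear(layers[i], layers[i + 1])
            modules[f'relu{i}'] = nn.ReLU()
        # Output layer: Linear followed by log-softmax
        modules['fc_out'] = nn.Linear(layers[-2], layers[-1])
        modules['smax_out'] = nn.LogSoftmax(dim=1)
        return modules


model = Network((784, 392, 196, 98, 10))
out = model(torch.randn(3, 784))
print(out.shape)  # torch.Size([3, 10])
```

This works because `self.init_modules(...)` only constructs and returns a plain `OrderedDict`; it doesn't touch any `nn.Module` internals before `super().__init__()` runs.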

Your implementation is allowed, and it is fine.

Also, you could call `super().__init__()` with no arguments and then append the `Linear`, `ReLU`, or any other modules in a loop with `self.add_module(key, module)`. That way `__init__` itself can take over the work of `init_modules`.
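A minimal sketch of that alternative (assuming `torch` is installed; the class and key names mirror the question and are otherwise arbitrary):

```python
import torch
from torch import nn


class Network(nn.Sequential):
    def __init__(self, layers):
        super().__init__()  # start from an empty Sequential
        n_layers = len(layers)
        # Inner layers: register each module by name as we go
        for i in range(n_layers - 2):
            self.add_module(f'fc{i}', nn.Linear(layers[i], layers[i + 1]))
            self.add_module(f'relu{i}', nn.ReLU())
        # Output layer: Linear followed by log-softmax
        self.add_module('fc_out', nn.Linear(layers[-2], layers[-1]))
        self.add_module('smax_out', nn.LogSoftmax(dim=1))


model = Network((784, 392, 196, 98, 10))
out = model(torch.randn(2, 784))
print(out.shape)  # torch.Size([2, 10])
```

Since `nn.Sequential` runs its registered children in insertion order during `forward`, this behaves identically to the `OrderedDict` version.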