Python: How to initialize the weights of different layers of an nn.Sequential block in PyTorch with different initializations?


Suppose I have an nn.Sequential block with two linear layers. I want to initialize the weights of the first layer from a uniform distribution, but initialize the weights of the second layer to the constant 2.0:

net = nn.Sequential()
net.add_module('Linear_1', nn.Linear(2, 5, bias=False))
net.add_module('Linear_2', nn.Linear(5, 5, bias=False))

Here is one way to do it:

import torch
import torch.nn as nn

net = nn.Sequential()

# First layer: weights drawn from a uniform distribution on [0, 1).
ll1 = nn.Linear(2, 5, bias=False)
torch.nn.init.uniform_(ll1.weight, a=0, b=1)  # a: lower bound, b: upper bound
net.add_module('Linear_1', ll1)
print(ll1.weight)

# Second layer: all weights set to the constant 2.0.
ll2 = nn.Linear(5, 5, bias=False)
torch.nn.init.constant_(ll2.weight, 2.0)
net.add_module('Linear_2', ll2)
print(ll2.weight)

print(net)
Output:

Parameter containing:
tensor([[0.2549, 0.7823],
        [0.3439, 0.4721],
        [0.0709, 0.6447],
        [0.3969, 0.7849],
        [0.7631, 0.5465]], requires_grad=True)

Parameter containing:
tensor([[2., 2., 2., 2., 2.],
        [2., 2., 2., 2., 2.],
        [2., 2., 2., 2., 2.],
        [2., 2., 2., 2., 2.],
        [2., 2., 2., 2., 2.]], requires_grad=True)

Sequential(
  (Linear_1): Linear(in_features=2, out_features=5, bias=False)
  (Linear_2): Linear(in_features=5, out_features=5, bias=False)
)

Once the modules are defined, you can also use

torch.nn.init.constant_(net.Linear_1.weight, 0.0)  # all weights set to 0.0
torch.nn.init.xavier_normal_(net.Linear_2.weight)  # Xavier (Glorot) normal init
to initialize different layers with different kinds of initializations.
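
With more layers, one way to keep this organized (a sketch along the same lines, reusing the net from above) is to select the initializer by layer name via named_children():

for name, module in net.named_children():
    # Pick an initialization scheme per layer name.
    if name == 'Linear_1':
        torch.nn.init.uniform_(module.weight, a=0, b=1)
    elif name == 'Linear_2':
        torch.nn.init.constant_(module.weight, 2.0)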