Python: remove the bias from a neural network layer


I want to remove the bias parameter. I tried passing bias=None where I define the neural network, but without success:

net1 = NeuralNet(
    layers=[  # three layers: one hidden layer
        ('input', layers.InputLayer),
        #('hidden', layers.DenseLayer),
        ('output', layers.DenseLayer),
        ],
    # layer parameters:
    input_shape=(None, 2),  # 2 inputs
    #hidden_num_units=200,  # number of units in hidden layer
    output_nonlinearity=None,  # output layer uses identity function
    output_num_units=1,  # 1 target value

    # optimization method:
    update=nesterov_momentum,
    update_learning_rate=0.01,
    update_momentum=0.9,

    regression=True,  # flag to indicate we're dealing with regression problem
    max_epochs=400,  # we want to train this many epochs
    verbose=1,
    bias=None,
)
According to the documentation (it is similar for the dense layer), the option you have for the bias is:

b = None 
At least according to the Lasagne documentation, none of the layers seem to have a "bias" parameter; "b" is used instead. I can't speak for nolearn, since I don't use that package.

Edit:

Here is some Lasagne sample code:

import lasagne

net = {}
net['input'] = lasagne.layers.InputLayer(shape=(None, 3, 224, 224), input_var=None)
net['conv'] = lasagne.layers.Conv2DLayer(net['input'], num_filters=5, filter_size=3, b=None)
print(net['conv'].get_params())
This returns:

[W]
on its own, meaning there is no bias.
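For contrast, a minimal sketch of the default case: leave b alone and the layer creates a bias of its own, so get_params() should report both parameters.

import lasagne

# Same convolutional layer, but with the default bias: b is created automatically
net = {}
net['input'] = lasagne.layers.InputLayer(shape=(None, 3, 224, 224), input_var=None)
net['conv'] = lasagne.layers.Conv2DLayer(net['input'], num_filters=5, filter_size=3)
print(net['conv'].get_params())  # [W, b] -- weight and bias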

As for nolearn, I'm not sure, since I don't use that package:

# Build the network yourself
from lasagne.layers import InputLayer, DenseLayer
from lasagne.updates import nesterov_momentum
from nolearn.lasagne import NeuralNet

inputs = InputLayer(shape=(None, 2))
network = DenseLayer(inputs, num_units=1, nonlinearity=None, b=None)

net1 = NeuralNet(
    network,
    # We don't need any of these parameters since we provided them above
    # layer parameters:
    #input_shape=(None, 2),  # 2 inputs
    #hidden_num_units=200,  # number of units in hidden layer
    #output_nonlinearity=None,  # output layer uses identity function
    #output_num_units=1,  # 1 target value

    # optimization method:
    update=nesterov_momentum,
    update_learning_rate=0.01,
    update_momentum=0.9,

    regression=True,  # flag to indicate we're dealing with regression problem
    max_epochs=400,  # we want to train this many epochs
    verbose=1,
    #bias=None,  # not needed: the bias was already removed on the layer via b=None
)

I think this should work. There may be a keyword argument for passing the network (I don't remember its name), but I believe that, by default, it is the first positional argument if nothing else is given.
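If you want the keyword spelled out, here is a hedged sketch assuming the parameter is named layers; the answer above only confirms that the pre-built Lasagne layer can be passed as the first argument.

# Assumption: nolearn's NeuralNet takes the layer stack through a parameter named `layers`
net1 = NeuralNet(
    layers=network,
    update=nesterov_momentum,
    update_learning_rate=0.01,
    update_momentum=0.9,
    regression=True,
    max_epochs=400,
    verbose=1,
)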

Comments:

"When using b=None I got: ValueError: Unused kwarg: b."

"I've updated my answer with some Lasagne sample code. I don't use nolearn; I provided a Lasagne answer because your question includes the lasagne tag. Hopefully it is still useful to you."

"I remember where I saw this... in the source code :)"
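Given that error message and the naming convention already visible in the question (output_nonlinearity, output_num_units), a heavily hedged sketch: if nolearn forwards keyword arguments of the form layername_param to the layer with that name, then output_b=None should reach the output DenseLayer as b=None and remove its bias. This is an assumption about nolearn's kwarg handling, not something confirmed in the thread.

# Unverified sketch: assumes nolearn maps output_b onto the 'output' layer's b argument
from lasagne import layers
from lasagne.updates import nesterov_momentum
from nolearn.lasagne import NeuralNet

net1 = NeuralNet(
    layers=[
        ('input', layers.InputLayer),
        ('output', layers.DenseLayer),
        ],
    input_shape=(None, 2),     # 2 inputs
    output_num_units=1,        # 1 target value
    output_nonlinearity=None,  # identity output
    output_b=None,             # assumption: passed to the output layer as b=None

    update=nesterov_momentum,
    update_learning_rate=0.01,
    update_momentum=0.9,
    regression=True,
    max_epochs=400,
    verbose=1,
)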