Python: Change a single weight in a NN


I'm running a (somewhat silly) experiment and want to manually deactivate certain neurons in a network. From what I've read, the best way to do this is either with a mask or by adjusting the weights. For the latter I can print the value of an individual neuron, but now I want to "set" it. The problem is that I can't simply write tensor = 0.0, because the tensor also carries a shape and a dtype. Any ideas?

You can change the value using "assign", like this:

import tensorflow as tf
from tensorflow import keras
import numpy as np

model = keras.Sequential([
    keras.layers.Flatten(input_shape=(2,)),
    keras.layers.Dense(20, activation=tf.nn.relu),
    keras.layers.Dense(20, activation=tf.nn.relu),
    keras.layers.Dense(1)
])


# an individual weight looks like this:
print(model.weights[4][0])
# returns tf.Tensor([0.3985532], shape=(1,), dtype=float32)

weights = model.weights[4].numpy()  # get the kernel weights of the layer as a numpy array
weights[0] = 0                      # set model.weights[4][0] to zero
model.weights[4].assign(weights)    # write the modified array back into the variable
print(model.weights[4][0])          # displays tf.Tensor([0.], shape=(1,), dtype=float32)
You can also combine "assign" with "tensor_scatter_nd_update", like this:

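A minimal sketch of that scatter-based variant, assuming the same model as above, where model.weights[4] is the (20, 1) kernel of the final Dense layer and [0, 0] is an arbitrarily chosen element:

import tensorflow as tf

# tf.tensor_scatter_nd_update returns a new tensor with the listed elements
# replaced; it does not modify the variable in place, so assign is still needed.
updated = tf.tensor_scatter_nd_update(
    model.weights[4],
    indices=[[0, 0]],  # full index of the element to overwrite
    updates=[0.0],     # new value for that element
)
model.weights[4].assign(updated)

print(model.weights[4][0])  # tf.Tensor([0.], shape=(1,), dtype=float32)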

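Since the original goal was to deactivate whole neurons rather than single weights, the same numpy/assign pattern can be extended to silence an entire neuron. A minimal sketch, assuming the model above and an illustrative neuron index j in the second hidden Dense layer (model.weights[2] is its (20, 20) kernel and model.weights[3] its (20,) bias):

j = 5  # illustrative index of the neuron to deactivate

kernel = model.weights[2].numpy()  # incoming weights of the second hidden layer
bias = model.weights[3].numpy()    # bias of the second hidden layer

kernel[:, j] = 0.0  # zero every weight feeding neuron j
bias[j] = 0.0       # zero its bias, so its pre-activation is always 0 and relu(0) = 0

model.weights[2].assign(kernel)
model.weights[3].assign(bias)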
You can get the weights as a numpy array this way, modify them in that state, and then set the weights again.
Amazing, that is exactly what I wanted. Thank you very much for your time!