
Python 3.x: passing multiple values through the same feed-forward network in TensorFlow


I am trying to pass 3 values through the same network at once, because I need the values of all 3 vectors to compute the triplet loss. But when I pass the second value, it throws an error.

The code snippet is:

# runs the siamese network
def forward_prop(x):
    w1 = tf.get_variable("w1", [n1, 2048], initializer=tf.contrib.layers.xavier_initializer()) * 0.01
    b1 = tf.get_variable("b1", [n1, 1], initializer=tf.zeros_initializer())*0.01
    z1 = tf.add(tf.matmul(w1, x), b1)   # n1*2048 x 2048*batch_size = n1*batch_size
    a1 = tf.nn.relu(z1)    # n1*batch_size

    w2 = tf.get_variable("w2", [n2, n1], initializer=tf.contrib.layers.xavier_initializer()) * 0.01
    b2 = tf.get_variable("b2", [n2, 1], initializer=tf.zeros_initializer()) * 0.01
    z2 = tf.add(tf.matmul(w2, a1), b2)   # n2*n1 x n1*batch_size = n2*batch_size
    a2 = tf.nn.relu(z2)    # n2*batch_size

    w3 = tf.get_variable("w3", [n3, n2], initializer=tf.contrib.layers.xavier_initializer()) * 0.01
    b3 = tf.get_variable("b3", [n3, 1], initializer=tf.zeros_initializer()) * 0.01
    z3 = tf.add(tf.matmul(w3, a2), b3)   # n3*n2 x n2*batch_size = n3*batch_size
    a3 = tf.nn.relu(z3)    # n3*batch_size

    w4 = tf.get_variable("w4", [n4, n3], initializer=tf.contrib.layers.xavier_initializer()) * 0.01
    b4 = tf.get_variable("b4", [n4, 1], initializer=tf.zeros_initializer()) * 0.01
    z4 = tf.add(tf.matmul(w4, a3), b4)   # n4*n3 x n3*batch_size = n4*batch_size
    a4 = tf.nn.relu(z4)    # n4*batch_size = 128*batch_size (128 feature vectors for all training examples)

    return a4

def back_prop():
    anchor_embeddings = forward_prop(x1)
    positive_embeddings = forward_prop(x2)
    negative_embeddings = forward_prop(x3)

    # finding sum of squares of distances
    distance_positive = tf.reduce_sum(tf.square(anchor_embeddings - positive_embeddings), 0)
    distance_negative = tf.reduce_sum(tf.square(anchor_embeddings - negative_embeddings), 0)

    # applying the triplet loss equation
    triplet_loss = tf.maximum(0., distance_positive - distance_negative + margin)
    triplet_loss = tf.reduce_mean(triplet_loss)
    optimizer = tf.train.AdamOptimizer(learning_rate=learning_rate).minimize(triplet_loss)

    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())

        feed_dict = {
            x1: anchors,
            x2: positives,
            x3: negatives
        }

        print("Starting the Siamese network...")
        for epoch in range(total_epochs_net_1):
            for _ in range(len(anchors)):
                _, loss_val = sess.run([optimizer, triplet_loss], feed_dict=feed_dict)

            print("Epoch", epoch, "completed out of", total_epochs_net_1)

        saver = tf.train.Saver()
        saver.save(sess, 'face_recognition_model')
I get the error at the following lines:

positive_embeddings = forward_prop(x2)
anchor_embeddings = forward_prop(x1)
The tf.get_variable calls inside the forward_prop() function throw the error.

The error says:

ValueError: Variable w1 already exists, disallowed. Did you mean to set reuse=True or reuse=tf.AUTO_REUSE in VarScope?
I think this is because the variable w1, defined during the first call to forward_prop(), already exists by the time forward_prop() is called again in these lines:

positive_embeddings = forward_prop(x2)
anchor_embeddings = forward_prop(x1)

How do I solve this? I cannot pass the three values separately, because I need all three of them to compute the triplet loss. Any help would be appreciated. Thanks.

You have misconfigured your network here:

def back_prop():
    anchor_embeddings = forward_prop(x1)
    positive_embeddings = forward_prop(x2)
    negative_embeddings = forward_prop(x3)
You should define only one network. By calling forward_prop three times you are mistakenly creating three separate sets of variables, one per input, effectively defining three neural networks here (which is also why the error message suggests reuse=tf.AUTO_REUSE: without explicit variable sharing, each call to tf.get_variable tries to create the variables again).
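To make the "one network, three inputs" idea concrete, here is a minimal NumPy stand-in (the layer sizes are hypothetical, not the question's n1..n4): the parameters are created exactly once, so repeated calls to forward_prop reuse the same weights instead of creating new ones the way repeated tf.get_variable calls do.

```python
import numpy as np

rng = np.random.default_rng(0)

# Create ONE set of parameters (the single shared network).
w = rng.standard_normal((4, 8)) * 0.01   # hypothetical layer: 8 inputs -> 4 units
b = np.zeros((4, 1))

def forward_prop(x):
    # Uses the shared w and b; calling this three times does NOT
    # create three networks.
    return np.maximum(0.0, w @ x + b)    # ReLU(w.x + b)

x1 = rng.standard_normal((8, 1))  # anchor
x2 = rng.standard_normal((8, 1))  # positive
x3 = rng.standard_normal((8, 1))  # negative

anchor_emb = forward_prop(x1)
positive_emb = forward_prop(x2)
negative_emb = forward_prop(x3)
```

In TF 1.x terms, the analogous fix for the ValueError itself is to build the variables inside a variable scope with sharing enabled (the reuse=tf.AUTO_REUSE option named by the error message), but as discussed below, batching the three inputs avoids the issue entirely.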

For the triplet loss, what you want to do is feed the 3 inputs into a single network as one batch (so all 3 inputs are processed by the same network), not as individual variables. For this discussion I will assume your inputs are images and that you train on one set of 3 inputs per training step.

If your images are of size 256x256x1 (grayscale), a single triplet batch has shape [3 x 256 x 256 x 1]. Your output will then have shape [3 x your_output_layer_size]. Your loss function should now understand that the first axis represents your 3 values: anchor, positive, negative. Compute the loss accordingly.
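The slicing step can be sketched as follows, with NumPy standing in for TensorFlow (the embedding size 128 matches the question's snippet; the margin value is a hypothetical choice):

```python
import numpy as np

rng = np.random.default_rng(1)
margin = 0.2  # hypothetical margin

# Network output for one triplet batch: shape [3, output_size].
embeddings = rng.standard_normal((3, 128))

# Axis 0 holds anchor, positive, negative, in that order.
anchor, positive, negative = embeddings[0], embeddings[1], embeddings[2]

# Sums of squared distances, as in the question's code.
distance_positive = np.sum((anchor - positive) ** 2)
distance_negative = np.sum((anchor - negative) ** 2)

# Hinged triplet loss.
triplet_loss = max(0.0, distance_positive - distance_negative + margin)
```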

Of course, you can pass multiple anchors, positives, and negatives; you just have to handle the more complicated details in the loss function, and that is entirely doable. My triplet loss functions have become quite complex, so I suggest keeping things simple to start with.
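For that more general case, one possible layout (an assumption of this sketch, not the answerer's exact code) is a batch of N triplets stacked as shape [N, 3, output_size], with the hinge applied per triplet and the loss averaged over the batch, mirroring the tf.maximum + tf.reduce_mean pattern in the question:

```python
import numpy as np

rng = np.random.default_rng(2)
margin = 0.2                    # hypothetical margin
n_triplets, emb_size = 5, 128

# Batch of triplets: axis 1 holds (anchor, positive, negative).
batch = rng.standard_normal((n_triplets, 3, emb_size))
anchors, positives, negatives = batch[:, 0], batch[:, 1], batch[:, 2]

# Per-triplet squared distances, vectorized over the batch axis.
d_pos = np.sum((anchors - positives) ** 2, axis=1)   # shape [N]
d_neg = np.sum((anchors - negatives) ** 2, axis=1)   # shape [N]

# Hinge per triplet, then mean over the batch.
per_triplet = np.maximum(0.0, d_pos - d_neg + margin)
mean_loss = per_triplet.mean()
```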