NumPy: how to get an array of predictions from a TensorFlow classification model

I have the following classification model. I want to get a numpy array similar to y_t, which holds the one-hot encoded test labels. However, I keep getting a variable error.

# Construct placeholders
with graph.as_default():
    inputs_ = tf.placeholder(tf.float32, [None, seq_len, n_channels], name = 'inputs')
    labels_ = tf.placeholder(tf.float32, [None, n_classes], name = 'labels')
    keep_prob_ = tf.placeholder(tf.float32, name = 'keep')
    learning_rate_ = tf.placeholder(tf.float32, name = 'learning_rate')

with graph.as_default():
    # (batch, 100, 3) --> (batch, 50, 6)
    conv1 = tf.layers.conv1d(inputs=inputs_, filters=6, kernel_size=2, strides=1, 
                             padding='same', activation = tf.nn.relu)
    max_pool_1 = tf.layers.max_pooling1d(inputs=conv1, pool_size=2, strides=2, padding='same')

with graph.as_default():
    # Flatten and add dropout
    flat = tf.reshape(max_pool_1, (-1, 6*6))
    flat = tf.nn.dropout(flat, keep_prob=keep_prob_)

    # Predictions
    logits = tf.layers.dense(flat, n_classes)

    # Cost function and optimizer
    cost = tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits(logits=logits, labels=labels_))
    optimizer = tf.train.AdamOptimizer(learning_rate_).minimize(cost)

    # Accuracy
    correct_pred = tf.equal(tf.argmax(logits, 1), tf.argmax(labels_, 1))
    accuracy = tf.reduce_mean(tf.cast(correct_pred, tf.float32), name='accuracy')
Then I use the test set:

with tf.Session(graph=graph) as sess:
    # Restore
    saver.restore(sess, tf.train.latest_checkpoint('bschkpnt-cnn'))

    for x_t, y_t in get_batches(X_test, y_test, batch_size):
        feed = {inputs_: x_t,
                labels_: y_t,
                keep_prob_: 1}

        batch_acc = sess.run(accuracy, feed_dict=feed)
        test_acc.append(batch_acc)
    print("Test accuracy: {:.6f}".format(np.mean(test_acc)))
y_t is an n x 3 numpy array. I would like to get y_pred in a similar format.

Thanks

soft = tf.nn.softmax(logits)

This will be your probability distribution, so that sum(soft) = 1 for each sample. Each value in this array indicates how certain the model is about that class.

pred = sess.run(soft, feed_dict=feed)

print(pred)

So basically, what I did was add an extra softmax node: softmax is built into the loss you compute during training, so you have to apply it again yourself to get predictions. Then I ask the session for that output, passing in the same feed_dict again.
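
If you need y_pred as a one-hot n x 3 array in the same format as y_t, rather than raw probabilities, a minimal follow-up sketch (assuming pred has shape (n, n_classes)) is to take the argmax of each row and re-encode it with an identity matrix:

import numpy as np

# pred: softmax output of shape (n, n_classes); each row sums to 1
class_ids = np.argmax(pred, axis=1)        # most likely class index per sample
y_pred = np.eye(pred.shape[1])[class_ids]  # one-hot rows, same layout as y_t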

Hope this helps.
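
For completeness, here is a sketch of a full restore-and-predict loop that stacks the per-batch softmax outputs into a single (n, n_classes) array. It is only an illustration of the idea above, not your exact code, and it assumes graph, logits, inputs_, keep_prob_, saver, get_batches, X_test, y_test and batch_size are defined as in the question:

import numpy as np
import tensorflow as tf

with graph.as_default():
    soft = tf.nn.softmax(logits)  # per-class probabilities, rows sum to 1

with tf.Session(graph=graph) as sess:
    # Restore the trained weights, as in the question
    saver.restore(sess, tf.train.latest_checkpoint('bschkpnt-cnn'))

    batch_preds = []
    for x_t, y_t in get_batches(X_test, y_test, batch_size):
        # labels_ is not needed just to evaluate the softmax node
        feed = {inputs_: x_t, keep_prob_: 1}
        batch_preds.append(sess.run(soft, feed_dict=feed))

probs = np.vstack(batch_preds)  # all test predictions stacked into one (n, n_classes) array

probs can then be one-hot encoded with the argmax/np.eye step shown above if you want an array in exactly y_t's format.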