How does TensorFlow's tf.train.shuffle_batch work?


Does it shuffle once per epoch, or does it do something else?

What is the difference between tf.train.shuffle_batch and tf.train.batch?


Can someone explain? Thanks.

First, have a look at the documentation for tf.train.batch and tf.train.shuffle_batch. Internally, batch is built around a FIFOQueue, while shuffle_batch is built around a RandomShuffleQueue.
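
To see those two queues directly, here is a minimal hand-driven sketch (a sketch only, assuming the TF 1.x graph-mode API; the capacity and min_after_dequeue values simply mirror the toy example below and are not what the helpers literally construct internally):

import tensorflow as tf

# Illustrative only: the two queue types behind batch and shuffle_batch,
# filled and drained by hand instead of by queue runners.
fifo = tf.FIFOQueue(capacity=100, dtypes=[tf.int32], shapes=[[]])
shuffle_q = tf.RandomShuffleQueue(capacity=100, min_after_dequeue=10,
                                  dtypes=[tf.int32], shapes=[[]])

values = tf.range(1, 101)
enqueue_ops = [fifo.enqueue_many([values]), shuffle_q.enqueue_many([values])]

with tf.Session() as sess:
    sess.run(enqueue_ops)
    # FIFOQueue returns elements in enqueue order; RandomShuffleQueue returns
    # random elements as long as more than min_after_dequeue items remain.
    print(sess.run(fifo.dequeue_many(10)))       # 1 .. 10, in order
    print(sess.run(shuffle_q.dequeue_many(10)))  # 10 values in random order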

Consider the following toy example, which puts the numbers 1 to 100 into a constant, feeds it through both tf.train.shuffle_batch and tf.train.batch, and then prints the results:

import tensorflow as tf
import numpy as np

# The integers 1..100, fed as a single constant; enqueue_many=True makes the
# queue runners enqueue each element individually.
data = np.arange(1, 100 + 1)
data_input = tf.constant(data)

# shuffle_batch dequeues random elements from a RandomShuffleQueue.
batch_shuffle = tf.train.shuffle_batch(
    [data_input], enqueue_many=True, batch_size=10, capacity=100,
    min_after_dequeue=10, allow_smaller_final_batch=True)

# batch dequeues elements in FIFO order.
batch_no_shuffle = tf.train.batch(
    [data_input], enqueue_many=True, batch_size=10, capacity=100,
    allow_smaller_final_batch=True)

with tf.Session() as sess:
    coord = tf.train.Coordinator()
    threads = tf.train.start_queue_runners(coord=coord)
    for i in range(10):
        print(i, sess.run([batch_shuffle, batch_no_shuffle]))
    coord.request_stop()
    coord.join(threads)
This produces:

0 [array([23, 48, 15, 46, 78, 89, 18, 37, 88,  4]), array([ 1,  2,  3,  4,  5,  6,  7,  8,  9, 10])]
1 [array([80, 10,  5, 76, 50, 53,  1, 72, 67, 14]), array([11, 12, 13, 14, 15, 16, 17, 18, 19, 20])]
2 [array([11, 85, 56, 21, 86, 12,  9,  7, 24,  1]), array([21, 22, 23, 24, 25, 26, 27, 28, 29, 30])]
3 [array([ 8, 79, 90, 81, 71,  2, 20, 63, 73, 26]), array([31, 32, 33, 34, 35, 36, 37, 38, 39, 40])]
4 [array([84, 82, 33,  6, 39,  6, 25, 19, 19, 34]), array([41, 42, 43, 44, 45, 46, 47, 48, 49, 50])]
5 [array([27, 41, 21, 37, 60, 16, 12, 16, 24, 57]), array([51, 52, 53, 54, 55, 56, 57, 58, 59, 60])]
6 [array([69, 40, 52, 55, 29, 15, 45,  4,  7, 42]), array([61, 62, 63, 64, 65, 66, 67, 68, 69, 70])]
7 [array([61, 30, 53, 95, 22, 33, 10, 34, 41, 13]), array([71, 72, 73, 74, 75, 76, 77, 78, 79, 80])]
8 [array([45, 52, 57, 35, 70, 51,  8, 94, 68, 47]), array([81, 82, 83, 84, 85, 86, 87, 88, 89, 90])]
9 [array([35, 28, 83, 65, 80, 84, 71, 72, 26, 77]), array([ 91,  92,  93,  94,  95,  96,  97,  98,  99, 100])]

tf.train.shuffle_batch() shuffles at every epoch.

I'm voting to close this question because the asker did not even try to read the documentation.

Please be specific. For example: does it shuffle the contents within each batch without changing the batches? Does it shuffle all items across batches? Does it change the order of the batches?

Note, since you mention "reading the documentation": this function is now listed as deprecated, and the current recommendation is to use tf.data.Dataset.shuffle(min_after_dequeue).batch(batch_size).
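
For reference, here is a rough sketch of that tf.data replacement (assuming TF 1.4 or later; the buffer and batch sizes simply mirror the toy example above and are otherwise arbitrary):

import tensorflow as tf

# Sketch of the suggested tf.data pipeline: shuffle from a 100-element buffer,
# then batch. In TF 2.x you would iterate the dataset directly instead of
# using an iterator and a Session.
dataset = (tf.data.Dataset.range(1, 101)
           .shuffle(buffer_size=100)
           .batch(10))

iterator = dataset.make_one_shot_iterator()
next_batch = iterator.get_next()

with tf.Session() as sess:
    for i in range(10):
        print(i, sess.run(next_batch))

Note that Dataset.shuffle reshuffles its buffer on each pass over the data by default (reshuffle_each_iteration=True), which addresses the original per-epoch shuffling question.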