How to optimize the memory usage of a for loop in TensorFlow?

Tags: for-loop, tensorflow, loss-function

Taking mean squared error (MSE) as an example, the loss function is typically defined like this:

import tensorflow as tf

def exp_loss(batch_p, batch_t):
    loss_val = tf.reduce_mean(tf.squared_difference(batch_p, batch_t))
    return loss_val
But when I compute the error element by element with for loops, as follows:

def exp_loss_for(batch_p, batch_t):
    loss_val = 0
    ns = int(batch_p.get_shape()[0])  # batch_size
    sl = int(batch_p.get_shape()[1])  # sequence_length
    nd = int(batch_p.get_shape()[2])  # num_dim
    for i in range(ns):
        for j in range(sl):
            for k in range(nd):
                loss_val += tf.square(tf.subtract(batch_p[i, j, k], batch_t[i, j, k]))
    loss_val = loss_val / (ns * sl * nd)
    return loss_val
TensorFlow consumes far too much memory during the graph-construction phase.
If I have to write my own loss function with for loops, like `exp_loss_for` but more complicated, is there any way to reduce the memory usage?
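The memory blow-up can be made concrete by counting graph nodes: every Python-level loop iteration adds fresh slice, subtract, square, and add ops to the graph. A minimal sketch, assuming TensorFlow 2.x (where `tf.Graph` still builds TF1-style graphs) and a small `[2, 2, 2]` batch:

```python
import tensorflow as tf

def count_ops(build_fn):
    """Build ops into a fresh graph and return how many nodes it contains."""
    g = tf.Graph()
    with g.as_default():
        build_fn()
    return len(g.get_operations())

def vectorized():
    p = tf.zeros([2, 2, 2])
    t = tf.ones([2, 2, 2])
    tf.reduce_mean(tf.math.squared_difference(p, t))

def looped():
    p = tf.zeros([2, 2, 2])
    t = tf.ones([2, 2, 2])
    loss = 0.0
    for i in range(2):          # each iteration adds several ops to the graph
        for j in range(2):
            for k in range(2):
                loss += tf.square(tf.subtract(p[i, j, k], t[i, j, k]))

print(count_ops(vectorized), count_ops(looped))
```

The looped graph is already far larger for a `[2, 2, 2]` tensor, and the node count grows linearly with `ns * sl * nd`, while the vectorized graph stays a fixed handful of ops.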

Avoid the for loops. Even more complex loss functions can usually be expressed with TensorFlow operations, without any Python-level loops: each loop iteration adds new nodes to the graph, whereas vectorized ops add a fixed number of nodes regardless of tensor size.
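For instance, a per-dimension weighted squared error (a hypothetical `weighted_exp_loss`, not from the question, chosen as a "more complex" loss) can be written with broadcasting instead of three nested loops:

```python
import tensorflow as tf

def weighted_exp_loss(batch_p, batch_t, dim_weights):
    # dim_weights has shape [num_dim] and broadcasts across the
    # batch and sequence axes of the [ns, sl, nd] inputs.
    sq = tf.math.squared_difference(batch_p, batch_t)
    return tf.reduce_mean(sq * dim_weights)
```

With `dim_weights = tf.ones([nd])` this reduces to the plain `exp_loss` above, and it creates only a handful of graph nodes no matter how large the batch is.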