TensorFlow: efficient reduce_mean based on sequence length information


I guess this kind of generic code would always work:

loss = tf.nn.sparse_softmax_cross_entropy_with_logits(logits=out, labels=labels)  # shape (batch, time)
mask = tf.cast(tf.sequence_mask(seq_lens, maxlen=tf.shape(loss)[1]), loss.dtype)  # bool -> float
reduced_loss = tf.reduce_sum(mask * loss) / tf.cast(tf.reduce_sum(seq_lens), loss.dtype)
Is there a more efficient way?
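
As a side note (my addition, not part of the original question): tf.sequence_mask returns a boolean tensor, so it has to be cast before it can be multiplied with the float loss. A tiny sketch, assuming TensorFlow 1.x graph mode and made-up sequence lengths:

import tensorflow as tf

seq_lens = tf.constant([3, 1], dtype=tf.int32)
mask = tf.sequence_mask(seq_lens, maxlen=4)  # bool, shape (2, 4)
mask_f = tf.cast(mask, tf.float32)           # needed before computing mask * loss

with tf.Session() as sess:
  print(sess.run(mask))    # [[ True  True  True False]
                           #  [ True False False False]]
  print(sess.run(mask_f))  # [[1. 1. 1. 0.]
                           #  [1. 0. 0. 0.]]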

This is what I am doing now:

def flatten_with_seq_len_mask(x, seq_lens):
  """
  :param tf.Tensor x: shape (batch,time,...s...)
  :param tf.Tensor seq_lens: shape (batch,) of int64
  :return: tensor of shape (min(batch*time, sum(seq_len)), ...s...)
  :rtype: tf.Tensor
  """
  with tf.name_scope("flatten_with_seq_len_mask"):
    x = check_dim_equal(x, 0, seq_lens, 0)  # external helper (not shown): asserts x and seq_lens share the batch dim
    mask = tf.sequence_mask(seq_lens, maxlen=tf.shape(x)[1])  # shape (batch,time)
    return tf.boolean_mask(x, mask)

out_flat = flatten_with_seq_len_mask(out, seq_lens)
labels_flat = flatten_with_seq_len_mask(labels, seq_lens)
loss_flat = tf.nn.sparse_softmax_cross_entropy_with_logits(logits=out_flat, labels=labels_flat)
reduced_loss = tf.reduce_mean(loss_flat)
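
For completeness, here is a small sanity check (my addition, with made-up shapes and data, assuming TensorFlow 1.x graph mode) that the flattened tf.reduce_mean agrees with the masked sum divided by sum(seq_lens) from the first snippet; the boolean mask is applied inline so the sketch does not depend on the check_dim_equal helper:

import numpy as np
import tensorflow as tf

batch, time, num_classes = 2, 3, 4
np.random.seed(0)
logits = tf.constant(np.random.randn(batch, time, num_classes).astype("float32"))
labels = tf.constant(np.random.randint(0, num_classes, size=(batch, time)).astype("int32"))
seq_lens = tf.constant([3, 2], dtype=tf.int32)

# Variant 1: mask the per-frame loss, then divide by the number of real frames.
loss = tf.nn.sparse_softmax_cross_entropy_with_logits(logits=logits, labels=labels)  # (batch, time)
mask = tf.sequence_mask(seq_lens, maxlen=tf.shape(loss)[1])  # bool, shape (batch, time)
masked_mean = (tf.reduce_sum(tf.cast(mask, loss.dtype) * loss)
               / tf.cast(tf.reduce_sum(seq_lens), loss.dtype))

# Variant 2: drop the padded frames first, then take a plain reduce_mean.
logits_flat = tf.boolean_mask(logits, mask)  # (sum(seq_lens), num_classes)
labels_flat = tf.boolean_mask(labels, mask)  # (sum(seq_lens),)
flat_mean = tf.reduce_mean(
  tf.nn.sparse_softmax_cross_entropy_with_logits(logits=logits_flat, labels=labels_flat))

with tf.Session() as sess:
  print(sess.run([masked_mean, flat_mean]))  # the two values should agree up to rounding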