Python shape mismatch: the shape of labels (received (128,)) should equal the shape of logits except for the last dimension (received (16, 424))

Tags: python, tensorflow, keras, recurrent-neural-network

Error: ValueError in converted code:

<ipython-input-63-1e3afece3370>:10 train_step  *
    loss += loss_func(targ, logits)
<ipython-input-43-44b2a8f6794e>:11 loss_func  *
    loss_ = loss_object(real, pred)
/usr/local/lib/python3.6/dist-packages/tensorflow_core/python/keras/losses.py:124 __call__
    losses = self.call(y_true, y_pred)
/usr/local/lib/python3.6/dist-packages/tensorflow_core/python/keras/losses.py:216 call
    return self.fn(y_true, y_pred, **self._fn_kwargs)
/usr/local/lib/python3.6/dist-packages/tensorflow_core/python/keras/losses.py:973 sparse_categorical_crossentropy
    y_true, y_pred, from_logits=from_logits, axis=axis)
/usr/local/lib/python3.6/dist-packages/tensorflow_core/python/keras/backend.py:4431 sparse_categorical_crossentropy
    labels=target, logits=output)
/usr/local/lib/python3.6/dist-packages/tensorflow_core/python/ops/nn_ops.py:3477 sparse_softmax_cross_entropy_with_logits_v2
    labels=labels, logits=logits, name=name)
/usr/local/lib/python3.6/dist-packages/tensorflow_core/python/ops/nn_ops.py:3393 sparse_softmax_cross_entropy_with_logits
    logits.get_shape()))

ValueError: Shape mismatch: The shape of labels (received (128,)) should equal the shape of logits except for the last dimension (received (16, 424)).
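As a minimal sketch of the shape contract behind this error (using the shapes from the traceback, not the asker's actual data): `SparseCategoricalCrossentropy` expects the labels tensor to have the same shape as the logits tensor minus its last (class) dimension.

```python
import tensorflow as tf

# SparseCategoricalCrossentropy expects labels shaped like the logits
# minus the trailing class dimension: logits (16, 424) -> labels (16,).
loss_object = tf.keras.losses.SparseCategoricalCrossentropy(
    from_logits=True, reduction='none')

logits = tf.random.normal((16, 424))  # (batch=16, num_classes=424)

# Matching labels: one integer class id per example -> shape (16,)
labels_ok = tf.random.uniform((16,), maxval=424, dtype=tf.int32)
print(loss_object(labels_ok, logits).shape)  # one loss value per example

# Mismatched labels, as in the traceback: shape (128,) vs logits (16, 424)
labels_bad = tf.random.uniform((128,), maxval=424, dtype=tf.int32)
try:
    loss_object(labels_bad, logits)
except Exception as e:
    print(type(e).__name__)  # shape check fails before any computation
```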
Loss function

    optimizer = tf.keras.optimizers.Adam()
    loss_object = tf.keras.losses.SparseCategoricalCrossentropy(
                     from_logits=True,
                     reduction='none')

    def loss_func(real, pred):
        mask = tf.math.logical_not(tf.math.equal(real, 0))
        loss_ = loss_object(real, pred)
        mask = tf.cast(mask, dtype=loss_.dtype)
        loss_ *= mask   
        return tf.reduce_mean(loss_)
Train step

    @tf.function
    def train_step(inp, targ):
        loss = 0
        with tf.GradientTape() as tape:
             logits = model(inp)
             loss += loss_func(targ, logits)

        variables = model.trainable_variables
        gradients = tape.gradient(loss, variables)
        optimizer.apply_gradients(zip(gradients, variables))
        return loss
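For a sequence model like the one above, the shapes that make `train_step` succeed can be sketched as follows (the layer sizes here are hypothetical, not the asker's): with `return_sequences=True` on the LSTM, the Dense head emits per-timestep logits of shape `(batch, seq_len, vocab_size)`, which lines up with integer targets of shape `(batch, seq_len)`.

```python
import tensorflow as tf

# Hypothetical sizes for illustration only.
vocab_size, seq_len, batch = 200, 9, 4
model = tf.keras.Sequential([
    tf.keras.layers.Embedding(vocab_size, 32),
    tf.keras.layers.LSTM(64, return_sequences=True),  # keep the time axis
    tf.keras.layers.Dense(vocab_size),                # raw logits, no softmax
])

inp = tf.random.uniform((batch, seq_len), maxval=vocab_size, dtype=tf.int32)
targ = tf.random.uniform((batch, seq_len), maxval=vocab_size, dtype=tf.int32)

logits = model(inp)
print(logits.shape)  # (batch, seq_len, vocab_size)

loss_object = tf.keras.losses.SparseCategoricalCrossentropy(
    from_logits=True, reduction='none')
print(loss_object(targ, logits).shape)  # (batch, seq_len): per-position loss
```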
Input: [135, 144, 0, 0, 0, 0, 0, 0, 0]


Target: [144, 127, 0, 0, 0, 0, 0, 0, 0]

Problem solved. In case anyone runs into the issue above: I went through the shapes in my network and found the problem in my LSTM layer.

Comments:

@datdinhquoc: My output comes from 2 neurons (softmax), and the error is: Shape mismatch: The shape of labels (received (8,)) should equal the shape of logits except for the last dimension (received (4, 2)). Any solution?

@datdinhquoc, could you share some code?

@datdinhquoc: Here is the code it complains about the shapes in:

    @tf.function
    def train_step(inp, targ):
        loss = 0
        with tf.GradientTape() as tape:
             logits = model(inp)
             loss += loss_func(targ, logits)

        variables = model.trainable_variables
        gradients = tape.gradient(loss, variables)
        optimizer.apply_gradients(zip(gradients, variables))
        return loss

@datdinhquoc, this code works fine for me; I don't get any error. Please provide more details on how to reproduce it.
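A hedged illustration of the kind of LSTM misconfiguration the answer describes (the exact fix in the asker's network is not shown): without `return_sequences=True`, an LSTM emits only its last timestep, so downstream logits lose the time dimension and no longer line up with per-timestep labels.

```python
import tensorflow as tf

x = tf.random.normal((16, 8, 32))  # (batch, time, features)

# Default: only the final timestep survives -> (16, 64)
last_only = tf.keras.layers.LSTM(64)(x)

# return_sequences=True keeps the time axis -> (16, 8, 64)
per_step = tf.keras.layers.LSTM(64, return_sequences=True)(x)

print(last_only.shape, per_step.shape)
```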