Keras: calculating the perplexity of a language model


I created a language model with a Keras LSTM, and now I want to evaluate how good it is, so I want to calculate its perplexity.


What is the best way to calculate the perplexity of a model in Python?

I have come up with two versions and attached the corresponding source code below; feel free to check the linked sources.

from keras import backend as K

def perplexity_raw(y_true, y_pred):
    """
    The perplexity metric. Why isn't this part of Keras yet?!
    https://stackoverflow.com/questions/41881308/how-to-calculate-perplexity-of-rnn-in-tensorflow
    https://github.com/keras-team/keras/issues/8267
    """
    # cross_entropy = K.sparse_categorical_crossentropy(y_true, y_pred)
    # Compares the label with the argmax of the prediction, so this produces a
    # 0/1 match (accuracy-style) value rather than the actual cross-entropy.
    cross_entropy = K.cast(K.equal(K.max(y_true, axis=-1),
                                   K.cast(K.argmax(y_pred, axis=-1), K.floatx())),
                           K.floatx())
    perplexity = K.exp(cross_entropy)
    return perplexity

def perplexity(y_true, y_pred):
    """
    The perplexity metric. Why isn't this part of Keras yet?!
    https://stackoverflow.com/questions/41881308/how-to-calculate-perplexity-of-rnn-in-tensorflow
    https://github.com/keras-team/keras/issues/8267
    """
    # Exponentiate the per-token cross-entropy: perplexity = exp(cross-entropy).
    cross_entropy = K.sparse_categorical_crossentropy(y_true, y_pred)
    perplexity = K.exp(cross_entropy)
    return perplexity
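
For context, here is a minimal sketch of how I would wire such a metric into training, assuming a model that ends in a softmax over the vocabulary and integer-encoded targets. The names `perplexity_mean` and `model` are placeholders, not part of the code above; this variant exponentiates the mean cross-entropy of the batch instead of the per-token values:

from keras import backend as K

def perplexity_mean(y_true, y_pred):
    # Average the per-token cross-entropy first, then exponentiate, so the
    # metric reports a single perplexity value per batch.
    cross_entropy = K.sparse_categorical_crossentropy(y_true, y_pred)
    return K.exp(K.mean(cross_entropy))

# Hypothetical usage: `model` is any Keras model whose output is a softmax
# over the vocabulary and whose targets are integer word indices.
# model.compile(loss='sparse_categorical_crossentropy',
#               optimizer='adam',
#               metrics=[perplexity_mean])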