Python: Why isn't my callback being called in TensorFlow?

Below is my TensorFlow/Python code, which should end training once the accuracy reaches 99%, using a callback. But the callback is never called. Where is the problem?

def train_mnist():

    class myCallback(tf.keras.callbacks.Callback):
        def on_epoc_end(self, epoch,logs={}):
            if (logs.get('accuracy')>0.99):
                print("Reached 99% accuracy so cancelling training!")
                self.model.stop_training=True



    mnist = tf.keras.datasets.mnist

    (x_train, y_train),(x_test, y_test) = mnist.load_data(path=path)

    x_train= x_train/255.0
    x_test= x_test/255.0
    callbacks=myCallback()

    model = tf.keras.models.Sequential([
        # YOUR CODE SHOULD START HERE
        tf.keras.layers.Flatten(input_shape=(28, 28)),
        tf.keras.layers.Dense(256, activation=tf.nn.relu),
        tf.keras.layers.Dense(10, activation=tf.nn.softmax)

    ])

    model.compile(optimizer='adam',
                  loss='sparse_categorical_crossentropy',
                  metrics=['accuracy'])

    # model fitting
    history = model.fit(x_train,y_train, epochs=10,callbacks=[callbacks]) 
    # model fitting
    return history.epoch, history.history['acc'][-1]

You misspelled epoch (on_epoc_end should be on_epoch_end), and you should return accuracy instead of acc:

import tensorflow as tf

def train_mnist():
    class myCallback(tf.keras.callbacks.Callback):
        # Note the spelling: on_epoch_end, not on_epoc_end. Keras only
        # invokes hooks whose names it recognizes.
        def on_epoch_end(self, epoch, logs=None):
            print(logs.get('accuracy'))
            if logs.get('accuracy') > 0.9:
                print("Reached 90% accuracy so cancelling training!")
                self.model.stop_training = True

    mnist = tf.keras.datasets.mnist
    (x_train, y_train), (x_test, y_test) = mnist.load_data()

    x_train = x_train / 255.0
    x_test = x_test / 255.0

    callbacks = myCallback()

    model = tf.keras.models.Sequential([
        tf.keras.layers.Flatten(input_shape=(28, 28)),
        tf.keras.layers.Dense(256, activation=tf.nn.relu),
        tf.keras.layers.Dense(10, activation=tf.nn.softmax)
    ])

    model.compile(optimizer='adam',
                  loss='sparse_categorical_crossentropy',
                  metrics=['accuracy'])

    # model fitting
    history = model.fit(x_train, y_train, epochs=10, callbacks=[callbacks])
    # Return the 'accuracy' key, matching the metric name passed to compile()
    return history.epoch, history.history['accuracy'][-1]

train_mnist()

Unfortunately I don't have enough reputation to comment on the comments above, but I'd like to point out that the on_epoch_end function is called directly by TensorFlow at the end of an epoch. Here we are simply implementing it in a custom Python class, and the underlying framework invokes it automatically. I'm taking this from the TensorFlow in Practice deep learning course, week 2. It seems very similar to the root of the callback problem above.
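
For illustration, here is a minimal, self-contained sketch of that mechanism (the tiny model and the random training data are placeholders of my own, not from the course): Keras fires on_epoch_end by itself once per epoch; you never call it directly.

import numpy as np
import tensorflow as tf

class PrintingCallback(tf.keras.callbacks.Callback):
    # Keras invokes this hook itself at the end of every epoch;
    # user code never calls it directly.
    def on_epoch_end(self, epoch, logs=None):
        print(f"on_epoch_end fired for epoch {epoch}, logs: {logs}")

# Placeholder model and random data, just so fit() has something to train on.
model = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=(4,))])
model.compile(optimizer='adam', loss='mse')

x = np.random.rand(32, 4).astype('float32')
y = np.random.rand(32, 1).astype('float32')
model.fit(x, y, epochs=2, callbacks=[PrintingCallback()], verbose=0)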

Here is some evidence from my most recent run:

Epoch 1/20
59968/60000 [============================>.] - ETA: 0s - loss: 1.0648 - acc: 0.9491Inside callback
60000/60000 [==============================] - 34s 575us/sample - loss: 1.0645 - acc: 0.9491
Epoch 2/20
59968/60000 [============================>.] - ETA: 0s - loss: 0.0560 - acc: 0.9825Inside callback
60000/60000 [==============================] - 35s 583us/sample - loss: 0.0560 - acc: 0.9825
Epoch 3/20
59840/60000 [============================>.] - ETA: 0s - loss: 0.0457 - acc: 0.9861Inside callback
60000/60000 [==============================] - 31s 512us/sample - loss: 0.0457 - acc: 0.9861
Epoch 4/20
59840/60000 [============================>.] - ETA: 0s - loss: 0.0428 - acc: 0.9873Inside callback
60000/60000 [==============================] - 32s 528us/sample - loss: 0.0428 - acc: 0.9873
Epoch 5/20
59808/60000 [============================>.] - ETA: 0s - loss: 0.0314 - acc: 0.9909Inside callback
60000/60000 [==============================] - 30s 507us/sample - loss: 0.0315 - acc: 0.9909
Epoch 6/20
59840/60000 [============================>.] - ETA: 0s - loss: 0.0271 - acc: 0.9924Inside callback
60000/60000 [==============================] - 32s 532us/sample - loss: 0.0270 - acc: 0.9924
Epoch 7/20
59968/60000 [============================>.] - ETA: 0s - loss: 0.0238 - acc: 0.9938Inside callback
60000/60000 [==============================] - 33s 555us/sample - loss: 0.0238 - acc: 0.9938
Epoch 8/20
59936/60000 [============================>.] - ETA: 0s - loss: 0.0255 - acc: 0.9934Inside callback
60000/60000 [==============================] - 33s 550us/sample - loss: 0.0255 - acc: 0.9934
Epoch 9/20
59872/60000 [============================>.] - ETA: 0s - loss: 0.0195 - acc: 0.9953Inside callback
60000/60000 [==============================] - 33s 557us/sample - loss: 0.0194 - acc: 0.9953
Epoch 10/20
59744/60000 [============================>.] - ETA: 0s - loss: 0.0186 - acc: 0.9959Inside callback
60000/60000 [==============================] - 33s 551us/sample - loss: 0.0185 - acc: 0.9959
Epoch 11/20
59968/60000 [============================>.] - ETA: 0s - loss: 0.0219 - acc: 0.9954Inside callback
60000/60000 [==============================] - 32s 530us/sample - loss: 0.0219 - acc: 0.9954
Epoch 12/20
59936/60000 [============================>.] - ETA: 0s - loss: 0.0208 - acc: 0.9960Inside callback
60000/60000 [==============================] - 33s 558us/sample - loss: 0.0208 - acc: 0.9960
Epoch 13/20
59872/60000 [============================>.] - ETA: 0s - loss: 0.0185 - acc: 0.9968Inside callback
60000/60000 [==============================] - 31s 520us/sample - loss: 0.0184 - acc: 0.9968
Epoch 14/20
59872/60000 [============================>.] - ETA: 0s - loss: 0.0181 - acc: 0.9970Inside callback
60000/60000 [==============================] - 35s 587us/sample - loss: 0.0181 - acc: 0.9970
Epoch 15/20
59936/60000 [============================>.] - ETA: 0s - loss: 0.0193 - acc: 0.9971Inside callback
60000/60000 [==============================] - 33s 555us/sample - loss: 0.0192 - acc: 0.9972
Epoch 16/20
59968/60000 [============================>.] - ETA: 0s - loss: 0.0176 - acc: 0.9972Inside callback
60000/60000 [==============================] - 33s 558us/sample - loss: 0.0176 - acc: 0.9972
Epoch 17/20
59968/60000 [============================>.] - ETA: 0s - loss: 0.0183 - acc: 0.9974Inside callback
60000/60000 [==============================] - 33s 555us/sample - loss: 0.0182 - acc: 0.9974
Epoch 18/20
59872/60000 [============================>.] - ETA: 0s - loss: 0.0225 - acc: 0.9970Inside callback
60000/60000 [==============================] - 34s 570us/sample - loss: 0.0224 - acc: 0.9970
Epoch 19/20
59808/60000 [============================>.] - ETA: 0s - loss: 0.0185 - acc: 0.9975Inside callback
60000/60000 [==============================] - 33s 548us/sample - loss: 0.0185 - acc: 0.9975
Epoch 20/20
59776/60000 [============================>.] - ETA: 0s - loss: 0.0150 - acc: 0.9979Inside callback
60000/60000 [==============================] - 34s 565us/sample - loss: 0.0149 - acc: 0.9979
---------------------------------------------------------------------------
KeyError                                  Traceback (most recent call last)
<ipython-input-25-1ff3c304aec3> in <module>
----> 1 _, _ = train_mnist_conv()

<ipython-input-24-b469df35dac0> in train_mnist_conv()
     38     )
     39     # model fitting
---> 40     return history.epoch, history.history['accuracy'][-1]
     41 

KeyError: 'accuracy'

The KeyError is because the history object has no 'accuracy' key, so I wanted to resolve that before continuing.
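
A quick way to see which key the metric was actually logged under (a small sketch; older TensorFlow/Keras versions record the metric as 'acc', newer ones as 'accuracy', matching the name passed to compile()):

# Inspect what the History object actually recorded, then pick
# whichever accuracy key is present ('acc' on older versions,
# 'accuracy' on newer ones).
print(history.history.keys())
acc_key = 'accuracy' if 'accuracy' in history.history else 'acc'
final_acc = history.history[acc_key][-1]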
