Deep learning: 'DirectoryIterator' object has no attribute '_variant_tensor'


I am learning how to use a pre-trained model by following the TensorFlow tutorial, and I ran into the error

"'DirectoryIterator' object has no attribute '_variant_tensor'"

I tried to use
tf.data.experimental.cardinality(validation_dataset)
to split the validation set into a test set and a validation set, but ended up with the following error:

AttributeError                            Traceback (most recent call last)
<ipython-input-12-5818be180afb> in <module>
----> 1 val_batches = tf.data.experimental.cardinality(test_dataset)

~\AppData\Roaming\Python\Python37\site-packages\tensorflow\python\data\experimental\ops\cardinality.py in cardinality(dataset)
66   """
67 
---> 68   return gen_dataset_ops.dataset_cardinality(dataset._variant_tensor)  # pylint: disable=protected-access
 

AttributeError: 'DirectoryIterator' object has no attribute '_variant_tensor'

Hello, and welcome to AI SE. Questions about programming problems or bugs are generally off-topic here; this site focuses on the theoretical, philosophical, and social aspects of artificial intelligence. For more details, please see . I will migrate this post to Stack Overflow.

import numpy as np
import os
import tensorflow as tf
from tensorflow.keras.preprocessing import image_dataset_from_directory
from tensorflow.keras.preprocessing.image import ImageDataGenerator

train_datagen = ImageDataGenerator(
        featurewise_center=True,
        samplewise_center=True,
        rescale=2./255)
training_dataset = train_datagen.flow_from_directory(
        'cats_and_dogs_filtered/cats_and_dogs_filtered/train',
        target_size=(160, 160),
        batch_size=32,
        class_mode='binary')

validation_datagen = ImageDataGenerator(
        featurewise_center=True,
        samplewise_center=True,
        rescale=2./255)
validation_dataset = validation_datagen.flow_from_directory(
        'cats_and_dogs_filtered/cats_and_dogs_filtered/validation',
        target_size=(160, 160),
        batch_size=32,
        class_mode='binary')

val_batches = tf.data.experimental.cardinality(validation_dataset)
test_dataset = validation_dataset.take(val_batches // 5)
validation_dataset = validation_dataset.skip(val_batches // 5)
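The root cause is that `flow_from_directory` returns a Keras `DirectoryIterator`, which is not a `tf.data.Dataset` and has no `_variant_tensor` attribute, so `tf.data.experimental.cardinality` (and likewise `take`/`skip`) cannot operate on it. The `image_dataset_from_directory` helper that the question already imports does return a `tf.data.Dataset`. Below is a minimal sketch: the real image-loading call is left as a comment (it assumes the tutorial's directory layout is on disk), and the split itself is demonstrated on a toy dataset of 10 elements so it runs standalone:

```python
import tensorflow as tf

# Sketch of the fix, assuming the tutorial's directory layout exists locally:
#
# from tensorflow.keras.preprocessing import image_dataset_from_directory
# validation_dataset = image_dataset_from_directory(
#     'cats_and_dogs_filtered/cats_and_dogs_filtered/validation',
#     image_size=(160, 160),
#     batch_size=32)

# The same cardinality/take/skip split, shown on a stand-in dataset
# of 10 elements so no image files are needed:
validation_dataset = tf.data.Dataset.range(10)

# cardinality works here because this is a real tf.data.Dataset
val_batches = tf.data.experimental.cardinality(validation_dataset)

# Reserve one fifth of the batches for testing, keep the rest for validation
test_dataset = validation_dataset.take(val_batches // 5)
validation_dataset = validation_dataset.skip(val_batches // 5)

print(int(val_batches))                                           # 10
print(int(tf.data.experimental.cardinality(test_dataset)))        # 2
print(int(tf.data.experimental.cardinality(validation_dataset)))  # 8
```

With the two `ImageDataGenerator`/`flow_from_directory` pipelines replaced by `image_dataset_from_directory`, `cardinality`, `take`, and `skip` behave exactly as in the tutorial.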