What should a Keras generator yield for predict_generator()?

I am calling Keras predict_generator() as follows:

bottleneck_features_train = model.predict_generator(train_gen, len(telemetry))

where train_gen() is defined as follows:

def train_gen():
    # ...
    yield (X, y)

X is a numpy array with shape (48, 299, 299, 3), and y is a numpy array with shape (48,).

I get the error below. What should I do instead?

Otherwise, a link to a working example would help. The only examples I have found are either for Keras 1 or use ImageDataGenerator.flow().

I am running Keras 2.0.2.

Here is the error:

Traceback (most recent call last):
  File "/home/fanta/workspace/CarND-Behavioral-Cloning-P3/cache.py", line 143, in <module>
    tf.app.run()
  File "/usr/local/lib/python3.5/dist-packages/tensorflow/python/platform/app.py", line 44, in run
    _sys.exit(main(_sys.argv[:1] + flags_passthrough))
  File "/home/fanta/workspace/CarND-Behavioral-Cloning-P3/cache.py", line 138, in main
    bottleneck_features_train = model.predict_generator(train_gen, len(telemetry))
  File "/usr/local/lib/python3.5/dist-packages/keras/legacy/interfaces.py", line 88, in wrapper
    return func(*args, **kwargs)
  File "/usr/local/lib/python3.5/dist-packages/keras/engine/training.py", line 2094, in predict_generator
    outs = self.predict_on_batch(x)
  File "/usr/local/lib/python3.5/dist-packages/keras/engine/training.py", line 1677, in predict_on_batch
    self._feed_input_shapes)
  File "/usr/local/lib/python3.5/dist-packages/keras/engine/training.py", line 100, in _standardize_input_data
    'Found: array with shape ' + str(data.shape))
ValueError: The model expects 0 input arrays, but only received one array. Found: array with shape (48, 299, 299, 3)

Process finished with exit code 1

At prediction time, the generator should yield only the inputs, not the targets. So just X, no y.
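
For example, something along these lines (a minimal, untested sketch; the zero-filled arrays and number_of_batches are placeholders for your real batching logic, and model is the model from your question):

import numpy as np

def predict_gen():
    # Yield only the input batch; no (X, y) tuple at prediction time
    while True:
        X = np.zeros((48, 299, 299, 3), dtype='float32')  # stand-in for a real image batch
        yield X

number_of_batches = 10  # placeholder: however many batches your data provides
bottleneck_features_train = model.predict_generator(predict_gen(), number_of_batches)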


Does this help?

It seems there is a mismatch between the inputs your model expects and the values your generator yields. Please add the code showing how you define your model.
I am using a pre-trained model for transfer learning, and it works fine if I switch from Inception to VGG. Right now I am focused on getting the project running as soon as possible; after that I will boil it down to a small piece of code that still reproduces the problem and follow up here.
I have updated my post with a small piece of code that reproduces the problem. It seems related to the Keras pre-trained Inception model; with VGG the same code runs fine.
from keras.applications.inception_v3 import InceptionV3
from keras.applications.vgg16 import VGG16
from keras.layers import Input, AveragePooling2D
from keras.models import Model
from keras.datasets import cifar10
from scipy.misc import imresize
import pickle
import tensorflow as tf
import keras.backend as K
import numpy as np

network='inception'  # Must be 'inception' or 'vgg'
dataset='cifar10'
batch_size=64

if network == 'vgg':
    size = (224, 224)
elif network == 'inception':
    size = (299, 299)
else:
    assert False, "network must be either 'inception' or 'vgg'"

def create_model():
    input_tensor = Input(shape=(size[0], size[1], 3))
    if network == 'inception':
        model = InceptionV3(input_tensor=input_tensor, include_top=False)
        x = model.output
        x = AveragePooling2D((8, 8), strides=(8, 8))(x)
        model = Model(model.input, x)
    elif network == 'vgg':
        model = VGG16(input_tensor=input_tensor, include_top=False)
        x = model.output
        x = AveragePooling2D((7, 7))(x)
        model = Model(model.input, x)
    else:
        assert False
    return model

def main():

    # Download and load cifar10 dataset
    (X_train, y_train), (_, _) = cifar10.load_data()

    # Reduce the dataset to the first 1000 entries, to save memory and computation time
    X_train = X_train[0:1000]
    y_train = y_train[0:1000]

    # Resize dataset images to comply with expected input image size
    X_train = [imresize(image, size) for image in X_train]
    X_train = np.array(X_train)

    # File name where to save bottlenecked features
    train_output_file = "{}_{}_{}.p".format(network, dataset, 'bottleneck_features_train')
    print("Saving to", train_output_file)

    with tf.Session() as sess:
        K.set_session(sess)
        K.set_learning_phase(1)
        model = create_model()
        # We skip pre-processing and bottleneck the features
        bottleneck_features_train = model.predict(X_train, batch_size=batch_size, verbose=1)
        data = {'features': bottleneck_features_train, 'labels': y_train}
        pickle.dump(data, open(train_output_file, 'wb'))

if __name__ == '__main__':
    main()
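
If you wanted to push the same bottlenecking step through predict_generator instead of predict, an inputs-only generator along the lines of the answer above might look roughly like this (an untested sketch; the x_only_gen helper and the steps computation are my additions, not part of the original post):

def x_only_gen(X, batch_size):
    # Yield successive batches of inputs only; no targets at prediction time
    while True:
        for start in range(0, len(X), batch_size):
            yield X[start:start + batch_size]

# Inside main(), after create_model():
steps = int(np.ceil(len(X_train) / batch_size))
bottleneck_features_train = model.predict_generator(x_only_gen(X_train, batch_size), steps)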