Python: Adding hand-crafted features to a Keras Sequential model

Tags: python, tensorflow, machine-learning, deep-learning, keras

I have 1D sequences that I want to use as input to a Keras VGG classification model, split into x_train and x_test. For each sequence I also have custom features stored in feats_train and feats_test, which I do not want to feed into the convolutional layers but into the first fully connected layer.

So a complete training or test sample consists of a 1D sequence plus n float features.

What is the best way to feed the custom features into the first fully connected layer? I thought about concatenating the input sequence with the custom features, but I don't know how to separate them again inside the model. Are there any other options?

The code without the custom features:

x_train, x_test, y_train, y_test, feats_train, feats_test = load_balanced_datasets()

model = Sequential()
model.add(Conv1D(10, 5, activation='relu', input_shape=(timesteps, 1)))
model.add(Conv1D(10, 5, activation='relu'))
model.add(MaxPooling1D(pool_size=2))
model.add(Dropout(0.5, seed=789))

model.add(Conv1D(5, 6, activation='relu'))
model.add(Conv1D(5, 6, activation='relu'))
model.add(MaxPooling1D(pool_size=2))
model.add(Dropout(0.5, seed=789))

model.add(Flatten())

model.add(Dense(512, activation='relu'))
model.add(Dropout(0.5, seed=789))
model.add(Dense(2, activation='softmax'))

model.compile(loss='logcosh', optimizer='adam', metrics=['accuracy'])

model.fit(x_train, y_train, batch_size=batch_size, epochs=20, shuffle=False, verbose=1)

y_pred = model.predict(x_test)

The Sequential model is not very flexible. You should look into the functional API.

I would try something like this:

from keras.layers import (Conv1D, MaxPool1D, Dropout, Flatten, Dense,
                          Input, concatenate)
from keras.models import Model, Sequential

timesteps = 50
n = 5

def network():
    # Two separate inputs: the 1D sequence and the n hand-crafted features
    sequence = Input(shape=(timesteps, 1), name='Sequence')
    features = Input(shape=(n,), name='Features')

    conv = Sequential()
    conv.add(Conv1D(10, 5, activation='relu', input_shape=(timesteps, 1)))
    conv.add(Conv1D(10, 5, activation='relu'))
    conv.add(MaxPool1D(2))
    conv.add(Dropout(0.5, seed=789))

    conv.add(Conv1D(5, 6, activation='relu'))
    conv.add(Conv1D(5, 6, activation='relu'))
    conv.add(MaxPool1D(2))
    conv.add(Dropout(0.5, seed=789))
    conv.add(Flatten())
    part1 = conv(sequence)

    # The hand-crafted features join the network here, right before the
    # first fully connected layer
    merged = concatenate([part1, features])

    final = Dense(512, activation='relu')(merged)
    final = Dropout(0.5, seed=789)(final)
    final = Dense(2, activation='softmax')(final)

    model = Model(inputs=[sequence, features], outputs=[final])

    model.compile(loss='logcosh', optimizer='adam', metrics=['accuracy'])

    return model

m = network()
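
With two named inputs, training and prediction take a list of arrays (or a dict keyed by the input names) instead of a single array. A minimal usage sketch, assuming x_train/x_test have shape (samples, timesteps, 1), feats_train/feats_test have shape (samples, n), y_train is one-hot encoded with 2 classes, and reusing batch_size from above:

# Inputs are passed in the same order as Model(inputs=[sequence, features]);
# a dict keyed by the Input names 'Sequence' and 'Features' works as well.
m.fit([x_train, feats_train], y_train,
      batch_size=batch_size, epochs=20, shuffle=False, verbose=1)

y_pred = m.predict([x_test, feats_test])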

What do you mean by separating them?
Re-splitting them after the concatenation. In the meantime I have seen that it is also possible to pass the x values as a list, with the first item being the input for the convolutions and the second being the hand-crafted features.
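
For completeness, the splitting approach the question originally asked about (feeding one concatenated array and separating it again inside the model) could be done by slicing inside a Lambda layer. This is only a sketch of that alternative, not part of the answer above; it assumes the n features are appended after the timesteps along the first axis and reuses timesteps and n from the code above:

from keras.layers import Input, Lambda

# Single input holding the sequence with the n features appended, shape (timesteps + n, 1)
combined = Input(shape=(timesteps + n, 1), name='Combined')

# Slice the two parts back out inside the model
sequence = Lambda(lambda t: t[:, :timesteps, :], output_shape=(timesteps, 1))(combined)
features = Lambda(lambda t: t[:, timesteps:, 0], output_shape=(n,))(combined)

# 'sequence' can then go through the convolutional stack and 'features' can be
# concatenated with its flattened output, exactly as in network() above.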