Docker: "transpose expects a vector of size 5. But input(1) is a vector of size 3\n\t" when making an inference POST request to a TensorFlow Serving model

Tags: docker, tensorflow, keras, tensorflow-serving

I have trained a model and deployed it to TensorFlow Serving for inference.

When I make the request, I get the following error:

<Response [400]>
{'error': 'transpose expects a vector of size 5. But input(1) is a vector of size 3\n\t [[{{node bidirectional_1/transpose}} = Transpose[T=DT_FLOAT, Tperm=DT_INT32, _class=["loc:@bidirectional_1/TensorArrayUnstack/TensorArrayScatter/TensorArrayScatterV3"], _output_shapes=[[50,?,512]], _device="/job:localhost/replica:0/task:0/device:CPU:0"](embedding_1/embedding_lookup, Attention/transpose/perm)]]'}
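
For reference, the request was made roughly along these lines (a minimal sketch; the host, port 8501, model name "mymodel", and the "instances" payload key are assumptions, not details from the original post):

import json
import requests

# Placeholder for one tokenized, padded input sequence of length 50.
out = [0] * 50

# Assumed TensorFlow Serving REST endpoint and model name.
url = "http://localhost:8501/v1/models/mymodel:predict"
payload = {"instances": [out]}

response = requests.post(url, data=json.dumps(payload))
print(response)         # e.g. <Response [400]> when the shapes do not match
print(response.json())  # the 'error' message returned by TF Serving
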
To create the TensorFlow model object for TF Serving, I use the following function:

import os
import tensorflow as tf
import keras
from keras import backend as K

def export_model_custom_layer(filename, export_path_base):
    # Put the Keras backend into inference mode before exporting.
    K.set_learning_phase(0)
    # Attention is the custom layer class defined elsewhere in the project.
    model = keras.models.load_model(filename, custom_objects={"Attention": Attention})
    sess = K.get_session()

    # Path to save the model, including the model version.
    export_version = 1
    export_path = os.path.join(
        tf.compat.as_bytes(export_path_base),
        tf.compat.as_bytes(str(export_version)))

    # Write a SavedModel that TensorFlow Serving can load.
    tf.saved_model.simple_save(
        sess,
        export_path,
        inputs={'input': model.input},
        outputs={t.name.split(':')[0]: t for t in model.outputs},
        legacy_init_op=tf.tables_initializer())
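
One way to sanity-check the export is to reload the SavedModel and print its serving signature, which lists the input names and shapes that TF Serving will expect in the request; a sketch, with the export path as a placeholder:

import tensorflow as tf

export_path = "export_path_base/1"  # placeholder for the directory written above

with tf.Session(graph=tf.Graph()) as sess:
    meta_graph = tf.saved_model.loader.load(
        sess, [tf.saved_model.tag_constants.SERVING], export_path)
    sig = meta_graph.signature_def[
        tf.saved_model.signature_constants.DEFAULT_SERVING_SIGNATURE_DEF_KEY]
    # The tensor shapes printed here are what the POST payload must match.
    print(sig.inputs)
    print(sig.outputs)
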
In this function the custom layer is passed in as a custom object. To get this to work, I added the following method to my custom layer:

    def get_config(self):
        # Return the layer's config so load_model() can rebuild it from disk.
        config = {'name': "Attention"}
        base_config = super(Attention, self).get_config()
        return dict(list(base_config.items()) + list(config.items()))
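
For context, this is the usual pattern for making a custom Keras layer serializable; a minimal, generic sketch (not the actual Attention implementation, which is omitted below) looks like this:

from keras.layers import Layer

class MyCustomLayer(Layer):
    """Generic custom layer that load_model() can rebuild via get_config()."""

    def call(self, inputs):
        return inputs  # placeholder; a real layer transforms its input here

    def get_config(self):
        # Merge any constructor arguments into the base layer config.
        config = {}
        base_config = super(MyCustomLayer, self).get_config()
        return dict(list(base_config.items()) + list(config.items()))
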
When I make predictions with the standard Keras model.predict(), using data in the same format that the TF Serving model receives, it works as expected:

import pickle

import keras
from keras.layers import Layer

class Attention(Layer):...

with open("CNN_last_test_set.pkl", "rb") as fp:
    x_arr_test, y_test = pickle.load(fp)

model = keras.models.load_model(r"Data/modelCNN.model", custom_objects={"Attention": Attention})
out = x_arr_test[:1, :]   # first test sample, shape (1, 50)
test1 = out.shape
out = out.tolist()

test = model.predict([out])

>> print(test)
>> [[0.21351092]]
This leads me to believe that the problem occurs either when I export the model from Keras to the .pb file, or somewhere in how the model is run inside the Docker container.

I'm not sure how to approach this error, but I assume it has to do with my custom layer object, since the same setup worked with my previous model, which contained only standard Keras layers.

Any help is much appreciated, thanks.

EDIT: I solved the problem. The issue was that my input data had two extra dimensions. I realized that when I removed the brackets around the variable "out", the error changed from "transpose expects a vector of size 5" to "transpose expects a vector of size 4". So I reshaped my "out" variable from (1, 50) to (50,), removed the brackets, and the problem resolved itself.
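
The original request code is not shown, so the exact before/after is an assumption, but in terms of the earlier sketch the fix amounts to dropping the extra dimension from out and the extra brackets around it:

import json
import requests

# shape (50,) instead of (1, 50): no extra leading dimension.
out = x_arr_test[0, :].tolist()

# Assumed endpoint and payload key, as in the earlier sketch.
url = "http://localhost:8501/v1/models/mymodel:predict"
payload = {"instances": [out]}  # no extra brackets around `out`

response = requests.post(url, data=json.dumps(payload))
print(response.json())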
