Python: How to save a TensorFlow estimator by exporting a SavedModel, then load and use it locally?


I trained a TF LinearRegressor estimator as follows:

import numpy as np
import tensorflow as tf

# train_x, train_y, test_x are pandas DataFrames loaded elsewhere
sample_size = train_x.shape[0]
feature_size = train_x.shape[1]

feature_columns = [tf.feature_column.numeric_column("x", shape=[feature_size])]

lr_estimator = tf.estimator.LinearRegressor(feature_columns=feature_columns )

train_x_mat = train_x.as_matrix()
train_y_mat = train_y.as_matrix()
test_x_mat = test_x.as_matrix()


# Define the training inputs
train_input_fn = tf.estimator.inputs.numpy_input_fn(
    x={"x": train_x_mat},
    y=np.array(train_y_mat),
    num_epochs=None,
    shuffle=True)

# Train model.
lr_estimator.train(input_fn=train_input_fn, steps=2000)
Here train_x and train_y are pandas DataFrames. The lr_estimator works; I can call .predict on it and it returns predictions successfully.
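
For reference, in-memory prediction with the trained estimator looks roughly like this (a minimal sketch; test_x_mat comes from the snippet above, and the "predictions" key is what LinearRegressor exposes in each result dict):

# Predict with the estimator still in memory (no export/reload yet)
predict_input_fn = tf.estimator.inputs.numpy_input_fn(
    x={"x": test_x_mat},
    num_epochs=1,
    shuffle=False)

# predict() yields one dict per example
for pred in lr_estimator.predict(input_fn=predict_input_fn):
    print(pred["predictions"])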

How can I save it to a file and then load it back for later prediction? I just want to build a small Python program; the prediction program will run on the same desktop, and I don't need a full serving setup yet.
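
One way to do this without any serving infrastructure is to export the trained estimator as a SavedModel via a serving_input_receiver_fn and then reload it locally with tf.contrib.predictor, as in the code below: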

def serving_input_receiver_fn():
    """Raw-tensor serving input; the "x" key must match the feature column name."""
    # None in the shape allows a batch of examples at prediction time
    inputs = {"x": tf.placeholder(shape=[None, feature_size], dtype=tf.float32)}
    return tf.estimator.export.ServingInputReceiver(inputs, inputs)

# export model and weights
export_dir = lr_estimator.export_savedmodel(export_dir_base="/export_dir",
    serving_input_receiver_fn=serving_input_receiver_fn)

# restore from disk -- tf.contrib.predictor manages its own session,
# so an explicit tf.Session()/loader.load() step is not needed
from tensorflow.contrib import predictor

predict_fn = predictor.from_saved_model(export_dir)
print(predict_fn({"x": test_x_mat}))
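
Since the goal is a small standalone program on the same desktop, the reload step can also live in its own script. Note that export_savedmodel() creates a timestamped subdirectory under export_dir_base and returns its path; that exact path is what has to be loaded. A minimal sketch (the script name, the path placeholder, and the feature_size value are assumptions):

# predict_local.py -- hypothetical standalone prediction script
import numpy as np
from tensorflow.contrib import predictor

# Replace with the path returned by export_savedmodel(), i.e. the
# timestamped subdirectory created under /export_dir
saved_model_dir = "/export_dir/<timestamp>"

predict_fn = predictor.from_saved_model(saved_model_dir)

# The feed key "x" must match the serving_input_receiver_fn used at export
# time; the keys of the returned dict depend on the exported signature
new_x = np.random.rand(3, 5).astype(np.float32)  # assumes feature_size == 5
print(predict_fn({"x": new_x}))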