Python Keras Multi-Model API


I want to build a REST API service with Keras serving two models (model A and model B). I found this example, but it only uses a single model; I need something I can call like:

curl -X POST -F image=@typeA.jpg "http://localhost:5000/predictA"

curl -X POST -F image=@typeB.jpg "http://localhost:5000/predictB"


You just need to load the different models and add them to the application as global state, like this:

# Create the Flask application and initialize the Keras models
app = flask.Flask(__name__)
app.config['modelA'] = load_model("modelAPath")
app.config['modelB'] = load_model("modelBPath")
# Also create two separate graphs, one for each model.
app.config['graphA'] = tf.Graph()
app.config['graphB'] = tf.Graph()
Then give each model its own endpoint, for example:

@app.route("/predictA", methods=["POST"])
def predictA():
    # Fetch the data here, feed it to the model and return the JSON result.
    with app.config['graphA'].as_default():
        app.config['modelA'].predict()

@app.route("/predictB", methods=["POST"])
def predictB():
    # Fetch the data here, feed it to the model and return the JSON result.
    with app.config['graphB'].as_default():
        app.config['modelB'].predict()

Note that this assumes you want two separate endpoints; you could also use a single endpoint and select the model with an extra POSTed form or JSON parameter.
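The single-endpoint variant can be sketched with plain Python dispatch. Note this is only an illustration: the form field name "model" and the dummy predict callables are assumptions, and real code would put the loaded Keras models in the dict instead.

```python
# Dummy stand-ins for the two loaded Keras models; real code would use
# load_model("modelAPath") / load_model("modelBPath") here (paths assumed).
models = {
    "A": lambda x: f"A-prediction for {x}",
    "B": lambda x: f"B-prediction for {x}",
}

def predict(form):
    """Dispatch to a model based on a POSTed 'model' form field."""
    name = form.get("model", "A")   # default to model A if no field sent
    model = models.get(name)
    if model is None:
        return {"error": f"unknown model {name!r}"}
    return {"prediction": model(form.get("image"))}

print(predict({"model": "B", "image": "typeB.jpg"}))
# → {'prediction': 'B-prediction for typeB.jpg'}
```

In a Flask handler, `form` would simply be `flask.request.form`, sent as e.g. `curl -X POST -F model=B -F image=@typeB.jpg`.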

Thanks for the reply, but something seems to go wrong when calling app['graphA'].as_default(): TypeError: 'Flask' object is not subscriptable

You are right, there was a typo; it should be app.config["graphA"].

What code should fix it?

I updated the answer above; app["graphA"] is now app.config["graphA"].

This is starting to feel like Pandora's box: now it says "calling model.predict is not supported in graph mode", just like they do in
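The TypeError in the comments comes from indexing the Flask object itself instead of its config mapping. A tiny stand-in class (used here only so the snippet runs without Flask) illustrates the difference:

```python
class App:
    """Minimal stand-in for flask.Flask: per-app state lives in the
    config dict attribute, but the object itself is not subscriptable."""
    def __init__(self):
        self.config = {}

app = App()
app.config['graphA'] = "graph-A"   # store per-model state in config

try:
    app['graphA']                  # wrong: raises TypeError, as in the comment
except TypeError as e:
    print(e)

print(app.config['graphA'])        # correct access path
```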
# keras_server.py 
  
# Python program to expose a ML model as flask REST API 
  
# import the necessary modules 
from keras.models import load_model
from keras.preprocessing.image import img_to_array  
from keras.applications import imagenet_utils 
import tensorflow as tf 
from PIL import Image 
import numpy as np 
import flask 
import io 
  
# Create Flask application and initialize Keras model 
app = flask.Flask(__name__) 
model = None
  
# Function to load the model
def load_model():

    # NOTE: this function shadows keras.models.load_model imported
    # above, so call it through the module to avoid infinite recursion
    import keras.models

    # global variables, to be used in another function
    global model
    model = keras.models.load_model("modelAPath")
    global graph
    graph = tf.get_default_graph()
  
# Every ML/DL model has a specific format 
# of taking input. Before we can predict on 
# the input image, we first need to preprocess it. 
def prepare_image(image, target): 
    if image.mode != "RGB": 
        image = image.convert("RGB") 
      
    # Resize the image to the target dimensions 
    image = image.resize(target)  
      
    # PIL Image to Numpy array 
    image = img_to_array(image)  
      
    # Expand the shape of an array, 
    # as required by the Model 
    image = np.expand_dims(image, axis = 0)  
      
    # preprocess_input function is meant to 
    # adequate your image to the format the model requires 
    image = imagenet_utils.preprocess_input(image)  
  
    # return the processed image 
    return image 
  
# Now, we can predict the results. 
@app.route("/predict", methods=["POST"])
def predict(): 
    data = {} # dictionary to store result 
    data["success"] = False
  
    # Check if image was properly sent to our endpoint 
    if flask.request.method == "POST": 
        if flask.request.files.get("image"): 
            image = flask.request.files["image"].read() 
            image = Image.open(io.BytesIO(image)) 
  
            # Resize it to 224x224 pixels  
            # (required input dimensions for ResNet) 
            image = prepare_image(image, target =(224, 224)) 
  
            # Predict on the preprocessed image
            with graph.as_default(): 
                preds = model.predict(image) 
                results = imagenet_utils.decode_predictions(preds) 
                data["predictions"] = [] 
  
          
            for (ID, label, probability) in results[0]: 
                r = {"label": label, "probability": float(probability)} 
                data["predictions"].append(r) 
  
            data["success"] = True
  
    # return JSON response 
    return flask.jsonify(data) 
  
  
  
if __name__ == "__main__": 
    print(("* Loading Keras model and Flask starting server..."
        "please wait until server has fully started")) 
    load_model() 
    app.run()
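The loop in predict() above converts decode_predictions-style tuples into the JSON-friendly dict the endpoint returns. Pulled out as a stand-alone sketch (the tuple values below are made up, only mimicking decode_predictions' format):

```python
def format_predictions(results):
    """Turn decode_predictions output, a list of (id, label, probability)
    tuples per image, into the JSON structure the endpoint returns."""
    data = {"success": False, "predictions": []}
    for (ID, label, probability) in results[0]:
        # float() converts numpy float32 values, which jsonify cannot serialize
        data["predictions"].append(
            {"label": label, "probability": float(probability)})
    data["success"] = True
    return data

# Made-up values in decode_predictions' (id, label, probability) format:
fake = [[("n02123045", "tabby", 0.71), ("n02123159", "tiger_cat", 0.12)]]
print(format_predictions(fake))
```

The explicit float() conversion matters in the real handler: model.predict returns numpy float32 scores, which flask.jsonify refuses to serialize.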