Python ValueError: No 'serving_default' in the SavedModel's SignatureDefs. Possible values are ''. - Python, TensorFlow, Machine Learning, TensorFlow Lite


I have the .data, .index and .meta files, and I was able to create a saved_model.pb plus a variables folder from a TensorFlow script.

When I run the following command

tflite_convert --output_file='/home/tensor/Work/cr/saved.tflite' --saved_model_dir='/home/tensor/Work/cr/model_out'
it gives me this error:

ValueError: No 'serving_default' in the SavedModel's SignatureDefs. Possible values are ''.

I want to convert this .pb file to .tflite. Can anyone tell me how to resolve this error?
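
For reference, the SignatureDefs a SavedModel actually contains can be listed with the saved_model_cli tool that ships with TensorFlow:

saved_model_cli show --dir /home/tensor/Work/cr/model_out --all

Judging from the error message ("Possible values are ''"), the SavedModel here exposes no signatures at all, which is exactly what tflite_convert is complaining about.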

You need an "inference graph" to convert to TFLite.

To get one, you need to export a graph in which all variables have been converted to constants (TFLite will not do any training). The instructions for this conversion boil down to the following code snippet:

import os, argparse

import tensorflow as tf

# The original freeze_graph function
# from tensorflow.python.tools.freeze_graph import freeze_graph 

dir = os.path.dirname(os.path.realpath(__file__))

def freeze_graph(model_dir, output_node_names):
    """Extract the sub graph defined by the output nodes and convert 
    all its variables into constant 
    Args:
        model_dir: the root folder containing the checkpoint state file
        output_node_names: a string, containing all the output node's names, 
                            comma separated
    """
    if not tf.gfile.Exists(model_dir):
        raise AssertionError(
            "Export directory doesn't exists. Please specify an export "
            "directory: %s" % model_dir)

    if not output_node_names:
        print("You need to supply the name of a node to --output_node_names.")
        return -1

    # We retrieve our checkpoint fullpath
    checkpoint = tf.train.get_checkpoint_state(model_dir)
    input_checkpoint = checkpoint.model_checkpoint_path

    # We build the full filename of our frozen graph
    absolute_model_dir = "/".join(input_checkpoint.split('/')[:-1])
    output_graph = absolute_model_dir + "/frozen_model.pb"

    # We clear device assignments so that TensorFlow is free to decide where to place operations
    clear_devices = True

    # We start a session using a temporary fresh Graph
    with tf.Session(graph=tf.Graph()) as sess:
        # We import the meta graph in the current default Graph
        saver = tf.train.import_meta_graph(input_checkpoint + '.meta', clear_devices=clear_devices)

        # We restore the weights
        saver.restore(sess, input_checkpoint)

        # We use a built-in TF helper to export variables to constants
        output_graph_def = tf.graph_util.convert_variables_to_constants(
            sess, # The session is used to retrieve the weights
            tf.get_default_graph().as_graph_def(), # The graph_def is used to retrieve the nodes 
            output_node_names.split(",") # The output node names are used to select the useful nodes
        ) 

        # Finally we serialize and dump the output graph to the filesystem
        with tf.gfile.GFile(output_graph, "wb") as f:
            f.write(output_graph_def.SerializeToString())
        print("%d ops in the final graph." % len(output_graph_def.node))

    return output_graph_def
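
Once the frozen graph exists, it can be converted with the TFLite converter for frozen GraphDefs instead of the SavedModel path. A minimal sketch, assuming TensorFlow 1.13+ where tf.lite.TFLiteConverter is available; the checkpoint directory and node names below are placeholders for your model's real paths and input/output tensors:

import tensorflow as tf

# Freeze the checkpoint first; 'output_node' is a placeholder for the real
# output node name of the graph.
freeze_graph('/home/tensor/Work/cr/checkpoints', 'output_node')

# Convert the resulting frozen_model.pb instead of the SavedModel.
converter = tf.lite.TFLiteConverter.from_frozen_graph(
    graph_def_file='/home/tensor/Work/cr/checkpoints/frozen_model.pb',
    input_arrays=['input_node'],    # placeholder: real input node name(s)
    output_arrays=['output_node'])  # placeholder: real output node name(s)
tflite_model = converter.convert()
with open('/home/tensor/Work/cr/saved.tflite', 'wb') as f:
    f.write(tflite_model)

The actual input and output node names can be discovered by listing the nodes of the imported graph before freezing.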

Could you share some details about the command you ran to produce the SavedModel? We need an "inference" graph to convert to tflite, and it does not appear to exist in your SavedModel.
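
If the SavedModel was indeed exported without any signature, another way to address the error at its source is to re-export it with a 'serving_default' signature. A rough sketch using the TF 1.x tf.saved_model.simple_save helper; the checkpoint prefix, export directory and tensor names are placeholders for whatever the real graph uses:

import tensorflow as tf

with tf.Session(graph=tf.Graph()) as sess:
    # Restore the graph and weights from the checkpoint files
    # ('model.ckpt' is a placeholder for the real checkpoint prefix).
    saver = tf.train.import_meta_graph('/home/tensor/Work/cr/model.ckpt.meta')
    saver.restore(sess, '/home/tensor/Work/cr/model.ckpt')

    graph = tf.get_default_graph()
    inputs = {'input': graph.get_tensor_by_name('input:0')}     # placeholder tensor name
    outputs = {'output': graph.get_tensor_by_name('output:0')}  # placeholder tensor name

    # simple_save writes a SavedModel whose only SignatureDef is 'serving_default',
    # which is what tflite_convert --saved_model_dir looks for.
    tf.saved_model.simple_save(sess, '/home/tensor/Work/cr/model_out_signed', inputs, outputs)

The original tflite_convert command should then find the 'serving_default' signature when pointed at the re-exported directory.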