
Python: How do I save a TensorRT graph generated from a frozen inference graph?


I use the following script to convert a frozen inference graph into a TensorRT-optimized graph:

import tensorflow as tf
from tensorflow.python.compiler.tensorrt import trt_convert as trt

with tf.Session() as sess:
    # First deserialize your frozen graph:
    with tf.gfile.GFile('frozen_inference_graph.pb', 'rb') as f:
        frozen_graph = tf.GraphDef()
        frozen_graph.ParseFromString(f.read())
    # Now you can create a TensorRT inference graph from your
    # frozen graph:
    converter = trt.TrtGraphConverter(
        input_graph_def=frozen_graph,
        nodes_blacklist=['outputs/Softmax']) #output nodes
    trt_graph = converter.convert()
    # Import the TensorRT graph into a new graph and run:
    output_node = tf.import_graph_def(
        trt_graph,
        return_elements=['outputs/Softmax'])
    sess.run(output_node)

My question is: how do I save the optimized graph to disk so that I can use it to run inference?

Yes, you can add these two lines:

saved_model_dir_trt = "./tensorrt_model.trt"
converter.save(saved_model_dir_trt)
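
If you would rather keep the optimized graph in the same .pb format as the original frozen graph, a minimal sketch is shown below. It reuses the trt_graph GraphDef returned by converter.convert() in the question's script; the file name trt_inference_graph.pb is only an assumption for illustration.

import tensorflow as tf

# Serialize the converted GraphDef to disk (same format as the frozen graph).
# 'trt_graph' is the GraphDef returned by converter.convert() above;
# the file name 'trt_inference_graph.pb' is arbitrary.
with tf.gfile.GFile('trt_inference_graph.pb', 'wb') as f:
    f.write(trt_graph.SerializeToString())

# Later, deserialize it and import it exactly like the original frozen graph:
with tf.Session() as sess:
    with tf.gfile.GFile('trt_inference_graph.pb', 'rb') as f:
        graph_def = tf.GraphDef()
        graph_def.ParseFromString(f.read())
    output_node = tf.import_graph_def(
        graph_def,
        return_elements=['outputs/Softmax'])
    # Run inference as usual, feeding inputs via feed_dict.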