How do I convert a TensorFlow .pb model to TensorFlow Lite?
I need to convert a TensorFlow .pb model to TensorFlow Lite. My conversion procedure is as follows: 1) To upload the model:
from google.colab import files
pbfile = files.upload()
2) To convert it:
import tensorflow as tf

pb_file = 'data_513.pb'
tflite_file = 'data_513.tflite'

# Note: from_frozen_graph is a TF 1.x API; in TF 2.x it is available as
# tf.compat.v1.lite.TFLiteConverter.from_frozen_graph.
converter = tf.lite.TFLiteConverter.from_frozen_graph(
    pb_file, ['ImageTensor'], ['SemanticPredictions'],
    input_shapes={'ImageTensor': [1, 513, 513, 3]})
tflite_model = converter.convert()
with open(tflite_file, 'wb') as f:
    f.write(tflite_model)
The conversion fails with the following error:
Check failed: array.data_type == array.final_data_type Array "ImageTensor" has mis-matching actual and final data types (data_type=uint8, final_data_type=float).
I suspect I need to specify some additional converter options to overcome this error, but I could not find any information about them.

Finally found the solution. Sharing the snippet here for others:
import tensorflow as tf

pb_file = 'model.pb'
tflite_file = 'model.tflite'

converter = tf.lite.TFLiteConverter.from_frozen_graph(
    pb_file, ['ImageTensor'], ['SemanticPredictions'],
    input_shapes={'ImageTensor': [1, 513, 513, 3]})
# Tell the converter that the model takes quantized uint8 input, and how
# that input maps to the float values the graph expects.
converter.inference_input_type = tf.uint8
converter.quantized_input_stats = {'ImageTensor': (128, 127)}  # (mean, stddev)
tflite_model = converter.convert()
with open(tflite_file, 'wb') as f:
    f.write(tflite_model)

# Sanity-check that the converted model loads.
interpreter = tf.lite.Interpreter(model_content=tflite_model)
interpreter.allocate_tensors()

files.download(tflite_file)
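The `quantized_input_stats` pair tells the converter how uint8 pixel values map to the float range the graph was trained on: `real_value = (quantized_value - mean) / stddev`. A minimal NumPy sketch of that mapping, using the `(128, 127)` values from the snippet above:

```python
import numpy as np

# The (mean, stddev) pair passed as quantized_input_stats above.
mean, stddev = 128.0, 127.0

# uint8 pixel values at the extremes and the midpoint.
pixels = np.array([0, 128, 255], dtype=np.uint8)

# The float values the graph actually sees after dequantization.
real = (pixels.astype(np.float32) - mean) / stddev

print(real)  # roughly [-1.008, 0.0, 1.0]
```

So with these stats, uint8 input in [0, 255] is mapped to roughly [-1, 1], which is the normalization DeepLab-style models commonly expect.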
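To check what input type a converted model actually expects, you can inspect the interpreter's input details; with default settings the input stays float32, and only after setting `inference_input_type` does it become uint8. A self-contained sketch using a hypothetical stand-in model (the frozen graph above is not reproduced here; TF 2.x `from_concrete_functions` is used just to produce a valid .tflite buffer):

```python
import numpy as np
import tensorflow as tf

# Hypothetical stand-in model, not the DeepLab graph from the question.
@tf.function(input_signature=[tf.TensorSpec([1, 4, 4, 3], tf.float32)])
def model(x):
    return tf.reduce_mean(x, axis=[1, 2])

converter = tf.lite.TFLiteConverter.from_concrete_functions(
    [model.get_concrete_function()])
tflite_model = converter.convert()

interpreter = tf.lite.Interpreter(model_content=tflite_model)
interpreter.allocate_tensors()

# Inspect the dtype and shape the converted model expects at its input.
inp = interpreter.get_input_details()[0]
print(inp['dtype'], inp['shape'])
```

If `inp['dtype']` prints as float32 while you feed uint8 image data, you will hit exactly the kind of type mismatch the error above describes.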