Crash in SessionOptions when loading a model with the TensorFlow 2.3 C++ API

Tags: tensorflow, crash, tensorflow2.x, invalid-pointer

I have written a program that loads a model. The same model file loads fine with TensorFlow 1.15, but TF 2.3 crashes with the error

*** Error in `./a.out': free(): invalid pointer: 0x00002b10ba7907c0 ***
The crash message and backtrace are shown below.

2020-11-05 20:31:55.295879: I tensorflow/cc/saved_model/reader.cc:31] Reading SavedModel from: /atom/model/30/1/model
2020-11-05 20:31:55.304575: I tensorflow/cc/saved_model/reader.cc:54] Reading meta graph with tags { serve }
2020-11-05 20:31:55.304625: I tensorflow/cc/saved_model/loader.cc:234] Reading SavedModel debug info (if present) from: /atom/model/30/1/model
2020-11-05 20:31:55.322785: I tensorflow/core/platform/profile_utils/cpu_utils.cc:104] CPU Frequency: 2800270000 Hz
2020-11-05 20:31:55.323754: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x2c2b340 initialized for platform Host (this does not guarantee that XLA will be used). Devices:
2020-11-05 20:31:55.323783: I tensorflow/compiler/xla/service/service.cc:176]   StreamExecutor device (0): Host, Default Version
2020-11-05 20:31:55.355104: I tensorflow/cc/saved_model/loader.cc:199] Restoring SavedModel bundle.
2020-11-05 20:31:55.465703: I tensorflow/cc/saved_model/loader.cc:183] Running initialization op on SavedModel bundle at path: /atom/model/30/1/model
2020-11-05 20:31:55.486363: I tensorflow/cc/saved_model/loader.cc:303] SavedModel load for tags { serve }; Status: success: OK. Took 190494 microseconds.
*** Error in `./a.out': free(): invalid pointer: 0x00002b10ba7907c0 ***
======= Backtrace: =========
/lib64/libc.so.6(+0x81299)[0x2b10bb5b2299]
./a.out(_ZNSsD1Ev+0x62)[0x4723d2]
./a.out(_ZN10tensorflow14SessionOptionsD1Ev+0x2c)[0x433f08]
./a.out[0x4337bf]
/lib64/libc.so.6(__libc_start_main+0xf5)[0x2b10bb553555]
./a.out[0x433545]
(gdb) bt
#0  0x00002aaba446387 in raise () from /lib64/libc.so.6
#1  0x00002aaba447a78 in abort () from /lib64/libc.so.6
#2  0x00002aaba488ed7 in __libc_message () from /lib64/libc.so.6
#3  0x00002aaba491299 in _int_free () from /lib64/libc.so.6
#4  0x00000000004723d2 in std::basic_string::~basic_string() ()
#5  0x0000000000433f08 in tensorflow::SessionOptions::~SessionOptions() ()
#6  0x00000000004337bf in main ()
System and build information

  • TF: v2.3, built from source with proto 3.12 (C++ API)
  • OS: CentOS 7
  • Dev environment: devtoolset-7
  • Bazel: 3.5
  • System: Google Cloud VM on the Intel Cascade Lake platform
Build process

  • Cloned v2.3 from GitHub
  • Configured to run with python2, with no extra options
  • Build command: BAZEL_LINKLIBS=-l%:libstdc++.a bazel build -c opt --copt=-march=native --config=monolithic //tensorflow:libtensorflow_cc.so
Post-build steps

  • sudo mkdir /usr/local/include/tensorflow
  • sudo cp -r bazel-bin/tensorflow/core /usr/local/include/tensorflow
  • sudo cp -r tensorflow/cc /usr/local/include/tensorflow/
  • sudo cp -r tensorflow/core /usr/local/include/tensorflow/
  • sudo cp -r third_party /usr/local/include/
  • sudo cp bazel-bin/tensorflow/libtensorflow_cc.so /usr/local/lib/
Test code snippet

The model folder comes from a tar file; I have left out the extraction code.


#include <tensorflow/cc/saved_model/loader.h>
#include <tensorflow/cc/saved_model/tag_constants.h>
// Load
using namespace tensorflow;
int main() {
        SavedModelBundle model_bundle;
        SessionOptions session_options = SessionOptions();
        RunOptions run_options = RunOptions();
        Status status = LoadSavedModel(session_options, run_options,"/atom/model/30/1/model", {"serve"}, &model_bundle);
}

Please help me understand what is wrong and what I can do to fix it.

