I transferred one of the official samples into Qt. The program builds, but when the program steps over the following place (i.e. IBuilder* builder = createInferBuilder(gLogger)), an error happens.
void caffeToGIEModel(const std::string& deployFile, // name for caffe prototxt
const std::string& modelFile, // name for model
const std::vector<std::string>& outputs, // network outputs
unsigned int maxBatchSize, // batch size - NB must be at least as large as the batch we want to run with
nvcaffeparser1::IPluginFactory* pluginFactory, // factory for plugin layers
IHostMemory *gieModelStream) // output stream for the GIE model
{
// create the builder
IBuilder* builder = createInferBuilder(gLogger);
The errors are as follows:
RTTI symbol not found for class 'nvinfer1::Builder'
RTTI symbol not found for class 'nvinfer1::Builder'
RTTI symbol not found for class 'nvinfer1::Builder'
RTTI symbol not found for class 'nvinfer1::Builder'
RTTI symbol not found for class 'nvinfer1::Builder'
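To isolate the call itself, here is a minimal standalone sketch of the builder creation with an explicit null check, assuming the TensorRT C++ API; the Logger class below is my assumption of what the sample's gLogger does:

#include <iostream>
#include "NvInfer.h"

using namespace nvinfer1;

// Minimal logger in the style of the TensorRT samples (assumption:
// the sample's gLogger behaves roughly like this).
class Logger : public ILogger
{
    void log(Severity severity, const char* msg) override
    {
        if (severity != Severity::kINFO)
            std::cout << msg << std::endl;
    }
} gLogger;

int main()
{
    // createInferBuilder returns a null pointer on failure, so check it.
    IBuilder* builder = createInferBuilder(gLogger);
    if (!builder)
    {
        std::cout << "createInferBuilder failed" << std::endl;
        return 1;
    }
    builder->destroy();
    return 0;
}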
But in the .pro file, I have added the .so files, as shown below:
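For reference, a typical qmake linkage for the TensorRT libraries on a Jetson looks something like the following; the paths and the exact library set are assumptions for a stock JetPack install, not a copy of my file, so adjust them to your setup:

# Assumed JetPack locations; adjust to where TensorRT and CUDA live.
INCLUDEPATH += /usr/include/aarch64-linux-gnu
LIBS += -L/usr/lib/aarch64-linux-gnu \
        -lnvinfer -lnvcaffe_parser -lnvinfer_plugin
INCLUDEPATH += /usr/local/cuda/include
LIBS += -L/usr/local/cuda/lib64 -lcudart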
I chose Release as the running mode and ran the program. However, when the program runs over the following code (i.e.
const IBlobNameToTensor* blobNameToTensor = parser->parse(deployFile.c_str(),
modelFile.c_str(),
*network,
DataType::kFLOAT); )
in the function caffeToGIEModel, the following errors happen:
Begin parsing model…
The program has unexpectedly finished.
/home/nvidia/lxm/tensorrt_mobilenet_ssd/build-mobilenet_ssd_tensorRT-JetsonTX2-Release/mobilenet_ssd_tensorRT crashed
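For completeness, here is a minimal standalone sketch of the parsing step with explicit checks, assuming the TensorRT 3.x caffe parser API; deploy.prototxt and model.caffemodel are placeholder paths:

#include <iostream>
#include "NvInfer.h"
#include "NvCaffeParser.h"

using namespace nvinfer1;
using namespace nvcaffeparser1;

// Minimal logger in the style of the samples.
class Logger : public ILogger
{
    void log(Severity severity, const char* msg) override
    {
        if (severity != Severity::kINFO)
            std::cout << msg << std::endl;
    }
} gLogger;

int main()
{
    IBuilder* builder = createInferBuilder(gLogger);
    INetworkDefinition* network = builder->createNetwork();
    ICaffeParser* parser = createCaffeParser();

    // If the model has custom layers, the plugin factory must be set
    // before parse() is called:
    // parser->setPluginFactory(&pluginFactory);

    // Placeholder paths; substitute the real deploy/model files.
    const IBlobNameToTensor* blobNameToTensor =
        parser->parse("deploy.prototxt", "model.caffemodel",
                      *network, DataType::kFLOAT);
    if (!blobNameToTensor)
    {
        std::cout << "caffe parsing failed" << std::endl;
        return 1;
    }

    parser->destroy();
    network->destroy();
    builder->destroy();
    return 0;
}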
I have tried my best to solve the problem. Now a new error happens when the TensorRT parser parses the caffemodel, i.e. when my program steps over the code

const IBlobNameToTensor* blobNameToTensor = parser->parse(deployFile.c_str(), modelFile.c_str(), *network, DataType::kFLOAT);

The new error is shown below:

Begin parsing model…
ERROR: Parameter check failed at: Layers.h::PluginLayer::619, condition: (inputs[0]) != NULL
Plugin layer output count is not equal to caffe output count
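My understanding of what the parser expects from a custom layer is sketched below, assuming the TensorRT 3.x IPlugin and nvcaffeparser1::IPluginFactory interfaces; "my_custom_layer" is a hypothetical layer name, and the key point is that getNbOutputs() must return exactly the number of top blobs the layer has in the prototxt:

#include <cstring>
#include "NvInfer.h"
#include "NvCaffeParser.h"

using namespace nvinfer1;

// Skeleton of a custom plugin (TensorRT 3.x IPlugin interface assumed).
class MyPlugin : public IPlugin
{
public:
    // Must match the number of "top" blobs of the layer in the prototxt,
    // otherwise the parser reports the output-count mismatch.
    int getNbOutputs() const override { return 1; }

    Dims getOutputDimensions(int index, const Dims* inputs, int nbInputDims) override
    {
        return inputs[0]; // placeholder: same shape as the first input
    }

    void configure(const Dims* inputDims, int nbInputs,
                   const Dims* outputDims, int nbOutputs, int maxBatchSize) override {}
    int initialize() override { return 0; }
    void terminate() override {}
    size_t getWorkspaceSize(int maxBatchSize) const override { return 0; }

    int enqueue(int batchSize, const void* const* inputs, void** outputs,
                void* workspace, cudaStream_t stream) override
    {
        return 0; // the real kernel launch goes here
    }

    size_t getSerializationSize() override { return 0; }
    void serialize(void* buffer) override {}
};

// Factory the caffe parser uses to recognize and create custom layers.
class PluginFactory : public nvcaffeparser1::IPluginFactory
{
public:
    bool isPlugin(const char* name) override
    {
        return strcmp(name, "my_custom_layer") == 0; // hypothetical name
    }

    IPlugin* createPlugin(const char* name, const Weights* weights, int nbWeights) override
    {
        return new MyPlugin(); // ownership/cleanup handling omitted
    }
};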