Hi,
Is there a quick way to test whether TensorRT supports a given ONNX model, e.g. with trtexec?
Hi bigcat,
Yes, you can use trtexec to quickly test an ONNX model in TensorRT - see the example command below:
$ cd /usr/src/tensorrt/bin
$ ./trtexec --onnx=<path to the ONNX model> --fp16
Optionally, you can specify the name of the desired output layer with the --output argument.
Hi,
I tried a resnet.onnx model, but it failed as shown below. It seems the Flatten layer is not supported, yet when I checked the TensorRT support matrix it says the ONNX Flatten layer is supported.
WARNING: ONNX model has a newer ir_version (0.0.5) than this parser was built against (0.0.3).
While parsing node number 174 [Flatten → "@HUB_resnet_v2_50_imagenet@fc_0.w_0@flatten_0"]:
ERROR: /home/erisuser/p4sw/sw/gpgpu/MachineLearning/DIT/release/5.0/parsers/onnxOpenSource/builtin_op_importers.cpp:755 In function importFlatten:
[8] Assertion failed: inputs.at(0).is_tensor()
failed to parse onnx file
Engine could not be created
Hi,
I tried the resnet50 ONNX model from the ONNX Model Zoo on a Jetson Nano.
Here is the link (using master tar file):
[url]https://github.com/onnx/models/tree/master/vision/classification/resnet/resnet50[/url]
Using the same command as given by dusty:
./trtexec --onnx=/home/nvidia/models/resnet50.onnx --fp16
Also, I tried vgg19 from the following link (using the master tar file):
[url]https://github.com/onnx/models/tree/master/vision/classification/vgg/vgg19[/url]
./trtexec --onnx=/home/nvidia/models/vgg19.onnx --fp16
Both networks worked.
Please let us know.
Hi,
I have checked, and the two ResNet models are not the same. My ResNet has a Flatten layer. I am sending you the model; please help check it.
resnet.onnx.zip (90.8 MB)
Hi,
Do we have some updates now?
Hi,
Do you have some updates now?
This issue is blocking us from moving forward. Please share your solution, thanks.
Hi,
Flatten is not supported by default in TensorRT.
However, you can find a related plugin in our sample:
/usr/src/tensorrt/samples/sampleUffSSD/sampleUffSSD.cpp
Thanks.