TensorRT 3.0 RC now available with support for TensorFlow
note: the JetPack 3.2 Developer Preview is now available for Jetson TX2 with automated install of TensorRT 3.0 RC2.
A release candidate package for TensorRT 3.0 and cuDNN 7.0 is now available for Jetson TX1/TX2. Intended to be installed on top of an existing JetPack 3.1 installation, the TensorRT 3.0 RC provides the latest performance improvements and features:

  • TensorFlow 1.3 UFF (Universal Framework Format) importer
  • Upgrade from cuDNN 6 to cuDNN 7
  • New layers and parameter types

See the full Release Notes (https://developer.nvidia.com/compute/machine-learning/tensorrt/secure/3.0/rc1/TensorRT3-Release-Notes-RC-pdf) and download the RC from developer.nvidia.com/tensorrt.
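Among the new parameter types (detailed later in this thread) is convolution dilation, i.e. spacing the kernel taps apart to enlarge the receptive field. As a purely illustrative pure-Python sketch of what a dilated kernel computes — not TensorRT API code — a 1-D version might look like:

```python
def dilated_conv1d(signal, kernel, dilation=1):
    """Valid-mode 1-D convolution where kernel taps are spaced
    `dilation` samples apart (dilation=1 is ordinary convolution)."""
    # Effective receptive field of the dilated kernel.
    span = (len(kernel) - 1) * dilation + 1
    return [
        sum(signal[i + k * dilation] * kernel[k] for k in range(len(kernel)))
        for i in range(len(signal) - span + 1)
    ]

# With dilation=2 each output sums every other sample:
# dilated_conv1d([1, 2, 3, 4, 5], [1, 1], dilation=2) -> [4, 6, 8]
```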
#1
Posted 09/26/2017 01:16 PM   
Hi, glad to hear about the release of TensorRT 3. But I want to know how to use the Python API of TensorRT 3. I have tried the provided methods for installing TensorRT given here: https://developer.nvidia.com/nvidia-tensorrt3rc-download. After that, I still can't "import tensorrt" within a Python environment. Can I install TensorRT 3 for Python with "sudo apt-get install python-tensorrt"?

#2
Posted 09/27/2017 04:08 AM   
Hi,

What new layers and parameter types does TensorRT 3 support?

And what version of Caffe does it support?

By the way, can we install version 2 and version 3 simultaneously on our TX2?

#3
Posted 09/27/2017 06:26 AM   
said: what new layers and parameter types does TensorRT 3 support?
Please consult the Release Notes for the new layers and parameter types:

  • The TensorRT deconvolution layer previously did not support non-zero padding,
    or stride values that were distinct from kernel size. These restrictions have now
    been lifted.
  • The TensorRT deconvolution layer now supports groups.
  • Non-determinism in the deconvolution layer implementation has been
    eliminated.
  • The TensorRT convolution layer API now supports dilated convolutions.

  • The TensorRT API now supports these new layers (but they are not supported via
    the NvCaffeParser):
      • unary
      • shuffle
      • padding
  • The Elementwise (eltwise) layer now supports broadcasting of input dimensions.
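
The release notes don't spell out the eltwise broadcasting rules; assuming numpy-style alignment of trailing dimensions (an assumption on my part, not confirmed TensorRT semantics), a minimal pure-Python compatibility check looks like:

```python
def broadcast_shape(a, b):
    """Return the broadcast result of two shapes under numpy-style rules:
    align trailing dimensions, pad the shorter shape with 1s, and require
    each dimension pair to match or contain a 1."""
    a, b = tuple(a), tuple(b)
    # Left-pad the shorter shape with 1s so both have equal rank.
    a = (1,) * (len(b) - len(a)) + a
    b = (1,) * (len(a) - len(b)) + b
    result = []
    for x, y in zip(a, b):
        if x == y or x == 1 or y == 1:
            result.append(max(x, y))
        else:
            raise ValueError(f"incompatible dimensions {x} and {y}")
    return tuple(result)

# broadcast_shape((3, 1, 5), (4, 5)) -> (3, 4, 5)
```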

said: what version of Caffe does it support?
I'm confirming internally which version of Caffe should be used for training (nvcaffe-0.15 or 0.16).

said: by the way, can we install version 2 and version 3 simultaneously on our TX2?
If you extract the tarball provided on the TensorRT downloads page, you should be able to; however, probably not with the Debian package method.
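As a side note on the deconvolution changes listed above: with non-zero padding and stride values distinct from the kernel size now supported, the usual transposed-convolution output-size formula applies. A quick arithmetic check (the standard formula, not TensorRT code):

```python
def deconv_output_size(in_size, kernel, stride=1, pad=0):
    """Standard transposed-convolution (deconvolution) output size:
    out = (in - 1) * stride - 2 * pad + kernel."""
    return (in_size - 1) * stride - 2 * pad + kernel

# A 4-wide feature map, kernel 3, stride 2, pad 1 upsamples to width 7:
# deconv_output_size(4, kernel=3, stride=2, pad=1) -> 7
```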
#4
Posted 09/27/2017 03:03 PM   
said: Hi, glad to hear about the release of TensorRT 3. But I want to know how to use the Python API of TensorRT 3. I have tried the provided methods for installing TensorRT given here: https://developer.nvidia.com/nvidia-tensorrt3rc-download. After that, I still can't "import tensorrt" within a Python environment. Can I install TensorRT 3 for Python with "sudo apt-get install python-tensorrt"?
Hi, from the tarball provided on that same download page, check the python directory in the extracted archive. Inside are the Python files, docs, and samples. I will check where these get installed when using the Debian package installation method.
#5
Posted 09/27/2017 03:07 PM   
Using the Debian package, I could find the `examples` folder here: /usr/local/lib/python2.7/dist-packages/tensorrt

#6
Posted 09/27/2017 08:26 PM   
Well, it seems that TensorRT 3 for Tesla GPUs has access to the Python API: the tarball provided for Tesla GPUs has a python directory, and there are Python files in that directory. The tarball provided for Jetson platforms has a python directory too, but it only contains doc and data files. Does that mean Jetson platforms, e.g. TX1/TX2, cannot use the TensorRT 3 Python API?

#7
Posted 09/28/2017 01:28 AM   
Thanks for your explanation.

Is there any plan in the future for the TensorRT caffeParser to directly import layer.cpp or layer.cu?

Or, for custom layers, do we still have to add a parser rule via IPlugin?

#8
Posted 09/28/2017 02:14 AM   
Regarding your questions, let me check with the TensorRT team about it.
#9
Posted 10/02/2017 01:23 PM   
I was wondering if anyone could recommend a TensorFlow model to test the complete process with?

That is, where I could use DIGITS with TensorFlow 1.3 to train the model, and then use TensorRT 3.0 RC on the Jetson for inference.

#10
Posted 10/04/2017 04:27 PM   
said: I was wondering if anyone could recommend a TensorFlow model to test the complete process with? Where I could use DIGITS with TensorFlow 1.3 to train the model, and then use TensorRT 3.0 RC on the Jetson for inference.
See section 2.3.2.1.1 (Training a Model in TensorFlow) of the TensorRT 3 User Guide (included in the RC download) for example TensorFlow code that trains a network model compatible with TensorRT.

The DIGITS examples also include some TensorFlow models, see here: https://github.com/NVIDIA/DIGITS/tree/master/examples
#11
Posted 10/06/2017 03:13 PM   
The Release Notes were amended to clarify the Python API is currently x86-only in the RC:

The TensorRT Python APIs are only supported on x86 based systems. Some
installation packages for ARM based systems may contain Python .whl files. Do not
install these on the ARM systems, as they will not function.

TensorFlow UFF models can still be imported on ARM platforms from C++ using the NvUffParser.h API.
#12
Posted 10/06/2017 03:15 PM   
Hi, I tried the examples provided in TensorRT-3-User-Guide.pdf. In part 2.3.2.1.3, the code crashes and I get the following error. Can you give me some tips on what's going wrong? And by the way, is there any Python/C++ API reference for TensorRT 3? I don't know what variables and functions each Python module provides, and it's a total mess trying to use TensorRT 3.
Traceback (most recent call last):
  File "/home/pc-201/Desktop/a.py", line 181, in <module>
    parser.register_input("Placeholder", (1,28,28))
NotImplementedError: Wrong number or type of arguments for overloaded function 'UffParser_register_input'.
Possible C/C++ prototypes are:
  nvuffparser::IUffParser::registerInput(char const *, nvinfer1::DimsCHW, nvuffparser::UffInputOrder)
  nvuffparser::IUffParser::registerInput(char const *, nvinfer1::DimsCHW)
  nvuffparser::IUffParser::registerInput(PyObject *, PyObject *, nvuffparser::UffInputOrder)
  nvuffparser::IUffParser::registerInput(PyObject *, nvinfer1::DimsCHW, nvuffparser::UffInputOrder)
  nvuffparser::IUffParser::registerInput(char const *, PyObject *, nvuffparser::UffInputOrder)

#13
Posted 10/09/2017 11:16 AM   
Hi, I have searched for the TensorRT-3-User-Guide but cannot find it. Could you post the address here?

#14
Posted 10/11/2017 03:34 AM   
Does TensorRT 3 support custom layers defined in a TensorFlow graph?

#15
Posted 10/11/2017 08:13 AM   