About TensorRT 2.1 sampleInt8

I updated my TX2 to use TensorRT 2.1.
I want to test the INT8 inference improvement,
so after building with the makefile I ran sampleInt8,
but I got an error I have never seen before.

Running command:
./sample_int8 mnist

Error message:
Int8 support requested on hardware without native Int8 support, performance will be negatively affected.
ERROR LAUNCHING INT8-to-INT8 GEMM: 8

Hi,

INT8 is not supported on TX1 and TX2.
https://devtalk.nvidia.com/default/topic/1001033/jetson-tx2/quantization-of-weight-activation-on-tx2/post/5185049/#5185049

Sorry for the inconvenience.
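
For reference, a minimal sketch of how the INT8 path can be guarded in code instead of being enabled unconditionally. It assumes a TensorRT release that exposes IBuilder::platformHasFastInt8() (this query may not be available in TensorRT 2.1), and the ILogger::log signature shown matches older TensorRT versions:

#include <NvInfer.h>
#include <iostream>

// Minimal logger required to create a TensorRT builder
// (signature as used by older TensorRT releases).
class Logger : public nvinfer1::ILogger
{
    void log(Severity severity, const char* msg) override
    {
        if (severity != Severity::kINFO)
            std::cout << msg << std::endl;
    }
} gLogger;

int main()
{
    nvinfer1::IBuilder* builder = nvinfer1::createInferBuilder(gLogger);

    // Only enable INT8 when the GPU has native INT8 support;
    // otherwise stay on FP32/FP16 to avoid the GEMM launch error above.
    if (builder->platformHasFastInt8())
    {
        builder->setInt8Mode(true);
        std::cout << "Native INT8 supported: building an INT8 engine." << std::endl;
    }
    else
    {
        std::cout << "No native INT8 support on this GPU: skipping INT8." << std::endl;
    }

    builder->destroy();
    return 0;
}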

Hello,
So INT8 support is not possible on the TX2
because of a hardware limitation?

Yes. INT8 is only supported on the sm_61 architecture, but the TX2 is sm_62.
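
For reference, a small sketch (not from the original thread) that simply queries the device's compute capability through the CUDA runtime API; on a TX2 it reports sm_62:

#include <cstdio>
#include <cuda_runtime.h>

int main()
{
    cudaDeviceProp prop;
    cudaError_t err = cudaGetDeviceProperties(&prop, 0);
    if (err != cudaSuccess)
    {
        std::printf("cudaGetDeviceProperties failed: %s\n", cudaGetErrorString(err));
        return 1;
    }

    // A Jetson TX2 reports compute capability 6.2 (sm_62),
    // which TensorRT 2.1 does not treat as having native INT8 support.
    std::printf("Device: %s, compute capability sm_%d%d\n",
                prop.name, prop.major, prop.minor);
    return 0;
}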