Got:
[W] [TRT] Default DLA is enabled but layer prob is not running on DLA, falling back to GPU.
[W] [TRT] Warning: no implementation of prob obeys the requested constraints, using a higher precision type
Q: Does the fallback apply only to the last layer (prob), or to all layers? That is, do all the other layers run on the DLA, and only the last one falls back?
Also, performance with the DLA enabled seems unchanged from running purely on the GPU - is that expected?
This happened with ResNet50 on the last prob layer, so now I'm a bit confused…
Are you aware of any classification and/or detection models that run solely on the DLA and are easy to train (say, in DIGITS or TLT)? I tried ResNet18 from TLT and ResNet50 from the model zoo (DIGITS), and both throw warnings about falling back to the GPU…
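For context, here is how I've been checking per-layer placement: building with trtexec in verbose mode and filtering the log for DLA/fallback messages. This is only a sketch - the prototxt/caffemodel paths and output blob name are placeholders for my DIGITS-exported ResNet50, and the flags are the standard trtexec DLA options:

```shell
# Build a DLA engine with GPU fallback allowed (DLA needs fp16 or int8),
# then grep the verbose log to see which layers landed on DLA vs. GPU.
# deploy.prototxt / model.caffemodel / "prob" are placeholders for the
# exported network.
trtexec --deploy=deploy.prototxt \
        --model=model.caffemodel \
        --output=prob \
        --useDLACore=0 \
        --allowGPUFallback \
        --fp16 \
        --verbose 2>&1 | grep -i -e dla -e fallback
```

With this I only see the warnings quoted above for the prob layer, which is why I'm unsure whether the rest of the network is actually on the DLA.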