Running TensorFlow models on DLA

Hello all,

I am currently converting checkpoints from:

https://github.com/tensorflow/models/tree/master/research/slim#Pretrained

to frozen models and then to UFF format.
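For reference, that conversion pipeline (slim checkpoint → frozen graph → UFF) can be sketched roughly as below. The file names are placeholders, and the graph/checkpoint paths depend on how you exported the slim model; only the output node name is taken from the log later in this thread.

```shell
# Freeze the slim checkpoint into a single .pb containing graph + weights.
# (input_graph / input_checkpoint paths are placeholders.)
python -m tensorflow.python.tools.freeze_graph \
    --input_graph=inception_v4_graph.pb \
    --input_checkpoint=inception_v4.ckpt \
    --output_node_names=InceptionV4/Logits/Logits/BiasAdd \
    --output_graph=inception_v4_frozen.pb

# Convert the frozen graph to UFF for TensorRT; the convert-to-uff script
# ships with the TensorRT uff Python package.
convert-to-uff inception_v4_frozen.pb \
    -o inception_v4.uff \
    -O InceptionV4/Logits/Logits/BiasAdd
```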

I am able to run trtexec for inception_v4 and resnet_v2_50 on the GPU. However, when I set the useDLA flag to run trtexec on the NVDLA, it throws these errors:

uff: inception_v4.uff
output: InceptionV4/Logits/Logits/BiasAdd
uffInput: input,3,299,299
useDLA: 2
allowGPUFallback
fp16
Default DLA is enabled but layer InceptionV4/Logits/Logits/biases is not running on DLA, falling back to GPU.
.
.
.
.
Default DLA is enabled but layer InceptionV4/Logits/Logits/weights is not running on DLA, falling back to GPU.
../builder/cudnnBuilder2.cpp (689) - Misc Error in buildSingleLayer: 1 (Unable to process layer.)
../builder/cudnnBuilder2.cpp (689) - Misc Error in buildSingleLayer: 1 (Unable to process layer.)
could not build engine
Engine could not be created
Engine could not be created
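For reference, the trtexec invocation corresponding to the parameter echo at the top of that log would look roughly like the following (TensorRT 5.x flag names; the UFF path is a placeholder):

```shell
# Build and time the UFF model on DLA core 2, falling back to the GPU
# for layers DLA cannot run, with FP16 enabled (required for DLA).
./trtexec --uff=inception_v4.uff \
          --uffInput=input,3,299,299 \
          --output=InceptionV4/Logits/Logits/BiasAdd \
          --useDLA=2 --allowGPUFallback --fp16
```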

Any ideas on what is wrong here?

Thanks,

Hi,

The error indicates that the logits layer fell back to the GPU but then hit an issue there.

Do you know how large your model is? Could you measure this in GPU mode first?
Some users have found that using DLA occupies more memory than a pure-GPU run.
Maybe you can start with a smaller model or batchsize=1.
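One way to take that measurement on a Jetson board is to leave tegrastats running while trtexec executes; this is just a sketch, and the trtexec arguments are placeholders matching the earlier log:

```shell
# Print system-wide RAM usage once per second in the background
# while the model builds and runs, then stop the monitor.
sudo tegrastats &
./trtexec --uff=inception_v4.uff \
          --uffInput=input,3,299,299 \
          --output=InceptionV4/Logits/Logits/BiasAdd --fp16
sudo kill %1
```

Comparing the peak RAM figure between a GPU-only run and a --useDLA run gives a rough idea of the extra memory DLA mode needs.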

Thanks.

Hi AastaLLL,

Inception_v4 works on the GPU. Actually, all models work on the GPU using trtexec.

Setting batchsize to 1 doesn't help.

Changing the model to vgg_16 works.

So how do I measure the memory occupied by DLA, and if this really is a memory problem, what should I do to run larger models such as resnet_v2 or inception_v4 on the NVDLA?

Thanks.

Hi,

Thanks for the experiment.

We will try it and feed the results back to our internal team.
We will share with you once there is any progress.

Hi,

Sorry for keeping you waiting.

We can now reproduce this issue in our environment.
We will check it in detail and update you later.

Thanks.

Hi,

This issue is fixed in TensorRT 5.1.
Please wait for our announcement for the next release.

Thanks.

Hello AastaLLL,

Thank you very much for the updates.

Just to get a feeling for the release timeline, is TensorRT 5.1 going to be released a few days, weeks, or months from now?

Thanks

Hi,

Sorry, we cannot disclose our future schedule.
But you should be able to get the TensorRT 5.1 package early this year.

Thanks.

I want to update TensorRT from 5.0 to 5.1 on Xavier. Any ideas other than JetPack?

Thanks