Batch size and inference time

Hi. I have a TX2 and installed Caffe.
In the directory /usr/src/tensorrt/bin, I ran ‘giexec’.
It printed some times.
What do these times mean? Are they the time for classification? A forward pass?

Also, when I increased the batch size to 4 or some other bigger number, the time became longer than with batch size 1.
Why is the time longer? Shouldn’t it be shorter?

Also, ‘giexec’ and the other commands only give me benchmarks, and I can’t use them for classification. Right?

Also, how do I classify images using AlexNet and TensorRT? I want to classify an image, check the classification result, and see the time it takes.

Since it computes multiple images, the overall time for inferencing in batches is higher, just not 4 times higher. If you divide the overall time by 4 (or however many images were in your batch), it should be less than the time it took to compute just one image. By using batching, you should be able to compute the results in less time than it would have taken to compute them all individually. Let us know if that’s not the case in your instance.
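To illustrate the arithmetic (these timings are made-up example numbers, not measurements from any device):

```python
# Hypothetical timings to illustrate batching throughput.
single_image_ms = 10.0   # assumed time for one forward pass at batch size 1
batch_size = 4
batch_total_ms = 28.0    # assumed time for one forward pass at batch size 4

# The batch takes longer overall, but less per image.
per_image_ms = batch_total_ms / batch_size
print(per_image_ms)      # 7.0 ms per image, versus 10.0 ms individually
assert batch_total_ms > single_image_ms      # batch is slower in total
assert per_image_ms < single_image_ms        # but faster per image
```

So the number you see growing with batch size is the total time per forward pass; throughput (images per second) should still improve.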

For image classification, you are on the right track in the other thread you posted; it just seems you have a data-corruption issue while downloading the pretrained deep learning models that come with the repo.


So, what do these times (numbers) mean?
The time to run one forward pass? The time to recognize one picture?