Video Encode and Decode GPU Support Matrix

Hi everyone,

Hoping you all might find this matrix helpful.

[url]https://developer.nvidia.com/video-encode-decode-gpu-support-matrix[/url]

Cheers,
Tom

Nvidia DevTalk Community Manager


Where is the GeForce GTX 1080 in the NVENC table?
Do all cut-down GP104 chips (some GeForce GTX 1060s, the 1070, and the 1070 Ti) have two NVENCs?
Why does the GP104-based GTX 1060 have different NVDEC capabilities than the other GP104 cards?

Hello mcerveny,

Thanks for the questions. We are looking into this now. I will update this thread as soon as I have answers.

Thanks,
Tom

Could the GPU support matrix be updated with the new Turing GPUs (TU102 and TU104)?

Come on NVIDIA, update the table/SDK - you can buy an RTX 2080 now, but there is zero information as to what is needed to get the best out of it - does it need the new SDK? Are the supposed 25% improvements just automatic?

More info please, you’re letting us down here IMO.

Thank you for your patience. We will be updating the matrix soon. With Turing GPUs we have added enhancements to NVENC/NVDEC, as you may have read in the Turing architecture whitepaper:

[url]https://www.nvidia.com/content/dam/en-zz/Solutions/design-visualization/technologies/turing-architecture/NVIDIA-Turing-Architecture-Whitepaper.pdf[/url].

Some of these features will require a new Video Codec SDK - please stay tuned - we will keep the forum posts updated once the new SDK features and release schedule are made available.

Any updates on Turing?

Dear ThomasK,
I’ve downloaded Video_Codec_SDK_7.1.9, and I want to run the NvDecodeD3D9 example it contains on Linux.
How can I achieve this? Are there any dependencies other than the CUDA toolkit and the CUDA driver?

Hello,

Thanks for posting in the DevTalk Forums. May I ask why you are using SDK 7.1? That’s a very old SDK release. The current public SDK is 8.2.

You cannot run NvDecodeD3D9 on Linux - D3D9, by definition, runs only on Windows. There are nvcuvid-based examples in the SDK which should run on Linux.
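
Roughly, the Linux decode path looks like the sketch below - a bare-bones illustration of setting up an nvcuvid parser and decoder, with error handling trimmed (the SDK samples are the real reference; build against nvcuvid.h and link with -lnvcuvid -lcuda; the HandleVideoSequence callback name is just illustrative):

[code]
// Bare-bones nvcuvid decode setup on Linux (no D3D9 involved).
// Error handling trimmed; see the SDK samples for a complete decoder.
#include <cuda.h>
#include <nvcuvid.h>
#include <cstdio>

// Parser callback: called once the stream format is known; create the decoder here.
static int CUDAAPI HandleVideoSequence(void* user, CUVIDEOFORMAT* fmt) {
    CUvideodecoder* decoder = static_cast<CUvideodecoder*>(user);
    CUVIDDECODECREATEINFO ci = {};
    ci.CodecType           = fmt->codec;             // e.g. cudaVideoCodec_H264
    ci.ChromaFormat        = fmt->chroma_format;
    ci.OutputFormat        = cudaVideoSurfaceFormat_NV12;
    ci.ulWidth             = fmt->coded_width;
    ci.ulHeight            = fmt->coded_height;
    ci.ulMaxWidth          = fmt->coded_width;
    ci.ulMaxHeight         = fmt->coded_height;
    ci.ulTargetWidth       = fmt->coded_width;
    ci.ulTargetHeight      = fmt->coded_height;
    ci.ulNumDecodeSurfaces = 8;
    ci.ulNumOutputSurfaces = 2;
    ci.DeinterlaceMode     = cudaVideoDeinterlaceMode_Weave;
    cuvidCreateDecoder(decoder, &ci);
    return 1;
}

int main() {
    cuInit(0);
    CUdevice dev;  cuDeviceGet(&dev, 0);
    CUcontext ctx; cuCtxCreate(&ctx, 0, dev);

    CUvideodecoder decoder = nullptr;

    CUVIDPARSERPARAMS pp = {};
    pp.CodecType              = cudaVideoCodec_H264;  // codec of the elementary stream
    pp.ulMaxNumDecodeSurfaces = 8;
    pp.pUserData              = &decoder;
    pp.pfnSequenceCallback    = HandleVideoSequence;
    // A real pipeline also sets the pfnDecodePicture / pfnDisplayPicture callbacks.

    CUvideoparser parser = nullptr;
    if (cuvidCreateVideoParser(&parser, &pp) != CUDA_SUCCESS) {
        std::fprintf(stderr, "failed to create video parser\n");
        return 1;
    }

    // Feed Annex-B packets with cuvidParseVideoData(parser, &packet);
    // the parser then drives the callbacks above.

    cuvidDestroyVideoParser(parser);
    if (decoder) cuvidDestroyDecoder(decoder);
    cuCtxDestroy(ctx);
    return 0;
}
[/code]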

Regards,
Tom

When do you think you will

a) update the GPU Support Matrix to include the Turing models
b) release the new SDK?

You have released this new GPU, with supposed improvements to encoding, yet there is no way to test or utilise this.

It’s an information black hole :o)

Hi Oviano

We will have an update in the next few months.

Thanks for your patience,
Tom

The next few months is quite a way off. Do you have a more specific time frame, or a particular reason why the SDK wasn’t updated at the release of Turing?

And would it be possible to update the GPU Support Matrix earlier than the release of the new SDK?

Thanks in advance.

Unfortunately, that is all the information I have been given.

Best,
Tom

Thank you for the quick reply Tom.

Aside from the support matrix and SDK, there are rumors (see this thread) that the new RTX cards only contain one NVENC, instead of the two in the GP104 and GP102 dies. Do you know (or could you ask) how many NVENC units the TU102, TU104, and TU106 dies have?

Happy Friday everyone,

The NVENC/NVDEC Matrix and Video Codec SDK pages have been updated.

[url=https://developer.nvidia.com/nvidia-video-codec-sdk]https://developer.nvidia.com/nvidia-video-codec-sdk[/url]
[url=https://developer.nvidia.com/video-encode-decode-gpu-support-matrix]https://developer.nvidia.com/video-encode-decode-gpu-support-matrix[/url]

Cheers,
Tom

Thanks for the update.

Do you know when the new SDK will be released?

As of now, there is no release date available.

Ok thanks.

In the support matrix, there is an asterisked note about Turing:
“The video encoder in Turing GPUs has substantially improved quality and performance compared with Pascal. The overall encoding capacity of one NVENC in Turing is comparable to two NVENC’s in Pascal.”

Results don’t verify this: performance, at least for H.264, is worse compared to Pascal - actually worse by more than 50%. It might become true when you release the new SDK, but it isn’t true now. You should correct the claim or back it up with a test case.
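
For what it’s worth, one way to gather numbers for such a test case is to run the same encode job on a Pascal card and a Turing card and sample NVENC utilization via NVML while it runs. A rough sketch (not an official benchmark; assumes the NVML headers and library are installed, link with -lnvidia-ml, and the polling interval is arbitrary):

[code]
// Poll NVENC utilization on GPU 0 while an encode job runs elsewhere.
// Build: g++ nvenc_util.cpp -o nvenc_util -lnvidia-ml
#include <nvml.h>
#include <cstdio>
#include <unistd.h>

int main() {
    if (nvmlInit() != NVML_SUCCESS) {
        std::fprintf(stderr, "NVML init failed\n");
        return 1;
    }
    nvmlDevice_t dev;
    nvmlDeviceGetHandleByIndex(0, &dev);

    char name[NVML_DEVICE_NAME_BUFFER_SIZE];
    nvmlDeviceGetName(dev, name, sizeof(name));
    std::printf("Sampling encoder utilization on %s\n", name);

    // One sample per second for a minute; compare the sustained value
    // (and the encoder's achieved fps) between the Pascal and Turing cards.
    for (int i = 0; i < 60; ++i) {
        unsigned int util = 0, periodUs = 0;
        if (nvmlDeviceGetEncoderUtilization(dev, &util, &periodUs) == NVML_SUCCESS)
            std::printf("NVENC utilization: %u%%\n", util);
        sleep(1);
    }

    nvmlShutdown();
    return 0;
}
[/code]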

Thanks for the update, helps a lot! :)

I have two questions left:

  • Is it correct that the GeForce RTX 2080 and RTX 2070 are both listed as based on TU104? The Turing whitepaper states that the RTX 2070 is based on TU106.
  • Could the Quadro RTX 4000 be added?