Bad performance and stuttering with 375.39, but not 367.57

Debian testing recently pulled in 375.39, and I noticed that most of my games now show strong FPS fluctuations and stutter. Mad Max makes it especially noticeable: it drops to ~15 fps every 2-3 seconds and never seems to manage anything like 60+.

I then tried a clean Ubuntu 16.10 install with 367.57, where performance is flawless and there are no slowdowns or stutter. Then I tried 375.39 on Ubuntu 16.10 as well, and the same degraded performance and stutter appear. Victor Vran can run up to 50% slower than with 367.57, and this is reproducible on both Debian and Ubuntu, on a GTX 960 and a 1060.

Others are noticing this as well; I have a thread on Reddit.

I did some benchmarks. Disregard earlier claims that Tomb Raider tests 30% lower. It’s just choppier – I had the wrong setting for TressFX when testing on Ubuntu.

I believe this points to an issue in Nvidia drivers 375 and higher, possibly combined with changes in kernel 4.8 or 4.9, that can reduce performance by as much as 50% in some games. The last reliably fast driver I can run on kernel >= 4.8 is 367.57. It might only affect Z77-based i5 systems (because that’s all I have). 370.28 runs well on kernel 4.8 too, at least.


Update 2017-03-14: I tried 378.13 on Debian and it still has bad performance. 367.57 can no longer be built on Debian (perhaps because it doesn’t support kernel 4.9).

Update 2017-03-15: Tested with Debian 8.7.1 so I could have the older 367.57 combined with Debian. Good performance all around. So it might be something between kernel 4.8+, the Z77 chipset and the newer Nvidia driver that is messing things up.

Update 2017-03-19: Tested everything on a GTX 960 with Ubuntu 16.10 and Debian 9, and added some video captures and screenshots. Also tested on SteamOS 2.0 with the GTX 1060. Links and results are in the spreadsheet.

Zesty’s nvidia 367 has lots of patches, in debian/dkms_nvidia/patches/, including:

buildfix_kernel_4.9_amd64_only.patch
buildfix_kernel_4.9.patch

Probably worth a try.
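
Untested, but roughly what I have in mind; the installer file name, paths and patch strip level below are just guesses on my part:

# Extract the 367.57 installer without running it (file name is an example)
sh NVIDIA-Linux-x86_64-367.57.run --extract-only
cd NVIDIA-Linux-x86_64-367.57

# Apply Zesty's buildfixes from debian/dkms_nvidia/patches/ to the bundled
# kernel module source (the strip level may need adjusting)
patch -p1 -d kernel < /path/to/buildfix_kernel_4.9.patch
patch -p1 -d kernel < /path/to/buildfix_kernel_4.9_amd64_only.patch

# Re-run the installer so it rebuilds the patched module for the running kernel
sudo ./nvidia-installer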

I patched as you suggested and managed to build the kernel module, but that Nvidia driver is too old for Debian 9. The version of Xorg in use there has a newer ABI, and I can’t downgrade Xorg without a gigantic mess.

I’d give 375.26 a try. It’s been solid for me, and I came from the 367.57 driver. I don’t game much, though; I’m more into 3D applications and simulations.

Thanks for the hint. I’ve tried it and the problems in Mad Max are the same. I verified and re-verified that with Victor Vran on Ubuntu 16.10 and 367.57 I consistently hit 100+ fps and everything feels very smooth, whereas any 37x.x driver is choppy, unplayable and never reaches that performance.

Have you tried:

Option "RegistryDwords" "OGL_MaxFramesAllowed=0x0"

(With that final digit in the range of 0, 1, 2 or 3.)

A lower number will give less input lag. The best number to use varies, depending on the game and the PC’s setup.
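
For reference, that option goes in the Device section of the X configuration; a minimal sketch (the file location and Identifier below are just examples):

# /etc/X11/xorg.conf.d/20-nvidia.conf (example location)
Section "Device"
    Identifier "NVIDIA GPU"
    Driver     "nvidia"
    Option     "RegistryDwords" "OGL_MaxFramesAllowed=0x1"   # try 0x0 through 0x3
EndSection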

Hi Psy-Q,
Can I get an nvidia-bug-report for your Ubuntu 16.10 system? I assume you are running Mad Max via Steam? Are any settings changed in game? Does this issue reproduce with the MSI GTX 1060 Gaming 6G or with any other GPU too? Can you share a video recording showing the frame drops? What tool are you using to check FPS? What is the repro rate, and how long do you need to play the game? Is sync-to-vblank on or off in nvidia-settings? What is the FPS if you turn sync-to-vblank off and run the game? Can you test with the nouveau driver blacklisted? What is the resolution of the monitor? Is it connected via DP, DVI or another port or dongle? Did you observe this issue on any other system as well?


I will try this, although it isn’t input lag of any kind. It’s just that everything is sluggish and choppy and feels indirect, and performance fluctuates a lot. I know pretty well what input lag feels like because I built virtual pinball cabinets (based on Windows), where tweaking the maximum number of buffered frames is heavily discussed and often done, so most of us can tell from the flipper response whether there is much buffering.

But this issue is different. Mad Max’s performance rubberbands between perhaps 15 and 40 fps where it should be a solid 60+, and Victor Vran is a messy 50ish where it should be a solid 100+.

I don’t know how to explain the effect, and I don’t have a 60 fps video camera with which I could reliably capture footage of it. No HDMI grabber either. It would be very plain to see if I could record it somehow.

Here it is:

https://www.dropbox.com/s/9owvjnpjsxeoebm/nvidia-bug-report.log.gz?dl=0

Note that on Ubuntu 16.10, I can’t reproduce the Mad Max performance fluctuations, only on Debian 9. On Ubuntu 16.10 it’s simply that e.g. Victor Vran performance is now 30% lower and less steady than with 367.57.

Yes, I’m not aware of any other way to run it.

Yes, I’m using the “Very High” graphics preset with vsync off to make results comparable.

I can get my hands on a GTX 960 to try with that.

No, I don’t have a 60 fps camera or an HDMI capture tool that would show it. I will try with a 30 fps camera to see if it’s visible there.

In Mad Max: The Steam FPS overlay.

In Victor Vran: The built-in FPS display.

In Tomb Raider: The built-in benchmarking tool.

Directly from the start of any of the games. I have save states at the start of all three. Tomb Raider doesn’t need a save since the benchmark mode is immediately available. So perhaps 5 minutes per game since the issue is visible from the very start.

I left it at the default as configured by the distribution, which is sync to vblank on, allow flipping on. I can try the tests with on and off.
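
For the on/off tests I plan to toggle it like this rather than editing the distro defaults (attribute and environment variable names as I understand them, so please correct me if they’re wrong):

# Turn sync to vblank off for the current X session
nvidia-settings -a SyncToVBlank=0

# Or force it off for a single launch via the driver's environment variable,
# e.g. as a Steam launch option:
__GL_SYNC_TO_VBLANK=0 %command%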

Yes, the way Ubuntu packages the Nvidia drivers, nouveau is automatically added to the blacklist. I verified this by hand.
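
(For completeness, the modprobe snippet that ends up doing the blacklisting looks roughly like this; the exact file name under /etc/modprobe.d/ differs between packages:)

# e.g. /etc/modprobe.d/blacklist-nouveau.conf (name varies by package)
blacklist nouveau
options nouveau modeset=0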

1920x1080@60 Hz, DVI. Only one monitor is connected and only to this DVI port.

I think I saw bad performance on another Z77 based system in Victor Vran before (~40 fps on Linux with GTX 960 vs. ~80 fps on Windows on the same hardware). It has a different CPU, I will test there.

I assume you are running Mad Max via Steam?
<<Yes, I’m not aware of any other way to run it.
I mean, are you running the game via the Steam client?

Does this issue reproduce with the MSI GTX 1060 Gaming 6G or with any other GPU too?
<<I can get my hands on a GTX 960 to try with that.
We’ll wait for this result.

Can you share a video recording showing the frame drops?
<<No, I don’t have a 60 fps camera or an HDMI capture tool that would show it. I will try with a 30 fps camera to see if it’s visible there.
Even a screenshot showing the maximum and minimum FPS numbers would be okay.

Is sync-to-vblank on or off in nvidia-settings?
<<I left it at the default as configured by the distribution, which is sync to vblank on, allow flipping on. I can try the tests with on and off.
We’ll wait for this testing result.

Did you observe this issue on any other system as well?
<<I think I saw bad performance on another Z77 based system in Victor Vran before (~40 fps on Linux with GTX 960 vs. ~80 fps on Windows on the same hardware). It has a different CPU, I will test there.
Please compare only Linux OS and driver combinations, not against Windows. We’ll wait for this testing result.

I’m working on the GTX 960 benchmarks now. Victor Vran is already confirmed to be choppy and to have only half the performance on Debian 9/375.39.

I can’t take screenshots of the fluctuations in Mad Max. The Steam FPS counter doesn’t show up in screenshots unless I use the desktop environment’s screenshot tool, but that defocuses Mad Max and gives me nothing. So I have to set a random capture delay, but that never catches the truly low FPS. FPS are between 28 (!!!) and 54.

Instead, I took a video. The screen recorder was unable to inject its OpenGL recording library into Mad Max (the game doesn’t start), so I had to do full-screen recording, which reduces performance a lot. But maybe you can see the bad stuttering and uneven performance:

I will work on the rest of the tests now, under Ubuntu 16.10.

I managed to grab a video using ffmpeg with NVENC on Debian 9. Ubuntu 16.10, unfortunately, doesn’t have a new enough ffmpeg version, and the one from the PPA is missing its amd64 packages, so I can’t record anything there.
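
In case someone wants to reproduce the capture, the command was roughly like this; I’m reconstructing the exact options from memory, so treat them as approximate:

# Grab the full 1080p X screen at 60 fps and encode on the GPU with NVENC
ffmpeg -f x11grab -framerate 60 -video_size 1920x1080 -i :0.0 \
       -c:v h264_nvenc -preset llhq -b:v 20M madmax-debian9.mp4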

But here the rubberbanding performance on Debian 9 is clearly visible, and you can see that we hover around the 60 fps mark instead of hitting 90 to ~115 fps like with 367.57:

Also, Victor Vran is at 30 - 50 fps instead of the 90 - 110 that I would normally get:

That stuttering/choppiness is also clearly visible.

I have updated the spreadsheet to include links to all the videos and screenshots. I’ve also received a report from Reddit user /r/Nel with a GTX 660, and he sees the ~50% performance reduction in Victor Vran as well. The Mad Max stuttering isn’t present there, however.

But I think this is as much proof as we can get that something happened in the 37x.x Nvidia driver line and/or the 4.9 kernel that is introducing bad performance, at the very least on Z77 chipsets.

I tested with a GTX 960, and Reddit user /r/Nel was able to reproduce it with a GTX 660; the data is in the spreadsheet. I have tested every single Nvidia driver that still builds on kernel 4.9 and they all show roughly the same degraded performance.

SteamOS 2.0 with its old kernel and driver performs spectacularly.

Please let me know if more testing is required or if there is anything else that you want me to try (e.g. compile a 4.9 kernel with a very specific .config).

We are tracking this issue under bug 200292921.

Hi All, which desktop environment are you running: Unity, GNOME, KDE or something else? Does this issue also reproduce with bare X?

Thanks, Sandip. I’ve updated the spreadsheet to say which desktop environment was used. It doesn’t seem to matter as it can be reproduced on XFCE, Unity, GNOME and KDE. I’ll attempt with just X.

I’ve tested one combination (Debian 9, kernel 4.9, 378.13) with bare X now and the issue is present there as well. I think testing the others would give the same results, but I can do that if you like.
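
For reference, by “bare X” I mean an X server with nothing running on it except the Steam client, started roughly like this (display and VT numbers are just examples):

# Start a second, bare X server running only Steam
xinit /usr/bin/steam -- :1 vt8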

For Victor Vran, the apparent driver regression comes from a fix in our driver.
The specification for glBufferStorage mentions:
‘GL_INVALID_VALUE is generated if flags has any bits set other than those defined above.’
We did not properly enforce that part before, and our fix now generates GL_INVALID_VALUE for such calls. As it turns out, Victor Vran is passing the flag value 0x10 to its glBufferStorage calls, which is an application bug.
Until a fix is made, there is no easy workaround that we can recommend.
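
To illustrate the constraint with a minimal sketch (this is not the game’s actual code, just the API contract described above):

#include <GL/glew.h>   /* sketch; any GL loader exposing glBufferStorage works */

static void demo_buffer_storage(GLsizeiptr size, const void *data)
{
    GLuint ok_buf, bad_buf;
    glGenBuffers(1, &ok_buf);
    glGenBuffers(1, &bad_buf);

    /* Valid: only bits defined for glBufferStorage are set. */
    glBindBuffer(GL_ARRAY_BUFFER, ok_buf);
    glBufferStorage(GL_ARRAY_BUFFER, size, data,
                    GL_MAP_WRITE_BIT | GL_MAP_PERSISTENT_BIT | GL_MAP_COHERENT_BIT);

    /* Invalid: 0x10 is GL_MAP_FLUSH_EXPLICIT_BIT, a glMapBufferRange-only flag,
       so a conforming driver generates GL_INVALID_VALUE and creates no storage. */
    glBindBuffer(GL_ARRAY_BUFFER, bad_buf);
    glBufferStorage(GL_ARRAY_BUFFER, size, data, 0x10);
}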

Thank you, Arthur. Is the issue still being tracked for the performance fluctuations in Mad Max? They were worst so far on Manjaro/Arch (9 - 102 fps, constantly alternating between the two). It may be an application bug as well. I will run the Tomb Raider benchmark on some configurations to see if I find more.

We haven’t yet determined whether the Mad Max performance regression has the same root cause, but that seems unlikely.

Hello Arthur,

Victor Vran developer here.
Thank you for your investigation.
We found the issue on our side, and we’ll fix it in the next game update.