Image tear with Kwin compositing and vsync ON

If you enable KWin compositing with vsync in its settings, you get image tearing: just one horizontal tear, but more prominent than the regular "vsync off" tearing.
How to reproduce: driver v310.19, KDE 4.9.3, desktop effects enabled (OpenGL compositing, vsync ON). The vsync setting in nvidia-settings has no effect.
This may be video-card dependent: I haven't seen such tearing on a 7900GS with 304.xx (I'm on a 650 GTX Ti with 310.19 now).

Please use the link below for more details:
[url]Invalid Bug ID

Part of comment 28 from there: "Other tested things for the record: glWaitVideoSync is broken (on nvidia)"


I had to change the nvidia-bug-report file's extension to attach it: 'File has an invalid extension, it should be one of jpg, jpeg, gif, tif, tiff, png, bmp, jps'

Hi AB,

glWaitVideoSync() works as intended, but doesn’t really provide a way to present in a tear-free way. The proper way to fix this on the KWin side would be to implement support for the new GLX_EXT_buffer_age extension.

Thanks,

  • Pierre-Loup
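
The buffer-age approach Pierre-Loup describes can be sketched in plain C. This is an illustrative model, not KWin code: `Rect`, `repaint_region()`, and the damage history are hypothetical names. With GLX_EXT_buffer_age, the compositor queries the back buffer's age each frame (0 means undefined contents) and repaints only the union of the damage from the frames the buffer is behind, instead of always doing a full repaint:

```c
/* Hypothetical damage rectangle; names are illustrative, not KWin's API. */
typedef struct { int x, y, w, h; } Rect;

/* Bounding-box union of two rectangles, enough for this sketch. */
static Rect rect_union(Rect a, Rect b) {
    int x1 = a.x < b.x ? a.x : b.x;
    int y1 = a.y < b.y ? a.y : b.y;
    int x2 = (a.x + a.w) > (b.x + b.w) ? (a.x + a.w) : (b.x + b.w);
    int y2 = (a.y + a.h) > (b.y + b.h) ? (a.y + a.h) : (b.y + b.h);
    Rect r = { x1, y1, x2 - x1, y2 - y1 };
    return r;
}

/*
 * Decide what to repaint this frame, given the back buffer's age as
 * reported by glXQueryDrawable(..., GLX_BACK_BUFFER_AGE_EXT, ...):
 *   age == 0 -> contents undefined, full repaint required;
 *   age == n -> buffer holds the frame from n swaps ago, so repaint the
 *               union of the damage from the last n frames.
 * history[0] is the current frame's damage, history[1] the previous, etc.
 */
static Rect repaint_region(unsigned age, const Rect *history, unsigned n,
                           Rect full_screen) {
    if (age == 0 || age > n)
        return full_screen;   /* unknown or too-old contents: repaint all */
    Rect r = history[0];
    for (unsigned i = 1; i < age; i++)
        r = rect_union(r, history[i]);
    return r;
}
```

The key point is that with the extension the compositor can swap the whole buffer (tear-free with vsync) while only re-rendering the damaged region.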

From the KWin bug tracker: 307965 – Upper part of windows tears when moving it left/right ONLY in upper part of display

Whatever the intention of "glWaitVideoSync" was (apparently not the way it was used in compositors), "glxinfo | grep GLX_EXT_buffer_age" comes up empty on NVIDIA 310.19, and, as reported in the review request, on Intel/Mesa as well. So does "grep GLX_EXT_buffer_age /usr/share/doc/nvidia/NVIDIA_Changelog". That is why we always attempt full repaints and swap the buffer.
https://git.reviewboard.kde.org/r/107198/
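
As an aside, grepping glxinfo output can false-positive when one extension name is a prefix of another, so code that probes for GLX_EXT_buffer_age at runtime should match full tokens in the space-separated string returned by glXQueryExtensionsString(). A minimal sketch (the helper name `has_extension` is my own, not from any driver or KWin source):

```c
#include <string.h>

/*
 * Return 1 if `ext` appears as a complete, space-delimited token in
 * `extensions` (e.g. the string from glXQueryExtensionsString()).
 * A plain strstr() or grep would also match "GLX_EXT_buffer_age" when
 * probing for a shorter name such as "GLX_EXT_buffer", so we check
 * token boundaries explicitly.
 */
static int has_extension(const char *extensions, const char *ext) {
    size_t len = strlen(ext);
    const char *p = extensions;
    while ((p = strstr(p, ext)) != NULL) {
        int starts = (p == extensions) || (p[-1] == ' ');
        int ends = (p[len] == '\0') || (p[len] == ' ');
        if (starts && ends)
            return 1;
        p += len;
    }
    return 0;
}
```

A compositor would call this once at startup and fall back to full repaints when the extension is absent, which matches the behavior described above.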

That extension isn’t in 310.19. It will be coming in a future driver release.

It was mentioned in another thread here.

Isn't there a way to put the NVIDIA devs and KWin devs in contact to sort this out once and for all?

My understanding is that this extension was only introduced to the OpenGL spec a few months ago and represents a collaborative effort from a variety of players to address these issues.

I’m sure KWin devs are aware of it… do you know if they’re working on an implementation?

Well, if you check the thread we both linked, it doesn't seem they're aware of it at all…

Edit: just saw the Griffais post in the same thread, thnx :)

I always use KWin vsync without NVIDIA vsync, because performance was bad when both were enabled.