Unrecognized FlatPanelProperties property "Scaling"

Debian Sid, kernel 3.10, NVIDIA 325.15, 64-bit
/var/log/Xorg.0.log:

(**) NVIDIA(0): Option "FlatPanelProperties" "Scaling = Aspect-Scaled"
(WW) NVIDIA(0): Unrecognized FlatPanelProperties property "Scaling";
(WW) NVIDIA(0):     ignoring.

Any ideas?

Looks like a bug, since the documentation clearly lists it as a valid property:

Option "FlatPanelProperties" "string"

    This option requests particular properties for all or a subset of the
    connected flat panels.

    The option string is a semicolon-separated list of comma-separated
    property=value pairs. Each list of property=value pairs can optionally be
    prepended with a flat panel name and GPU specifier.

...

    Recognized properties:

       o "Scaling": controls the flat panel scaling mode; possible values are:
         'Default' (the driver will use whichever scaling state is current),
         'Native' (the driver will use the flat panel's scaler, if possible),
         'Scaled' (the driver will use the NVIDIA GPU's scaler, if possible),
         'Centered' (the driver will center the image, if possible), and
         'aspect-scaled' (the X driver will scale with the NVIDIA GPU's
         scaler, but keep the aspect ratio correct).

...

    Examples:

        Option "FlatPanelProperties" "Scaling = Centered"

    set the flat panel scaling mode to centered on all flat panels.

Internal bug 1363966 is filed on this.

The Scaling option was removed when the front-end / back-end timing stuff was removed, but it looks like the documentation wasn’t updated at the time. The documentation should be up to date in the next release. Sorry about that.

Finally - here’s the reason why Kepler GPUs have significant tearing.

However, how are we supposed to configure Scaling now? Or have you removed this feature entirely? It’s still available in Windows, which is kind of puzzling.

Scaling is still there, it’s just now configured explicitly via RandR 1.2 or via the ViewPortOut and ViewPortIn attributes in the MetaMode string. Scaling should have nothing to do with tearing.
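
For example, a rough sketch of an aspect-scaled setup using those MetaMode attributes could look like this in the "Screen" section of xorg.conf (the DFP-0 name and the 1280x720 / 1920x1200 sizes are just placeholders for illustration):

    # Scale a 1280x720 desktop up to 1920x1080 (preserving 16:9) and center it
    # vertically in the panel's native 1920x1200 raster.
    Option "MetaModes" "DFP-0: 1920x1200 { ViewPortIn=1280x720, ViewPortOut=1920x1080+0+60 }"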

I confess it’s a bit off topic for this thread, but anyway …

Tearing … good point.

Aaron, any chance we will receive feedback on this?
Screen/Video tearing 7xx(Kepler), 9xx(Maxwell), 10xx(Pascal) in almost all applications, including desktop - Linux - NVIDIA Developer Forums
EVGA GeForce GTX 660 Ti Problems! - Linux - NVIDIA Developer Forums
It didn’t happen with the same configuration and my GTX 460; it only started after I replaced it with a GTX 660 Ti.

I would be really thankful if someone could finally spend a few words on this…

No, I don’t have suggestions there. Those threads are kind of all over the place so it’s hard to say. I should note that without an OpenGL-based composite manager, X rendering is not expected to be tear-free in general. VDPAU can do tear-free rendering when it’s using the overlay presentation queue, but Xv and VDPAU with the blit presentation queue just do a best-effort wait for vblank and can still tear, for example if your video mode’s vblank region is very small and the system can’t quite get the rendering ready in time. Systems with high resolution displays using reduced blanking timings are becoming more common, so that might explain why people are seeing more tearing.
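
For what it’s worth, routing playback through VDPAU instead of Xv is usually just a matter of the player’s output selection; a typical mplayer invocation, assuming a build with VDPAU support and an H.264 or MPEG-2 file, would be something like:

    # decode and present via VDPAU (the trailing comma lets mplayer fall back to its default decoders)
    mplayer -vo vdpau -vc ffh264vdpau,ffmpeg12vdpau, movie.mkv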

You could try enabling the ForceCompositionPipeline and/or ForceFullCompositionPipeline options in the MetaMode to see if that helps.
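
In xorg.conf that would look roughly like the following (the DFP-0 name and the nvidia-auto-select mode are just placeholders; adjust them to your display):

    Option "MetaModes" "DFP-0: nvidia-auto-select { ForceFullCompositionPipeline = On }"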

This issue affects only Kepler-based GPUs, so why is a tear-free experience still readily available on pre-Kepler GPUs?

What is it that has changed so drastically in this hardware? Maybe Kepler lacks some hardware synchronization features?

I can swap my GTX 660 for an 8800 GT and the tearing is gone immediately, on the very same PC.

There’s something you aren’t telling us, and I wonder what that is.

Big thanks for the reply, Aaron!

Up front I want to confirm again what birdie said: I had a GTX 460 before on exactly the same PC / openSUSE 11.4, with the same driver at the time, and never saw tearing. After I replaced only the graphics board with a GTX 660 Ti, the tearing started in Flash videos and e.g. VLC with Xv output. VDPAU still worked without tearing in mplayer.

OK, here’s the good news: in both cases, success!
Using either
    ForceCompositionPipeline = On
or
    ForceFullCompositionPipeline = On
the tearing is gone!
(I tested it briefly in VLC with Xv output and a very good test movie / DVD; it simply worked.)

So thank you very much! I didn’t know these two options could be set.

A working solution now!!!

OK, and here is the P.S.: I also suspect something changed specifically in the Kepler architecture, and it’s worth mentioning that the “tearing” looked very peculiar, like a checkerboard. I’ve attached the old screenshot here again and hope a new driver may eventually fix this.


See what I mean?

To add: I benchmarked with Unigine Valley, and this of course has a significant impact on performance, roughly 30% less. So it’s only a workaround for watching videos. Happily, games (OpenGL) are NOT affected by tearing for me.
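
As a possible compromise (just a sketch; the DFP-0 name is a placeholder, and it assumes the driver lets you reassign CurrentMetaMode at runtime), the pipeline can also be toggled on the fly with nvidia-settings, so it only costs performance while actually watching video:

    # enable the composition pipeline before watching a video
    nvidia-settings --assign CurrentMetaMode="DFP-0: nvidia-auto-select { ForceFullCompositionPipeline = On }"

    # switch back to the plain mode before gaming
    nvidia-settings --assign CurrentMetaMode="DFP-0: nvidia-auto-select"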

Kepler and earlier GPU architectures are dramatically different. Since, as I mentioned, the sync-to-vblank mechanisms for Xv and the VDPAU blit presentation queue are best-effort, small changes in timing anywhere in the system can cause tearing to appear or disappear. That’s why I can’t speculate as to why, on your particular system, one GPU behaves differently from another.

Sorry if I sound a bit stubborn…

Partially agreed. Tearing and vsync on X and KDE etc. are a difficult matter. But I have a long history of NVIDIA cards, starting with a Riva TNT2, then GeForce2, 4600, 5600 XT, 8800 GTX, GTX 460 and now the GTX 660 Ti, the last three on the same PC. On none of the earlier models did I ever see this; videos were fine. And it makes me wonder, if it’s GPU-specific, why my 8800 GTX and GTX 460 just “hit” the perfect timing for vsync while the 660 Ti now does not.
In addition, if you look at the photo I provided, it just looks a bit weird, not the “usual pattern” of tearing (which is usually a single line). With Kepler it is a broader band with some squares in it.
I believe Kepler is a very good and complex architecture. But from what I can tell, this is still specific to Kepler, and improving on this topic would be in the interest of both users and NVIDIA.

So maybe someone at NVIDIA could consider having a closer look into that… it is just my suggestion. I would appreciate that and would of course deliver some (hopefully) reproducible test cases.

Anyway, thanks for the hint above.

So far, the GPU has been solely responsible for the tearing here. From your words I can deduce that the NVIDIA drivers don’t have any special hacks or tricks to force sync-to-vblank in non-OpenGL rendering, that it was just a hardware feature of the older GPUs, and that for some reason the part of the GPU responsible for ensuring a tear-free experience is simply not present in the Kepler architecture. It’s really sad, because I refuse to run an OpenGL compositor since I’ve never seen one that is completely reliable (Windows’ dwm.exe, on the contrary, has never crashed for me, but there’s no way I can run it on Linux, even under Wine).

Thank you for your comments.