⚠ [BUG] Transform filter CROPS instead of scaling

Thanks to the developers of the GeForce driver for Linux for your work on the transform-filter and nearest-neighbour scaling features.

Unfortunately, there is a serious issue with the transform-filter feature that makes it unusable with games in its current state: games are cropped.

I tested the transform-filter feature under Ubuntu 17.04. I used the following command to upscale FHD to 4K with no blur on my Dell P2415Q (4K) monitor with a GTX 650 Ti BOOST (2 GB) graphics card, using the GeForce beta driver 384.47:

nvidia-settings -a CurrentMetaMode="DP-1: 3840x2160_60 {ViewPortIn=1920x1080, ViewPortOut=3840x2160, ResamplingMethod=Nearest }"

The issue is that the OS appears to treat the ViewPortOut resolution, instead of the expected ViewPortIn one, as the currently selected display resolution. So the scaled image is actually CROPPED:

    [*] in the OS itself, the user effectively sees only the top-left part (1/4 when using FHD `ViewPortIn` and 4K `ViewPortOut`) of the full rendered image. So e.g. the main Ubuntu menu at the top-right corner of the screen is UNREACHABLE: you can’t even reboot without using `sudo reboot` via Terminal;

    [*] many games are cropped and generally unplayable too: for example, in Limbo, the only part of the full image visible to the user is the bottom-left part, which contains the last two items of the game’s menu (“Settings” and “Exit”) and the bottom half of the “Load chapter” item.

Some games work correctly (e.g. Oddworld Abe’s Oddysee New’n’Tasty).


For what it’s worth, at the same time, maximized windows of regular (non-gaming) desktop applications like Firefox occupy exactly the area visible to the user and don’t extend beyond it.

What we actually need is a completely VIRTUAL resolution TRANSPARENT to the OS, like with DSR, so that both the OS and applications see the VIRTUAL (ViewPortIn) resolution and work exactly as if that resolution had been selected via OS settings, with no cropping at all. Thanks.

Not sure if I posted the 384.47 beta issue I found in the right place, so I am piggybacking here.

https://devtalk.nvidia.com/default/topic/1018605/linux/384-47-drivers-8-gpus-detected-by-os-7-gpus-detected-by-nvidia-drivers/

This is a bug in the desktop environment. It sounds like it’s looking at the mode dimensions from the RandR extension when it should be looking at the “crtc” dimensions. You’ll see this same incorrect behavior with other drivers if you use the RandR transform options. I would suggest filing a bug with Ubuntu.
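
To make the mode-vs-CRTC distinction concrete, here is a minimal standalone sketch (my illustration, not code from any desktop environment) that prints the CRTC geometry, which per this description is what a desktop environment should treat as the logical screen size. With the MetaMode above, it should report 1920x1080 (ViewPortIn) even though the mode itself is 3840x2160:

/* crtcsize.c: print the geometry of active CRTCs via libXrandr.
 * Build: gcc crtcsize.c -o crtcsize -lX11 -lXrandr */
#include <stdio.h>
#include <X11/Xlib.h>
#include <X11/extensions/Xrandr.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy)
        return 1;
    XRRScreenResources *res = XRRGetScreenResources(dpy, DefaultRootWindow(dpy));
    for (int i = 0; i < res->ncrtc; i++) {
        XRRCrtcInfo *crtc = XRRGetCrtcInfo(dpy, res, res->crtcs[i]);
        if (crtc->mode != None) /* skip disabled CRTCs */
            printf("CRTC %d: %ux%u at +%d+%d\n",
                   i, crtc->width, crtc->height, crtc->x, crtc->y);
        XRRFreeCrtcInfo(crtc);
    }
    XRRFreeScreenResources(res);
    XCloseDisplay(dpy);
    return 0;
}

A desktop environment that instead sizes its UI from the mode dimensions ends up in exactly the cropped state described above.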

Thanks, Aaron. Do I understand correctly that:

    [*] a different desktop environment (such as GNOME instead of Unity, which is used by default in current versions of Ubuntu) would not be affected?

    [*] a similar non-blurry-upscaling implementation in the Windows driver would not be affected?

I just checked and GNOME 3’s gnome-shell gets this right. At least the version in Arch Linux does. I would imagine other libmutter-based desktop environments (e.g. Budgie) would work too.

I don’t know about Windows. The last time I tried to just run my TV at its native 4k resolution from my laptop, Windows told my game it was running at 1920x1080, actually ran it at something smaller than that, scaled it to 3840x2160 using some horrible blocky scaling, and then claimed the scaling factor was 100%. So if you can get it to scale correctly, please let me know! :)

Complaining aside, this particular issue is Unity getting confused by RandR, neither of which exists on Windows. So if it doesn’t work, it’s for its own platform-specific reasons.

Same issue with GNOME as with Unity

I tested the feature in the recent Ubuntu 17.10 (with GNOME as the desktop environment instead of the now-discontinued Unity) with the latest stable nVidia Linux driver, 384.90.

Unfortunately, games have exactly the same issue with GNOME as they had with the Unity desktop environment: the output image of many full-screen games is cropped.

So this is apparently not a desktop-environment-specific issue, but an issue with the nVidia driver itself or, more precisely, with the way the feature is currently implemented.

Possible solution — OS-transparent (DSR-like) scaling

To make the feature universally usable with all games, scaling should be done 100% transparently to the operating system, like DSR is. The operating system should believe it outputs a real resolution (as if no meta mode was defined at all) and should know nothing about scaling details such as the ViewPortOut resolution.

Instead of switching to a meta mode manually, explicitly and immediately, it should be possible to predefine the mode; the driver should then listen for OS requests to change the resolution and apply nearest-neighbour scaling automatically each time the resolution requested by the OS equals the predefined ViewPortIn. This way, nearest-neighbour scaling would be guaranteed to work with all full-screen games, regardless of how they switch to full-screen mode and when they change the resolution.
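
For illustration, a predefined mode of that kind might look something like this in the Screen section of xorg.conf, using the driver’s MetaModes option with the same values as my command above (the Identifier is just an example):

Section "Screen"
    Identifier "Screen0"
    # Render at 1920x1080 (ViewPortIn) and let the GPU upscale to
    # 3840x2160 (ViewPortOut) with nearest-neighbour resampling.
    Option "MetaModes" "DP-1: 3840x2160_60 {ViewPortIn=1920x1080, ViewPortOut=3840x2160, ResamplingMethod=Nearest}"
EndSection

The open question is then exactly the one above: whether a full-screen game’s request for 1920x1080 would actually trigger this predefined mode.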

Tested games and screenshots

Some specific examples of tested games and results:

Rendered game output is cropped; only the bottom-left 1/4 is visible:

Work fine:

    [*] Oddworld New'n'Tasty;
    [*] Trine 2;
    [*] Half-Life 2, although its menu items do not react to the mouse cursor, probably because the real (as opposed to visible) position of the cursor is closer to or farther from the top-left screen corner in proportion to the scaling ratio (see the worked example after this list).
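
To illustrate that hypothesis with made-up numbers: with 1920×1080 upscaled to 3840×2160, the ratio is 2 on each axis, so a cursor drawn on screen at (1000, 500) would correspond to the game-coordinate position (500, 250). The game then tests its menu hit areas against (500, 250) while the user sees the cursor at (1000, 500), so nothing under the visible cursor reacts.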

Other specifics:

    [*] “Euro Truck Simulator 2” and “Rogue Stormers” run at 1920×1080 output directly to the monitor, bypassing GPU-powered scaling entirely, so there is blur caused by the monitor’s own scaling;
    [*] “The Cave” works fine in general, but in its settings, the 3840×2160 resolution was selected instead of the real effective resolution of 1920×1080, and nothing changed visually after switching to 1920×1080 except the formally selected game resolution.

To be fair, there is an improvement in GNOME compared with Unity in terms of nVidia-driver meta modes: the top-right corner of the OS’s main toolbar (which contains the UI for e.g. shutting down and rebooting the PC) is not hidden anymore, so the OS itself is usable. Still, this does not help with games at all.

Thanks.

I think for many of these games, the problem is similar to the problem in Unity: they’re reading RandR information directly and misinterpreting it.

Years ago, the driver actually did have a mechanism in it to lie to applications about the true screen configuration. Inside the driver, each mode had “front-end” and “back-end” display timings, where the front-end timings were what the application selected and saw as the current mode, and the back-end timings were what was actually programmed in the hardware. This led to endless problems and confusion, because people couldn’t figure out how the driver was deciding which back-end mode to select, how to influence that selection, etc. The scaling and overscan compensation parts were also inflexible and complicated to implement, and the fact that aspects of the front-end timings such as the refresh rate were ignored just added to the confusion.

When RandR 1.2 came along, it explicitly gave control of all of that to the desktop environment. This follows a general principle with X.Org that the server and its drivers should provide mechanism, not policy. So we ripped out all of the front-end / back-end timing stuff and all of the code to lie to applications about the real configuration. The result is the much more flexible and powerful system we have now that allows you to achieve things like the “aspect scaled but locked to integers” policy without having to wait for us to add it specifically. The downside to this greater power is greater responsibility: desktop environment and application developers need to understand what RandR gives them and how to use it correctly.

Since it’s clear that many applications don’t handle scaling properly, I think it does make sense to add a mechanism to be able to hide the scaling configuration and present it in a way that the application might understand. But I don’t think it’s the driver’s job to do that because it reminds me of the bad old days of front-end & back-end timings. In my opinion, this would be better done with an LD_PRELOAD library that intercepts libXrandr calls and fakes the replies.
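
As a rough, untested sketch of that idea (which call to intercept, and the FAKE_W/FAKE_H knobs, are just illustrative):

/* fakerandr.c: sketch of an LD_PRELOAD shim that fakes a libXrandr reply.
 * Build: gcc -shared -fPIC fakerandr.c -o fakerandr.so -ldl
 * Use:   LD_PRELOAD=./fakerandr.so FAKE_W=1920 FAKE_H=1080 ./game */
#define _GNU_SOURCE
#include <dlfcn.h>
#include <stdlib.h>
#include <X11/Xlib.h>
#include <X11/extensions/Xrandr.h>

typedef XRRCrtcInfo *(*get_crtc_info_fn)(Display *, XRRScreenResources *, RRCrtc);

/* Override XRRGetCrtcInfo: forward to the real function, then rewrite the
 * reported CRTC size with the values from the environment. */
XRRCrtcInfo *XRRGetCrtcInfo(Display *dpy, XRRScreenResources *res, RRCrtc crtc)
{
    static get_crtc_info_fn real;
    if (!real)
        real = (get_crtc_info_fn)dlsym(RTLD_NEXT, "XRRGetCrtcInfo");

    XRRCrtcInfo *info = real(dpy, res, crtc);
    const char *w = getenv("FAKE_W"), *h = getenv("FAKE_H");
    if (info && w && h) {
        info->width  = (unsigned int)atoi(w);
        info->height = (unsigned int)atoi(h);
    }
    return info;
}

A real library would need the same treatment for the mode list returned by XRRGetScreenResources and for the older RandR 1.1 calls, depending on which of them a given game misreads.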

I should mention that your suggestion of pre-defining configurations with the filtering setting baked in actually already exists: it’s the list of MetaModes you can see in the advanced display config tab in nvidia-settings, and the corresponding list of RandR 1.1 sizes you can see when you use the old queries with “xrandr --q1”. Applications of that vintage actually behaved the way you suggest, selecting a “mode” from the RandR 1.1 list and automatically getting the MetaMode the driver defined behind the scenes.
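
For reference, that old interface can be exercised like this (the size index is just an example; the actual indices come from the query output):

# list the RandR 1.1 sizes (one per MetaMode defined by the driver)
xrandr --q1
# switch to size index 1; the driver applies the whole corresponding
# MetaMode behind the scenes, including ViewPorts and ResamplingMethod
xrandr -s 1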

The problem is that there was a concerted push back when RandR 1.2 was new to move applications away from the implicit RandR 1.1 stuff and to the new, more explicit RandR 1.2 interface that cuts through all of that.

Is such a library meant to be implemented by the nVidia team?

Same with the recent driver 387.34.

Same issue with the latest 390.25 under Ubuntu 17.10.1.

The feature works correctly only with games in windowed mode, works with some games in pseudo-full-screen mode (apparently just maximized, not really full-screen), and does not work at all with really full-screen games like “Euro Truck Simulator 2” and “Rogue Stormers” (the in-game resolution is output directly to the monitor, resulting in the monitor’s own blur being added).

Aaron, is there some progress, or are there at least plans to fix this, now 8.5 months after the feature was implemented? Thanks.

[The comment moved to a separate thread since performance drop is a different issue.]

Aaron, is there any news? Thanks.

I tested 35 games for their playability with the Nearest mode enabled via the Transform-Filter feature of the nVidia driver for Linux; the results are below.

In a nutshell:

    [*] 3/5 of games are playable to some extent in full-screen mode.
    [*] 1/4 of games are playable to some extent in windowed mode.
    [*] 1/9 of games can’t be played at all with no blur.
    [*] 1/4 of games are cropped in full-screen mode.
    [*] 1/7 of games output the signal directly to the monitor in full-screen mode and are blurrily upscaled by the monitor.

Details:

4 games (11%) cannot be played at all with Transform Filter enabled — neither in full-screen mode nor in windowed mode.

13 games (37%) cannot be played in full-screen mode with Transform Filter enabled. 9 of them (26%) can be played in windowed mode. For 2 (6%) of those playable in windowed mode, the bottom area of the window is hidden because the window title bar occupies some vertical space, and the hidden area contains controls critical to playing.

5 games (14%) use true full-screen mode, i.e. output the video signal directly to the monitor, thus bypassing GPU-powered scaling, and are blurrily upscaled by the monitor itself.

8 games (23%) are cropped in full-screen mode, so only a part of the rendered image is visible. 1 of them (3%) (Syder Arcade) is cropped in full-screen mode by default, but starts working if the resolution is switched to ViewPortIn via game settings before enabling Transform Filter.

2 games (6%) (F1 2015 and GRID Autosport, both ported to Linux by Feral Interactive) that support windowed mode cannot be played in windowed mode because the window is always forcibly centered within the physical-resolution area while the mouse cursor is hidden by the game, so the window cannot be moved manually into the visible screen area.

22 games (63%) are playable in full-screen mode, some of them with limitations.

In 3 games (9%), the user interface is inaccessible to the mouse cursor in full-screen mode. In 2 of them (6%), the mouse cursor starts working if the resolution is switched to ViewPortIn via game settings before enabling Transform Filter.

At least 5 games (14%) are rendered at the physical monitor resolution by default and then downscaled to the ViewPortIn resolution, and therefore waste performance unless the resolution is switched to ViewPortIn via game settings.

At least 1 game (3%) (F1 2017) has a serious, noticeable performance drop in full-screen mode with Transform Filter enabled compared with regular full-screen mode, and this happens regardless of the resampling method.

For details, see the table in the “Tested games” section of my comprehensive, constantly updated “Nonblurry integer-ratio scaling” article, which has its first anniversary today.

Aaron, what is the progress on support for true-full-screen non-blurry scaling, and especially on support for the feature in the Windows version of the nVidia driver, 15 months after the first implementation, limited to windowed mode, shipped in the Linux driver? Thanks.

Maybe lmiddlebrook?

Is there progress 1.5 years after first limited implementation? Or should I create a new thread in a different forum section?

?..

XFCE is affected too, in addition to GNOME and Unity. And there is nothing for Windows.

So we can presumably say that for a very large majority of users, the “scaling feature” introduced by nvidia doesn’t work / isn’t available.

By the end of 2018, Nvidia still doesn’t have any reliable way of scaling without blur on 4K monitors (for the record, these monitors have been available on the market since 2013…)
sigh

Hi everyone. Any movement on this? I have recently been experimenting with integer scaling.

Related question: Is it possible to use integer scaling, but scale the vertical and horizontal by different integers? Specifically, I wonder whether the following command works on my 2560x1440 monitor and RTX 2080 with driver 430.5:

nvidia-settings --assign CurrentMetaMode="DP-2: 2560x1440 { ViewPortOut=1600x1200+480+120, ViewPortIn=320x200, ResamplingMethod=Nearest }"

This, I think, scales a 320x200 image to 1600x1200, scaling the horizontal by 5 (1600 = 320x5) and the vertical by 6 (1200 = 200x6), so 1 pixel becomes a rectangular block of 5x6 pixels, centered on the screen by the +480+120 offsets.

Is that correct? I can explain my motivation if anyone is interested.