Stereoscopic 3D with GeForce on Linux

Hello,

I was redirected here by the support at nvidia.custhelp.com.

I just wanted to get the contents of an xorg.conf file from a laptop where stereoscopic 3D output via HDMI to a 3D TV is successfully enabled.

But the support told me that stereo 3D is supported only with Quadro cards on Linux and then suggested asking in this forum.

As the driver I’m using (370.28) has the stereo option, I thought it was supported on every card (mine is a GeForce GTX 950M). When I add stereo with sudo nvidia-xconfig --stereo=12 (or 8, both of which should be supported by my TV), it even tries to activate, but then gives me the error “Stereo not supported with NoScanout; disabling Stereo”. But I didn’t activate the NoScanout option; the driver seems to activate it by itself, and I don’t know how to force it off.
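For reference, as far as I can tell nvidia-xconfig just adds the option to the Screen section of xorg.conf; the relevant part looks roughly like this (identifiers are generic placeholders, not from a verified working setup):

Section "Screen"
    Identifier     "Screen0"
    Device         "Device0"
    Monitor        "Monitor0"
    DefaultDepth    24
    Option         "Stereo" "12"
    SubSection     "Display"
        Depth       24
    EndSubSection
EndSection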

This is why I wanted a working xorg.conf, to see how the settings should be for stereoscopic 3D to work. And I think it’s strange that stereo is working only with Quadro cards. Why not with GeForce?

Could someone please share with me a working xorg.conf with Option “Stereo” “12” (or “8”)? Thank you very much in advance.

“NoScanout” mode is enabled automatically on chips that don’t have any display connectors. For a laptop, this probably means that your displays are connected to the Intel chip and you’re using an optimus / “prime” configuration.

You’re correct that stereo is only supported on Quadro, so it wouldn’t work anyway even if your HDMI port did go to the NVIDIA GPU.
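One way to confirm that is to list the RandR providers; on a typical Optimus laptop the NVIDIA provider reports no outputs of its own (illustrative only, exact output varies by setup):

xrandr --listproviders
# If the NVIDIA provider shows "outputs: 0", all connectors (including the
# HDMI port) are wired to the Intel GPU, which is what triggers NoScanout.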

Hello Aaron,

Thanks for the reply. It’s a pity that things which are common on Windows are impossible to get on Linux. Just one question: have you considered workarounds like side-by-side output for OpenGL applications? I know it works on Linux (e.g. Dolphin).

Hi mates,

The intrinsic problem on FOSS platforms is that quad buffering is blocked by nvidia. I know it’s built into the GeForce kernel modules (nvidia’s proprietary driver). Whenever the question of supporting stereoscopics on GeForce cards came up, nvidia told us “supported on Quadro only”. That point would be comprehensible, since FOSS can support stereoscopics on its own. The really awful thing, after years of “complaints”, is that nvidia now supports OpenGL quad buffering on the Windows platform, but not on the others. AFAIK you need an API key to unlock quad buffering there. Quad buffering is essential and belongs in a platform-independent API.
Stereoscopic development is still largely research, and most of the development on 3D in general has come from FOSS, so it’s incomprehensible not to have open* support on an open* platform.
Why we get a crippled API is still nvidia’s “secret”. Well, it’s no secret; it’s evident.

nvidia should come back to its roots and release its driver with full quad-buffer support, nothing else.
As nvidia builds the hardware, it’s the natural right of any customer to have a standard API for it;
otherwise something essential is missing.
Supporting stereoscopics within FOSS is the task of the FOSS community, but the crippled API makes that impossible in a (platform-)independent way.
So nvidia, please put your insecurity aside and unlock the quad buffering “feature” on Linux and co. even for GeForce cards, like your competitors do!

Thanks in advance

10.10.2016
“Added support for NVIDIA 3D Vision 2 Stereo on Linux. This IR emitter can be used with stereo mode “10” set in the X configuration file.”
This sounds like we finally got quad buffering, which would be excellent.
I’ll tinker around with it and give some feedback here within a couple of days.

@raeten: It seems this still applies to Quadro cards only; here is my Xorg.0.log using 367.57 with a GK104M:

[    15.393] (WW) NVIDIA(0): Stereo is only available on Quadro cards
[    15.393] (II) NVIDIA(0): Disabling stereo.

I’ve used Stereo mode 10 and have a USB-connected 3D Vision 2 stereo emitter built into a laptop (which works fine in Windows).

@olifre, yes, you are right. Still no quad buffering on non-Windows platforms without a Quadro.
Windows got quad buffering even with GeForce cards a few years ago, so “supported on Quadro only” became “deprecated”; in fact, there never was a technical reason for it. It still looks like an excuse.

(well, I see only one reason, but it’s a really ugly, greedy one…)

So let me rant a bit about why this is a huge problem; I’m trying to explain it in detail just for documentation. We used to know nvidia as the “OpenGL company” (not anymore?), because their developers did a lot of brilliant work for the FOSS community in the past (and vice versa). But there was a noticeable break in that around 2010.
Don’t get me wrong: I use Linux (+ BSD + *unix) for everything, but I’m not bashing Windows or Mac (users or developers), because we always need variety.

I am working once in a while on this:

which is platform-independent, and I only want to extend it in a platform-independent way, but I can’t without quad buffering. Quad buffering is the cheapest and most beautiful way to rewrite code for any kind of stereoscopics in 3D software. At the very least, quad buffering is really needed in conjunction with shutter glasses. Other kinds of stereoscopic projection don’t strictly need it, but in the sense of a platform-independent API, quad buffering IS the standard. So why block it on FOSS with GeForce, but not on Windows with GeForce? That’s really a shame.
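To make concrete what is being withheld: with quad buffering exposed, the whole application side is little more than the following. This is a minimal GLX sketch with window/context setup trimmed away, not code from my engine:

/* Minimal quad-buffered stereo sketch (GLX). Assumes an X display and a
 * driver that actually exposes a stereo-capable visual; error handling is
 * reduced to the bare minimum. */
#include <GL/gl.h>
#include <GL/glx.h>
#include <stdio.h>

static int stereo_attribs[] = {
    GLX_RGBA,
    GLX_DOUBLEBUFFER,
    GLX_STEREO,          /* the "quad buffer": back/front x left/right */
    GLX_DEPTH_SIZE, 24,
    None
};

void draw_stereo_frame(Display *dpy, Window win, void (*draw_eye)(int is_left))
{
    glDrawBuffer(GL_BACK_LEFT);      /* render the left-eye view */
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    draw_eye(1);

    glDrawBuffer(GL_BACK_RIGHT);     /* render the right-eye view */
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    draw_eye(0);

    /* One swap flips both eyes atomically, so the eye order can never
     * drift on dropped frames, which is exactly what shutters need. */
    glXSwapBuffers(dpy, win);
}

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy)
        return 1;

    XVisualInfo *vi = glXChooseVisual(dpy, DefaultScreen(dpy), stereo_attribs);
    if (!vi) {
        /* The same failure glxgears -stereo reports on a GeForce today. */
        fprintf(stderr, "no stereo visual available\n");
        return 1;
    }
    /* ... create the window and context from vi, then call
     * draw_stereo_frame() in the render loop ... */
    return 0;
}

That is the entire “feature” from the application’s point of view.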
By the way, Quadro cards don’t fit all needs in conjunction with 3D; they are very specialised and too expensive for general 3D (+ stereoscopics) applications.

Stereoscopics with nvidia shutters is “working” with my patch even with GeForce cards on Linux and stereo mode 10 (it works with Windows and Mac as well). Mode 10 makes my Acer monitor (with built-in IR transmitter) send the vsync-based signal to the glasses. I assume the shutters work more correctly on Windows with quad buffering enabled, but I haven’t had feedback yet (I prefer IRC…).

But without quad buffering I cannot ensure that the correct eye (right or left) is rendered “first”. The result: every dropped frame (and there are always dropped frames on any machine) changes which eye comes first relative to the IR signal, and that is the actual problem in conjunction with shutters. Normally we don’t want to care about swapping the buffers anymore, but without a quad buffer we have to control it much more directly. I also got an external IR transmitter working on Linux, with the same problem.

Workarounds without quad buffering are platform-specific, and I want to avoid those.
A “simple” solution would be to use NVAPI with something like

NvAPI_GetVBlankCounter (which also looks Windows-only; why?)

Reading the count of dropped frames and recalculating the “first eye” from it would be a solution, but as mentioned I don’t want those GPU- and platform-dependent tricks under the roof of a platform-independent engine, and checking for dropped frames is “costly” anyway: it’s about losing 2-4 ms in the render process, which is usually too much.
Quad buffering is what is needed to have beautiful code at all.
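Just to sketch the workaround I mean (and why I don’t want it inside the engine): track a vblank counter against the frames we actually swapped and flip the eye assignment whenever an odd number of vblanks was missed. query_vblank_count() below is a hypothetical placeholder, not a real API; on Windows it could be backed by NvAPI_GetVBlankCounter, and on Linux I don’t know a portable equivalent:

/* Sketch of the "resync the first eye" workaround described above.
 * query_vblank_count() is a hypothetical placeholder, NOT a real API:
 * it stands for whatever platform call returns the display's
 * vertical-blank counter. */
#include <stdint.h>

extern uint64_t query_vblank_count(void);  /* placeholder, platform-specific */

static uint64_t last_vblank = 0;
int left_eye_first = 1;   /* read by the render loop to pick the next eye */

/* Call once per presented frame, right after the buffer swap. */
void resync_eye_parity(void)
{
    uint64_t now     = query_vblank_count();  /* the "costly" readback */
    uint64_t elapsed = now - last_vblank;     /* vblanks since the last swap */
    last_vblank = now;

    /* With vsync on, a clean frame advances exactly one vblank. Every extra
     * vblank is a dropped frame that shifts the left/right sequence seen by
     * the shutter glasses by one field; an odd number of extras means the
     * eyes are now swapped and the notion of the "first eye" has to flip. */
    if (elapsed > 1 && ((elapsed - 1) & 1))
        left_eye_first = !left_eye_first;
}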

So why the heck does nvidia still block quad buffering on Linux + GeForce?
Have the OpenGL coders been fired? By now it looks like it.

To my understanding, “3D Vision” is the bundle of hardware (and the driver hooking into Direct3D engines) for gaming on Windows. That’s okay with me.
Supporting quad buffering and treating the shutters as plain shutter glasses is not really “3D Vision”; we will take care of stereoscopics in open-source software ourselves, and all I (and the other folks here) need is simply quad buffering, which I’m quite sure is already in the driver but blocked artificially.
BUT:

glxgears -stereo
Error: couldn't get an RGB, Double-buffered, Stereo visual

I’ve been faithful to nvidia over the years; maybe it’s time to switch to another GPU manufacturer that already offers quad buffering as standard. And yes, there is one.
That will be my personal “solution” when the time for next-gen hardware arrives and I’m still unable to use quad buffering.
The way to keep people faithful to a company is good support, not excluding anyone.

I would still like to know why side-by-side stereoscopic 3D is not implemented at the driver level. It would be perfect for most (if not all) 3D TVs, ‘3D Ready’ projectors, and VR headsets, and it doesn’t require any additional buffering: just the same scene rendered simultaneously from two points of view at half the horizontal resolution. It would work with 100% of 3D games without the need to specially adapt/patch/mod them in any way.
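To illustrate how little the driver would conceptually have to do per frame, here is a rough OpenGL sketch of a side-by-side pass; draw_scene() and apply_camera_offset() stand in for whatever the game already renders:

/* Rough side-by-side stereo sketch: render the same scene twice, into the
 * left and right halves of the framebuffer, with the camera shifted by half
 * the eye separation each way. draw_scene() and apply_camera_offset() are
 * placeholders for the application's normal render path. */
#include <GL/gl.h>

void draw_scene(void);                /* placeholder: the normal mono render */
void apply_camera_offset(float dx);   /* placeholder: shift the view matrix */

void render_side_by_side(int width, int height, float eye_sep)
{
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

    /* Left eye: left half of the framebuffer, camera shifted left. */
    glViewport(0, 0, width / 2, height);
    apply_camera_offset(-eye_sep * 0.5f);
    draw_scene();

    /* Right eye: right half, camera shifted right. The TV stretches each
     * half back to full width, which is where the halved horizontal
     * resolution comes from. */
    glViewport(width / 2, 0, width / 2, height);
    apply_camera_offset(+eye_sep * 0.5f);
    draw_scene();
}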


WTF?!!!

I’ve bought two types of glasses (3D Vision 1 and 2), I’ve bought an Asus stereo monitor, and I’ve got a GTX 660 Ti video card.

And now I see that Nvidia 3D Vision only works with Quadro cards on Linux?

WTF? Are you fucking kidding!?

Why can’t users have stereo on Linux with this hardware, as they can on Windows?!!!

This is nuts. Why no stereoscopic 3D on Linux with GeForce?

Come on nvidia - I have Windows 10 and I hate it. The only thing I keep it for is stereoscopic 3D and I’d love to ditch it completely. Linux is far more stable.

PLEASE GET STEREOSCOPIC 3D WORKING ON GEFORCE!!!