xorg-server-1.17.* on an Optimus laptop doesn't start when the NVIDIA OpenGL implementation is selected

Hi,

I’m an unhappy owner of an Optimus laptop, which I run under Gentoo Linux (kernel 3.17.4).

I managed to configure the system to use my NVIDIA card for offscreen computations, as shown here:
http://us.download.nvidia.com/XFree86/Linux-x86/319.12/README/randr14.html
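
For reference, the xorg.conf used here follows the layout from that README. A minimal sketch of such a configuration (the identifiers match the log below; the BusID is illustrative and should be taken from lspci on your own machine):

Section "ServerLayout"
    Identifier "layout"
    Screen 0 "nvidia"
    Inactive "intel"
EndSection

Section "Device"
    Identifier "nvidia"
    Driver "nvidia"
    BusID "PCI:1:0:0"    # illustrative; check with lspci
EndSection

Section "Screen"
    Identifier "nvidia"
    Device "nvidia"
    Option "AllowEmptyInitialConfiguration"
EndSection

Section "Device"
    Identifier "intel"
    Driver "modesetting"
EndSection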

After updating xorg-server to 1.17.* (both 1.17.0 and 1.17.1), Xorg fails to start.
I found a workaround: selecting the xorg-x11 OpenGL implementation instead of NVIDIA’s, but I’d like to use NVIDIA’s OpenGL.

The problem is that the X server doesn’t start, crashing with the following error output:

[  4556.379] 
X.Org X Server 1.17.1
Release Date: 2015-02-10
[  4556.384] X Protocol Version 11, Revision 0
[  4556.386] Build Operating System: Linux 3.17.4-gentoo x86_64 Gentoo
[  4556.387] Current Operating System: Linux szldlc 3.17.4-gentoo #2 SMP PREEMPT Thu Jan 22 15:14:07 CET 2015 x86_64
[  4556.387] Kernel command line: root=/dev/sdb2 video=uvesafb:mtrr:4,ywrap,1920x1080-32@100 acpi_osi=Linux acpi_backlight=vendor
[  4556.391] Build Date: 13 February 2015  03:24:53PM
[  4556.392]  
[  4556.394] Current version of pixman: 0.32.6
[  4556.397] 	Before reporting problems, check http://wiki.x.org
	to make sure that you have the latest version.
[  4556.397] Markers: (--) probed, (**) from config file, (==) default setting,
	(++) from command line, (!!) notice, (II) informational,
	(WW) warning, (EE) error, (NI) not implemented, (??) unknown.
[  4556.404] (==) Log file: "/var/log/Xorg.0.log", Time: Fri Feb 13 17:05:09 2015
[  4556.405] (==) Using config file: "/etc/X11/xorg.conf"
[  4556.407] (==) Using config directory: "/etc/X11/xorg.conf.d"
[  4556.408] (==) Using system config directory "/usr/share/X11/xorg.conf.d"
[  4556.408] (==) ServerLayout "layout"
[  4556.408] (**) |-->Screen "nvidia" (0)
[  4556.408] (**) |   |-->Monitor "<default monitor>"
[  4556.409] (**) |   |-->Device "nvidia"
[  4556.409] (==) No monitor specified for screen "nvidia".
	Using a default monitor configuration.
[  4556.409] (**) |-->Inactive Device "intel"
[  4556.409] (==) Automatically adding devices
[  4556.409] (==) Automatically enabling devices
[  4556.409] (==) Automatically adding GPU devices
[  4556.409] (==) FontPath set to:
	/usr/share/fonts/misc/,
	/usr/share/fonts/TTF/,
	/usr/share/fonts/OTF/,
	/usr/share/fonts/Type1/,
	/usr/share/fonts/100dpi/,
	/usr/share/fonts/75dpi/
[  4556.409] (**) ModulePath set to "/usr/lib64/opengl/nvidia,/usr/lib64/xorg/modules,/usr/lib32/xorg/modules"
[  4556.409] (II) The server relies on udev to provide the list of input devices.
	If no devices become available, reconfigure udev or disable AutoAddDevices.
[  4556.409] (II) Loader magic: 0x800c40
[  4556.409] (II) Module ABI versions:
[  4556.409] 	X.Org ANSI C Emulation: 0.4
[  4556.409] 	X.Org Video Driver: 19.0
[  4556.409] 	X.Org XInput driver : 21.0
[  4556.409] 	X.Org Server Extension : 9.0
[  4556.409] (II) xfree86: Adding drm device (/dev/dri/card1)
[  4556.409] (II) xfree86: Adding drm device (/dev/dri/card0)
[  4556.410] (--) PCI:*(0:0:2:0) 8086:0166:1558:6500 rev 9, Mem @ 0xf7400000/4194304, 0xd0000000/268435456, I/O @ 0x0000f000/64
[  4556.410] (--) PCI: (0:1:0:0) 10de:1292:1558:6500 rev 161, Mem @ 0xf6000000/16777216, 0xe0000000/268435456, 0xf0000000/33554432, I/O @ 0x0000e000/128, BIOS @ 0x????????/524288
[  4556.410] (II) LoadModule: "glx"
[  4556.410] (II) Loading /usr/lib64/opengl/nvidia/extensions/libglx.so
[  4556.418] (II) Module glx: vendor="NVIDIA Corporation"
[  4556.418] 	compiled for 4.0.2, module version = 1.0.0
[  4556.418] 	Module class: X.Org Server Extension
[  4556.418] (II) NVIDIA GLX Module  346.35  Sat Jan 10 20:53:39 PST 2015
[  4556.418] (II) LoadModule: "nvidia"
[  4556.418] (II) Loading /usr/lib64/xorg/modules/drivers/nvidia_drv.so
[  4556.419] (II) Module nvidia: vendor="NVIDIA Corporation"
[  4556.419] 	compiled for 4.0.2, module version = 1.0.0
[  4556.419] 	Module class: X.Org Video Driver
[  4556.419] (II) LoadModule: "modesetting"
[  4556.419] (II) Loading /usr/lib64/xorg/modules/drivers/modesetting_drv.so
[  4556.419] (II) Module modesetting: vendor="X.Org Foundation"
[  4556.419] 	compiled for 1.17.1, module version = 1.17.1
[  4556.419] 	Module class: X.Org Video Driver
[  4556.419] 	ABI class: X.Org Video Driver, version 19.0
[  4556.419] (II) NVIDIA dlloader X Driver  346.35  Sat Jan 10 20:32:18 PST 2015
[  4556.419] (II) NVIDIA Unified Driver for all Supported NVIDIA GPUs
[  4556.419] (II) modesetting: Driver for Modesetting Kernel Drivers: kms
[  4556.419] (--) using VT number 7

[  4556.424] (II) Loading sub module "fb"
[  4556.424] (II) LoadModule: "fb"
[  4556.424] (II) Loading /usr/lib64/xorg/modules/libfb.so
[  4556.424] (II) Module fb: vendor="X.Org Foundation"
[  4556.424] 	compiled for 1.17.1, module version = 1.0.0
[  4556.424] 	ABI class: X.Org ANSI C Emulation, version 0.4
[  4556.424] (II) Loading sub module "wfb"
[  4556.424] (II) LoadModule: "wfb"
[  4556.424] (II) Loading /usr/lib64/xorg/modules/libwfb.so
[  4556.424] (II) Module wfb: vendor="X.Org Foundation"
[  4556.424] 	compiled for 1.17.1, module version = 1.0.0
[  4556.424] 	ABI class: X.Org ANSI C Emulation, version 0.4
[  4556.424] (II) Loading sub module "ramdac"
[  4556.424] (II) LoadModule: "ramdac"
[  4556.424] (II) Module "ramdac" already built-in
[  4556.425] (II) modeset(1): using drv /dev/dri/card0
[  4556.425] (II) modeset(G0): using drv /dev/dri/card0
[  4556.425] (EE) Screen 1 deleted because of no matching config section.
[  4556.425] (II) UnloadModule: "modesetting"
[  4556.425] (II) NVIDIA(0): Creating default Display subsection in Screen section
	"nvidia" for depth/fbbpp 24/32
[  4556.425] (==) NVIDIA(0): Depth 24, (==) framebuffer bpp 32
[  4556.425] (==) NVIDIA(0): RGB weight 888
[  4556.425] (==) NVIDIA(0): Default visual is TrueColor
[  4556.425] (==) NVIDIA(0): Using gamma correction (1.0, 1.0, 1.0)
[  4556.425] (**) NVIDIA(0): Option "AllowEmptyInitialConfiguration"
[  4556.425] (**) NVIDIA(0): Enabling 2D acceleration
[  4556.525] (II) NVIDIA(GPU-0): Found DRM driver nvidia-drm (20130102)
[  4556.526] (II) NVIDIA(0): NVIDIA GPU GeForce GT 740M (GK208) at PCI:1:0:0 (GPU-0)
[  4556.526] (--) NVIDIA(0): Memory: 1048576 kBytes
[  4556.526] (--) NVIDIA(0): VideoBIOS: 80.28.22.00.31
[  4556.526] (II) NVIDIA(0): Detected PCI Express Link width: 8X
[  4556.526] (--) NVIDIA(0): Valid display device(s) on GeForce GT 740M at PCI:1:0:0
[  4556.526] (--) NVIDIA(0):     none
[  4556.526] (II) NVIDIA(0): Validated MetaModes:
[  4556.526] (II) NVIDIA(0):     "NULL"
[  4556.526] (II) NVIDIA(0): Virtual screen size determined to be 640 x 480
[  4556.526] (WW) NVIDIA(0): Unable to get display device for DPI computation.
[  4556.526] (==) NVIDIA(0): DPI set to (75, 75); computed from built-in default
[  4556.526] (==) modeset(G0): Depth 24, (==) framebuffer bpp 32
[  4556.526] (==) modeset(G0): RGB weight 888
[  4556.526] (==) modeset(G0): Default visual is TrueColor
[  4556.526] (II) Loading sub module "glamoregl"
[  4556.526] (II) LoadModule: "glamoregl"
[  4556.526] (II) Loading /usr/lib64/xorg/modules/libglamoregl.so
[  4556.528] (II) Module glamoregl: vendor="X.Org Foundation"
[  4556.528] 	compiled for 1.17.1, module version = 1.0.0
[  4556.528] 	ABI class: X.Org ANSI C Emulation, version 0.4
[  4556.528] (II) glamor: OpenGL accelerated X.org driver based.
[  4556.552] (EE) 
[  4556.552] (EE) Backtrace:
[  4556.552] (EE) 0: /usr/bin/X (xorg_backtrace+0x48) [0x57fa38]
[  4556.552] (EE) 1: /usr/bin/X (0x400000+0x183969) [0x583969]
[  4556.552] (EE) 2: /lib64/libc.so.6 (0x7f07e41b5000+0x34ec0) [0x7f07e41e9ec0]
[  4556.552] (EE) 3: /usr/lib64/libX11.so.6 (_XSend+0x1b) [0x7f07dc2add0b]
[  4556.552] (EE) 4: /usr/lib64/libX11.so.6 (_XFlush+0x15) [0x7f07dc2ae1b5]
[  4556.552] (EE) 5: /usr/lib64/libX11.so.6 (_XGetRequest+0x55) [0x7f07dc2b0bc5]
[  4556.552] (EE) 6: /usr/lib64/libX11.so.6 (XQueryExtension+0x3d) [0x7f07dc2a4add]
[  4556.552] (EE) 7: /usr/lib64/libX11.so.6 (XInitExtension+0x22) [0x7f07dc299202]
[  4556.552] (EE) 8: /usr/lib64/libXext.so.6 (XextAddDisplay+0x4f) [0x7f07dc067d3f]
[  4556.552] (EE) 9: /usr/lib64/libnvidia-glsi.so.346.35 (0x7f07dc5a7000+0x63017) [0x7f07dc60a017]
[  4556.552] (EE) 10: /usr/lib64/libnvidia-glsi.so.346.35 (0x7f07dc5a7000+0x4484) [0x7f07dc5ab484]
[  4556.552] (EE) 11: /usr/lib64/opengl/nvidia/lib/libEGL.so.1 (0x7f07dca39000+0x2381e) [0x7f07dca5c81e]
[  4556.552] (EE) 12: /usr/lib64/opengl/nvidia/lib/libEGL.so.1 (0x7f07dca39000+0x2417a) [0x7f07dca5d17a]
[  4556.552] (EE) 13: /usr/lib64/opengl/nvidia/lib/libEGL.so.1 (0x7f07dca39000+0x2c946) [0x7f07dca65946]
[  4556.552] (EE) 14: /usr/lib64/xorg/modules/libglamoregl.so (glamor_egl_init+0x89) [0x7f07de271399]
[  4556.552] (EE) 15: /usr/lib64/xorg/modules/drivers/modesetting_drv.so (0x7f07de8fd000+0x6949) [0x7f07de903949]
[  4556.552] (EE) 16: /usr/bin/X (InitOutput+0xbd1) [0x477771]
[  4556.552] (EE) 17: /usr/bin/X (0x400000+0x3a91b) [0x43a91b]
[  4556.553] (EE) 18: /lib64/libc.so.6 (__libc_start_main+0xf5) [0x7f07e41d6ad5]
[  4556.553] (EE) 19: /usr/bin/X (0x400000+0x2620e) [0x42620e]
[  4556.553] (EE) 
[  4556.553] (EE) Segmentation fault at address 0x0
[  4556.553] (EE) 
Fatal server error:
[  4556.553] (EE) Caught signal 11 (Segmentation fault). Server aborting
[  4556.553] (EE) 
[  4556.553] (EE) 
Please consult the The X.Org Foundation support 
	 at http://wiki.x.org
 for help. 
[  4556.553] (EE) Please also check the log file at "/var/log/Xorg.0.log" for additional information.
[  4556.553] (EE) 
[  4556.561] (EE) Server terminated with error (1). Closing log file.

Am I doing something wrong, or is the driver/Xorg buggy?
nvidia-bug-report.log.gz (57.1 KB)

Some people (including me) have had the same problem on Arch Linux since the upgrade to Xorg 1.17.

https://bbs.archlinux.org/viewtopic.php?id=193724
https://bbs.archlinux.org/viewtopic.php?pid=1503125#p1503125

A topic on the Gentoo forums with a poor workaround:
http://forums.gentoo.org/viewtopic-t-1010746.html


Disable glamor for the modesetting driver: in the section of the config file that configures the modesetting driver, add

Option "AccelMethod" "none"

It works, thanks a lot! But compared with glamor, is it much less performant?

The Nvidia card is used to draw everything, so it makes no difference what modesetting is configured to use. You were using the equivalent of AccelMethod “none” until now anyway, as the modesetting driver didn’t have glamor until xorg-server-1.17.

Unfortunately that workaround is not enough for SDDM/Plasma 5, X still crashes shortly after Plasma starts loading for me.

The problem seems to be this patch:

http://cgit.freedesktop.org/xorg/xserver/commit/?h=server-1.17-branch&id=df1b401f57ad4b4925bad66684445b476562f26f

After reverting it, everything works fine.
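
If you build xorg-server yourself, one way to test the revert (a sketch, assuming a git checkout of the xserver tree; adjust the build and install steps to your distribution):

git clone git://anongit.freedesktop.org/xorg/xserver
cd xserver
git checkout server-1.17-branch
# revert the suspected commit
git revert df1b401f57ad4b4925bad66684445b476562f26f
./autogen.sh --prefix=/usr
make && sudo make install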

It seems the solution is here (at least it works for me).

Same problem here. X11 seems to launch but then gets stuck somehow, as xrandr complains about an invalid magic cookie in ~/.Xauthority.

I’m typing this right now from a Weston session, as even with the Intel drivers I can’t get X11 to work anymore. I’m pretty much at my wits’ end here.

I think I’ll just sit in a corner and wait this one out.

The new NVIDIA 346.47 should fix this problem, as you can read in the changelog.

Where exactly in the changelog do you see that? It’d be fantastic if this was fixed this quickly.

Hi,

Version 346.47 does not fix the problem for me (I’m using Arch Linux):

nvidia-settings:  version 346.47  (buildmeister@swio-display-x86-rhel47-01) 
Thu Feb 19 19:18:25 PST 2015

Xorg.log:

[    98.162] (II) Loading sub module "glamoregl"
[    98.162] (II) LoadModule: "glamoregl"
[    98.162] (II) Loading /usr/lib/xorg/modules/libglamoregl.so
[    98.167] (II) Module glamoregl: vendor="X.Org Foundation"
[    98.167]    compiled for 1.17.1, module version = 1.0.0
[    98.167]    ABI class: X.Org ANSI C Emulation, version 0.4
[    98.167] (II) glamor: OpenGL accelerated X.org driver based.
[    98.199] (EE) 
[    98.199] (EE) Backtrace:
[    98.199] (EE) 0: /usr/lib/xorg-server/Xorg (OsLookupColor+0x119) [0x5949c9]
[    98.200] (EE) 1: /usr/lib/libc.so.6 (__restore_rt+0x0) [0x7f43f250053f]
[    98.200] (EE) 2: /usr/lib/libX11.so.6 (_XSend+0x2b) [0x7f43e84770bb]
[    98.200] (EE) 3: /usr/lib/libX11.so.6 (_XFlush+0x15) [0x7f43e8477575]
[    98.200] (EE) 4: /usr/lib/libX11.so.6 (_XGetRequest+0x65) [0x7f43e847a055]
[    98.201] (EE) 5: /usr/lib/libX11.so.6 (XQueryExtension+0x4d) [0x7f43e846d5ed]
[    98.201] (EE) 6: /usr/lib/libX11.so.6 (XInitExtension+0x32) [0x7f43e8461392]
[    98.201] (EE) 7: /usr/lib/libXext.so.6 (XextAddDisplay+0x4f) [0x7f43e823038f]
[    98.201] (EE) 8: /usr/lib/libnvidia-glsi.so.346.47 (_nv016glsi+0x5f3f7) [0x7f43e88393d7]
[    98.201] (EE) 9: /usr/lib/libnvidia-glsi.so.346.47 (_nv016glsi+0x804) [0x7f43e877bbf4]
[    98.201] (EE) unw_get_proc_name failed: no unwind info found [-10]
[    98.201] (EE) 10: /usr/lib/libEGL.so.1 (?+0x804) [0x7f43e8a24ef4]
[    98.201] (EE) unw_get_proc_name failed: no unwind info found [-10]
[    98.201] (EE) 11: /usr/lib/libEGL.so.1 (?+0x804) [0x7f43e8a25924]
[    98.201] (EE) 12: /usr/lib/libEGL.so.1 (NvEglRegClientApi+0x4f36) [0x7f43e8a32866]
[    98.202] (EE) 13: /usr/lib/xorg/modules/libglamoregl.so (glamor_egl_init+0x99) [0x7f43eaf3d539]
[    98.202] (EE) 14: /usr/lib/xorg/modules/drivers/modesetting_drv.so (_init+0x29ce) [0x7f43eb5d4e9e]
[    98.202] (EE) 15: /usr/lib/xorg-server/Xorg (InitOutput+0xbcc) [0x47b63c]
[    98.202] (EE) 16: /usr/lib/xorg-server/Xorg (remove_fs_handlers+0x22a) [0x43c9da]
[    98.203] (EE) 17: /usr/lib/libc.so.6 (__libc_start_main+0xf0) [0x7f43f24ed800]
[    98.203] (EE) 18: /usr/lib/xorg-server/Xorg (_start+0x29) [0x427039]
[    98.203] (EE) 19: ? (?+0x29) [0x29]
[    98.203] (EE) 
[    98.203] (EE) Segmentation fault at address 0x0
[    98.203] (EE) 
Fatal server error:
[    98.203] (EE) Caught signal 11 (Segmentation fault). Server aborting

I still have to use the workaround :-(

346.47 does fix the issue for me as long as I specify

Option "AccelMethod" "none"

in the modesetting device section.

Tracking this issue internally in bug 1626589. Please share dmidecode output and affected system models.
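
For example, to collect that information (run as root; the output file name is just a suggestion):

dmidecode > dmidecode.txt
dmidecode -s system-manufacturer
dmidecode -s system-product-name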

The problem described in this thread seems to be distinct from the one fixed in driver 346.47. The stack trace from the crashing X server, showing frames in libEGL.so.1, is somewhat suspicious: the NVIDIA libEGL library is meant to be used client-side, not server-side. Normally, NVIDIA libEGL shouldn’t be loaded into the X server at all, but it seems that libglamoregl is doing so. That’s why the workaround of disabling glamor helps people avoid the crash.

The NVIDIA driver installer will remove the libglamor.so Xorg server extension at installation time, due to similar conflicts with libglamor.so loading the client-side libGL.so.1 into the server. If moving libglamoregl.so out of the /usr/lib/xorg directory also works around this crash, it should be pretty easy to add “libglamoregl.so” to the conflicting libraries list in the installer, though of course this does nothing to address the larger problem of loading NVIDIA client-side GL libraries into the X server, if it happens via some other mechanism in the future. (e.g. inclusion of glamor or similar functionality into the core server.)
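
A quick way to test that (a sketch, run as root; the module path below is the one from the Gentoo log above, while many distributions use /usr/lib/xorg/modules instead):

# move glamor out of the X server's module path, then restart X
mv /usr/lib64/xorg/modules/libglamoregl.so /usr/lib64/xorg/modules/libglamoregl.so.bak
# to undo:
mv /usr/lib64/xorg/modules/libglamoregl.so.bak /usr/lib64/xorg/modules/libglamoregl.so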

Hey,

just for the sake of documentation: with my current NVIDIA driver 349.16 (still on Arch Linux), the workaround

Option "AccelMethod" "none"

is no longer necessary :-)