Second X server turns video signal off on other card

Trying to run one X server on each video card. Is this even possible with the Nvidia 304.108 driver?

Starting the second server causes the first monitor to lose signal.
Kill the second server and the first display reactivates.
No crashes, nothing special in the logs. I can send glxgears to both displays, though only the last-started instance has an active video signal.
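For reference, "sending glxgears to both displays" means one client per server, selected by DISPLAY (assuming the two servers run as :0 and :1):

# one OpenGL client per X server, selected by DISPLAY
DISPLAY=:0 glxgears &
DISPLAY=:1 glxgears &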

I have gone back and forth with so many xorg.conf versions, trying every option that might have an effect on this behavior.

Tried all kinds of combinations of these options (see the sketch after this list):
Option "AutoAddGPU" "false"
Option "ProbeAllGpus" "false"
Option "Xinerama" "off"
Option "IsolateDevice" "PCI:2:0:0" # prevents X from probing the additional PCI address
Option "UseEDID" "false"
Option "ConnectedMonitor" "CRT-0,CRT-1"
Option "UseDisplayDevice" "CRT-1"
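A minimal sketch of where I put these (placement based on my reading of the xorg.conf and nvidia driver docs, so treat it as a sketch rather than a verified config; identifiers are placeholders, and PCI:2:0:0 is the second card in my box):

Section "ServerFlags"
    Option "AutoAddGPU"    "false"
    Option "Xinerama"      "off"
    Option "IsolateDevice" "PCI:2:0:0"
EndSection

Section "Device"
    Identifier "Device1"
    Driver     "nvidia"
    BusID      "PCI:2:0:0"
    Option "ProbeAllGpus"     "false"
    Option "UseEDID"          "false"
    Option "ConnectedMonitor" "CRT-0,CRT-1"
    Option "UseDisplayDevice" "CRT-1"
EndSection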

Tried generating xorg.conf with nvidia-settings, and with nvidia-xconfig using various combinations of these options (example command after this list):
--no-probe-all-gpus
--separate-x-screens
--no-use-edid
--no-use-edid-freqs
--no-use-edid-dpi
--no-xinerama
--enable-all-gpus
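A representative invocation (a sketch only; the exact flag combinations varied from attempt to attempt, and the BusID here is the second card's):

sudo nvidia-xconfig --busid=PCI:2:0:0 --separate-x-screens --no-probe-all-gpus --no-use-edid --no-xinerama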

Even tried starting the server with additional options passed directly:
xinit -- :1 -nolisten tcp -layout layout1 -novtswitch -isolateDevice PCI:1:0:0

Using the xorg.conf below, I would:

xinit -- :0 &
edit the file to change the PCI address
xinit -- :1 &
The first monitor loses its video signal, but that X server is still running.
The second monitor comes up with an xterm.
Kill X :1 and the first monitor comes back.
The order does not matter: the second instance always disables the video signal on the first. (A variant without hand-editing the config is sketched below.)
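The same workflow can be sketched without the hand-editing by keeping one config per card and pointing each server at its own copy with -config (the file names below are hypothetical; relative -config names are looked up in the server's config search path, e.g. /etc/X11):

# /etc/X11/xorg.conf.gpu0 -> BusID "PCI:1:0:0"
# /etc/X11/xorg.conf.gpu1 -> BusID "PCI:2:0:0"
xinit -- :0 -config xorg.conf.gpu0 &
xinit -- :1 -config xorg.conf.gpu1 &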

Seems to me this should not be this complicated! :-/
In my mind X Server → Layout → Screen → Monitor → Card
1 server for each card

For the Xorg folks: yes, it seems wonky to have two X servers with the same input devices, but I have also tried
Option "AllowEmptyInput" "On"
without input sections and with "void" drivers (a sketch of what I mean follows).
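A sketch of the "void" variant (identifiers are placeholders; on RHEL the driver comes from the xorg-x11-drv-void package, as far as I know):

Section "ServerFlags"
    Option "AllowEmptyInput" "On"
EndSection

Section "InputDevice"
    Identifier "DummyKeyboard"
    Driver     "void"
EndSection

Section "InputDevice"
    Identifier "DummyPointer"
    Driver     "void"
EndSection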
I am not trying to get multi-seat; I am trying to create three independent OpenGL pipelines. I will be creating another topic for the GL pipeline questions.

Configuration:
SuperMicro C7P67 i7 2600
01:00.0 VGA compatible controller: nVidia Corporation [GeForce GT 440] (rev a1) IRQ 16
02:00.0 VGA compatible controller: nVidia Corporation [GeForce GT 440] (rev a1) IRQ 17
RHEL 6.2++ kernel: Linux version 2.6.32-358.el6.x86_64 (mockbuild@x86-022.build.eng.bos.redhat.com) (gcc version 4.4.7 20120313 (Red Hat 4.4.7-3)…
kernel options :ro root=/dev/sda2 nomodeset rd_NO_LUKS KEYBOARDTYPE=pc KEYTABLE=us LANG=en_US.UTF-8 nodmraid rd_NO_MD SYSFONT=latarcyrheb-sun16 crashkernel=auto rdblacklist=nouveau nouveau.modeset=0 rdblacklist=r8169 i8042.loop pci=noacpi nohpet vga=normal selinux=0 rd_NO_LVM rd_NO_DM
x2apic not enabled
irqbalance off
X.Org X Server 1.10.4

filename: /lib/modules/2.6.32-358.el6.x86_64/kernel/drivers/video/nvidia.ko
alias: char-major-195-*
version: 304.108
supported: external
license: NVIDIA
alias: pci:v000010DEd00000E00sv*sd*bc04sc80i00*
alias: pci:v000010DEd00000AA3sv*sd*bc0Bsc40i00*
alias: pci:v000010DEd*sv*sd*bc03sc02i00*
alias: pci:v000010DEd*sv*sd*bc03sc00i00*
depends: i2c-core
vermagic: 2.6.32-358.el6.x86_64 SMP mod_unload modversions
parm: NVreg_EnableVia4x:int
parm: NVreg_EnableALiAGP:int
parm: NVreg_ReqAGPRate:int
parm: NVreg_EnableAGPSBA:int
parm: NVreg_EnableAGPFW:int
parm: NVreg_Mobile:int
parm: NVreg_ResmanDebugLevel:int
parm: NVreg_RmLogonRC:int
parm: NVreg_ModifyDeviceFiles:int
parm: NVreg_DeviceFileUID:int
parm: NVreg_DeviceFileGID:int
parm: NVreg_DeviceFileMode:int
parm: NVreg_RemapLimit:int
parm: NVreg_UpdateMemoryTypes:int
parm: NVreg_InitializeSystemMemoryAllocations:int
parm: NVreg_UseVBios:int
parm: NVreg_RMEdgeIntrCheck:int
parm: NVreg_UsePageAttributeTable:int
parm: NVreg_EnableMSI:int
parm: NVreg_MapRegistersEarly:int
parm: NVreg_RegisterForACPIEvents:int
parm: NVreg_RegistryDwords:charp
parm: NVreg_RmMsg:charp
parm: NVreg_NvAGP:int

Simple xorg.conf

Section "ServerLayout"
Identifier "Layout0"
Screen 0 "Screen0"
InputDevice "Keyboard0" "CoreKeyboard"
InputDevice "Mouse0" "CorePointer"
EndSection

Section "Files"
FontPath "/usr/share/fonts/default/Type1"
EndSection

Section "InputDevice"
# generated from default
Identifier "Mouse0"
Driver "mouse"
Option "Protocol" "auto"
Option "Device" "/dev/input/mice"
Option "Emulate3Buttons" "no"
Option "ZAxisMapping" "4 5"
EndSection

Section "InputDevice"
# generated from data in "/etc/sysconfig/keyboard"
Identifier "Keyboard0"
Driver "kbd"
Option "XkbLayout" "us"
Option "XkbModel" "pc105"
EndSection

Section "Monitor"
Identifier "Monitor0"
VendorName "Unknown"
ModelName "Unknown"
HorizSync 28.0 - 33.0
VertRefresh 43.0 - 72.0
Option "DPMS"
EndSection

Section "Device"
Identifier "Device0"
Driver "nvidia"
VendorName "NVIDIA Corporation"
BusID "PCI:1:0:0"
# BusID "PCI:2:0:0"   (swapped in when starting the second server)
EndSection

Section "Screen"
Identifier "Screen0"
Device "Device0"
Monitor "Monitor0"
DefaultDepth 24
SubSection "Display"
Depth 24
EndSubSection
EndSection

Sorry to resurrect an old thread, but I had the same issue and finally solved it after six months of off-and-on experiments. I want to make sure anyone else trying to do the same thing can find a solution.

The issue was that the X servers were running on different VTs (virtual terminals), only one of which can be active at a time. Here is the solution that worked for me. Run it once for each video card:

busid= # fill in busid for each card, e.g. "PCI:2:0:0"
sudo nvidia-xconfig --busid="$busid" --no-probe-all-gpus

# Change the display number (:1) for each new X server, but keep the VT number (vt1) the same
sudo Xorg -noreset +extension GLX +extension RANDR +extension RENDER \
  -sharevts -novtswitch -isolateDevice "$busid" \
  -config /etc/X11/XF86Config :1 vt1 &
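To make that concrete for a two-card setup like the one above (PCI:1:0:0 and PCI:2:0:0), the two runs would look roughly like this. This is a sketch: the per-card config file names are hypothetical, written with --output-xconfig so the second nvidia-xconfig run does not overwrite the first card's config.

sudo nvidia-xconfig --busid=PCI:1:0:0 --no-probe-all-gpus -o /etc/X11/xorg.conf.gpu0
sudo nvidia-xconfig --busid=PCI:2:0:0 --no-probe-all-gpus -o /etc/X11/xorg.conf.gpu1

# Both servers share vt1 (-sharevts -novtswitch), so neither blanks the other:
sudo Xorg -noreset +extension GLX +extension RANDR +extension RENDER \
  -sharevts -novtswitch -isolateDevice PCI:1:0:0 \
  -config /etc/X11/xorg.conf.gpu0 :0 vt1 &
sudo Xorg -noreset +extension GLX +extension RANDR +extension RENDER \
  -sharevts -novtswitch -isolateDevice PCI:2:0:0 \
  -config /etc/X11/xorg.conf.gpu1 :1 vt1 &

# Verify with one OpenGL client per server:
DISPLAY=:0 glxgears &
DISPLAY=:1 glxgears &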