This is being posted on the TX2 board, but should apply equally to everything from the TK1 through the Xavier. I am hoping to clarify some video display topics which are a recurring theme on the forums. This is in part a question for NVIDIA, and in part a restatement of what is already known for other users on the topic (in other words, this is purposely longer than it needs to be for simply asking a question). This particular post is about “nvidia-auto-select”, modelines, and how to pick modes via “/etc/X11/xorg.conf”. There seems to be a bug forcing “nvidia-auto-select” to ignore otherwise valid EDID mode selections: an EDID mode may be listed in the same log as simultaneously rejected and valid.
The equivalent X11 general driver setup documentation (specific to “/etc/X11/xorg.conf”) for desktop PCs is a good starting point. Even so, that documentation is not entirely valid for Jetsons, whose integrated GPUs are attached directly to the memory controller. An example desktop PC document can be found here:
https://download.nvidia.com/XFree86/Linux-x86_64/396.54/README/
It would be good to document the equivalent of the above document’s “Appendix B. X Config Options” for Jetsons and integrated GPUs (“iGPU”). At a minimum, that single appendix regarding xorg.conf should be considered required reading for anyone needing fine configuration details, yet no such document exists for embedded GPUs. Currently it is difficult to know how to make even simple configuration changes, e.g., selecting a different default video mode. Much of the existing desktop PC document could simply be marked as not applicable after stating that only EDID modes will work and that no interlaced modes will work.
Modelines
Historically, video drivers have been given a “modeline” (or many modelines) to describe a mode (or list of modes) which a monitor can use. Before the DDC wire was added to connectors newer than the 15-pin D-sub VGA connector, a manually entered modeline was the only way to tell a video driver what kind of setup would work. The extra DDC wire on HDMI, DisplayPort, and most DVI (which uses the i2c protocol and provides EDID data) is what made monitor configuration automated. EDID contains all of the information needed to automatically create a modeline instead of requiring manual entry of the information (e.g., a monitor’s driver disk contains modelines, and this is what the ancient monitor driver floppy installs…manually editing xorg.conf with vi and some knowledge does the same thing). In the past, before DDC/EDID, one could even install a database of known monitor timings and the end user could pick from the list. EDID ended that, but even though modelines no longer require manual intervention, they are still relevant. I am limiting this question to “how to select video modes or modelines” in xorg.conf when the driver agrees the EDID mode is valid. No interlaced mode will be considered, since the driver rejects all interlaced modes. Non-EDID modes are not considered, since only modes found from an EDID query are considered valid. Manual selection of a valid EDID mode other than what “nvidia-auto-select” would pick is the main goal of this question.
To start with, if a monitor is able to communicate its EDID, then it is available as hex data via this:
sudo -s
cat `find /sys -name edid`
exit
…if this data cannot be found, then it means your monitor (or cables or connector adapters) cannot work with EDID. The iGPU normally requires EDID. When you find this data you should save a copy somewhere for reference.
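For saving that copy, a minimal sketch using the same find trick (the destination file name is just an example):
sudo -s
cat `find /sys -name edid` > /home/ubuntu/edid-backup.txt
exit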
If you copy the EDID hex data and paste it into http://www.edidreader.com, then you can see everything the monitor has told your video card. At a minimum you should verify that http://www.edidreader.com says the checksum is valid. There is a predefined list of available modes for a given GPU/driver, and after getting the monitor’s EDID the driver must decide which of the modes claimed by the monitor are within the predefined list. Something which would be of extreme value as new documentation is a list of the predefined modes accepted by each driver release on the various Jetson platforms. Without a predefined list it is possible to query an existing monitor as to what is valid, but it isn’t possible to guess from specifications whether or not a future monitor purchase will work. A method to tell the driver to dump information on all known modes, regardless of the current monitor EDID, would also suffice (the driver would then be “self documenting”).
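If a local check is preferred over the web page, the “edid-decode” utility (assuming it is installed, e.g., via the Ubuntu package of the same name) should be able to parse the same hex dump, and it reports whether the checksum is valid:
sudo -s
cat `find /sys -name edid` | edid-decode
exit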
To specify video modes on any Jetson through “/etc/X11/xorg.conf” (a persistent, server-wide configuration), the theory is that we need to add a modeline which is an exact match to a mode reported by EDID (if the driver says the mode is acceptable, then we should be able to select that mode via a modeline). To do this we start by asking the NVIDIA driver to tell us what modes it sees, and then build our modeline based on this. Adding ‘Option “ModeDebug”’ to the Device section of xorg.conf asks the driver to log what it sees relative to the attached monitor’s EDID. An example xorg.conf edit, with “ModeDebug” added to what is already there by default:
Section "Device"
Identifier "Tegra0"
Driver "nvidia"
Option "AllowEmptyInitialConfiguration" "true"
<u><b>Option "ModeDebug"</b></u>
EndSection
The X11 log is file “/var/log/Xorg.0.log”, where the “0” comes from the environment variable “DISPLAY”. The default is normally “export DISPLAY=:0” for the first local display, but there are exceptions, e.g., Xavier seems to use “:1” (there, “:0” is a different buffer being accessed by software other than the current desktop…perhaps CUDA or a virtual desktop). One can have as many displays as desired so long as each has a unique context via “DISPLAY”; not all "DISPLAY"s actually go to a monitor, but all of them do go to a GPU or framebuffer.
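As a quick way to see which “DISPLAY” values actually exist, note that each running X server owns a socket under “/tmp/.X11-unix” (“X0” for “:0”, “X1” for “:1”, and so on):
ls /tmp/.X11-unix
# example output: X0  X1   ...both “:0” and “:1” are running here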
With option “ModeDebug” in the “Device” section (a “Device” is a GPU driver coupled to a GPU), the driver will explicitly log full information about every mode resulting from the EDID query over the HDMI cable. With no monitor (or no DDC wire) there will be no EDID information; each time a monitor is connected and detected there should be EDID data logged for that monitor, and Xorg.0.log should reflect this. HDMI is hot-plug, and upon a hot-plug connect event EDID is processed. Upon a hot-plug disconnect the previous EDID mode (which might have been a default mode if no monitor was ever connected) will most likely be preserved. A VGA monitor without EDID could in theory use the mode which was previously present due to preservation of a prior mode.
The reason why having EDID data in Xorg.0.log is so important is that EDID data all by itself, without the context of a known driver, does not tell us what the driver thinks of the various modes. It is possible for a monitor to support modes beyond the range of the GPU, or vice versa. The “ModeDebug” log will not only tell you what modes were reported, it will also tell you what the driver thinks of those modes, along with the technical parameters needed to manually construct a modeline the way the driver would construct it.
If you are working on your Jetson’s video configuration you should add ‘Option “ModeDebug”’ now to the “Device” section of “/etc/X11/xorg.conf” (a.k.a., driver options for a specific GPU), reboot, and save a copy of the resulting log. Between this log and the “/sys” “edid” file hex data we now know just about everything about the monitor and its relationship to the NVIDIA driver.
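To quickly confirm the reboot produced a log with EDID content (a simple grep; the exact wording of the driver’s EDID lines can vary between releases):
grep -i edid /var/log/Xorg.0.log | less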
To see a general table of modes your Jetson’s video will accept from the known EDID modes of your monitor, try this with your “ModeDebug” log (this is the “mode pool” summary…the final list of possibilities for a given monitor using this particular GPU and driver combination):
gawk '/Modes in ModePool for/,/End of ModePool for/' /var/log/Xorg.0.log
(see footnote [1])
Hint: The statement in the mode pool of “from: NVIDIA Predefined” is subtle, but extremely important. On a desktop PC other modes beyond the predefined modes may be achievable, but with the current generation of Jetsons there are no other achievable modes with any monitor. The mode pool is the logical intersection of what the monitor claims it can do and what the GPU/driver allows. In a multi-monitor setup there will be one mode pool for each monitor.
To see detailed descriptions of all modes reported in EDID, regardless of whether the mode is accepted or rejected, try this:
gawk '/Validating Mode/,/Mode \".+\" is (valid|invalid)/' /var/log/Xorg.0.log | less -p 'rejected|invalid|valid'
…notice that the end of each mode’s block states whether the mode is valid or invalid.
(see footnote [2])
An example valid EDID mode from a real world monitor is:
[ 9.242] (II) NVIDIA(GPU-0): Validating Mode "1280x960_60":
[ 9.242] (II) NVIDIA(GPU-0): Mode Source: NVIDIA Predefined
[ 9.242] (II) NVIDIA(GPU-0): 1280 x 960 @ 60 Hz
[ 9.242] (II) NVIDIA(GPU-0): Pixel Clock : 108.00 MHz
[ 9.242] (II) NVIDIA(GPU-0): HRes, HSyncStart : 1280, 1376
[ 9.242] (II) NVIDIA(GPU-0): HSyncEnd, HTotal : 1488, 1800
[ 9.242] (II) NVIDIA(GPU-0): VRes, VSyncStart : 960, 962
[ 9.242] (II) NVIDIA(GPU-0): VSyncEnd, VTotal : 965, 1000
[ 9.243] (II) NVIDIA(GPU-0): H/V Polarity : +/+
[ 9.243] (II) NVIDIA(GPU-0): Mode "1280x960_60" is valid.
The components of a modeline are well described by this Wikipedia article. See:
https://en.wikipedia.org/wiki/XFree86_Modeline#Syntax
Here is an excerpt of the Wikipedia description:
...
Modeline syntax: pclk hdisp hsyncstart hsyncend htotal vdisp vsyncstart vsyncend vtotal [flags]
Flags (optional): +HSync, -HSync, +VSync, -VSync, Interlace, DoubleScan, CSync, +CSync, -CSync
Modeline "1600x1200" 155 1600 1656 1776 2048 1200 1202 1205 1263
# (Label) (clk) (x-resolution) (y-resolution)
# |
# (pixel clock in MHz)
In the example, aside from the label for the mode, the modeline can be created by reading the verbose “ModeDebug” log information in the order it appears (the “xorg.conf” token “Modeline” goes in ‘Section “Monitor”’). This is a modeline representing the real world example above from the Xorg logs (‘Mode “1280x960_60” is valid’):
ModeLine "1280x960_60" 108.00 1280 1376 1488 1800 960 962 965 1000 -hsync +vsync
…this example modeline, when put in the Monitor section of xorg.conf, is intended to be exactly equivalent to the EDID mode, and so should be usable from xorg.conf. Note that the “ModeDebug” output labels each timing parameter, so you know which modeline field each value corresponds to.
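As a sanity check when hand-copying these numbers, the refresh rate implied by any modeline is the pixel clock divided by the total pixels per frame, i.e., “pclk / (htotal * vtotal)”. For the modeline above:
# 108000000 / (1800 * 1000) == 60 Hz
echo 'scale=2; 108000000 / (1800 * 1000)' | bc
# 60.00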
When the NVIDIA video driver picks a mode via “nvidia-auto-select”, this same kind of modeline is created in RAM and used to configure the display. In order to use a mode via a modeline, the modeline must exist and exactly match what EDID provides and what the driver accepts. An automatically determined modeline should be indistinguishable from a manually created modeline for any given mode taken from “ModeDebug”. Modelines not matching a mode in “ModeDebug” should be summarily rejected. It seems to be a bug that correctly matching modelines are rejected if they are not the mode “nvidia-auto-select” would pick.
Someone may wonder why there are more parameters than those which specifically set a mode, e.g., the timings related to sync start and end. Not all monitors start the displayable content at the exact same time as the sync (especially analog monitors). There may be a need for translating/panning an image left/right or up/down, and there may be bounding box pixels in a frame which are not actually displayable (clipping). A mode is fairly distinct, but the timings used to adjust an individual monitor to clip and center correctly will differ among monitors; two monitors running the same mode may differ in the details of their modelines.
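A related caution: the common “cvt” utility will happily compute a modeline for any resolution, but CVT-computed timings generally will not exactly match the EDID/predefined timings, and since an exact match is required, cvt output is of little use here:
cvt 1280 960 60
# ...prints a CVT modeline whose pixel clock and sync timings will usually
# differ from the 108.00 MHz EDID-derived values shown earlier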
Problems…
Ok, so setting aside the syntax question of how to actually set up xorg.conf to pick an EDID mode other than what “nvidia-auto-select” picks, I’ll state ahead of time that it doesn’t seem possible to disable “nvidia-auto-select”. From what I can tell we need a way to disable “nvidia-auto-select” whenever a valid modeline is used to select which EDID mode a given monitor will use. It seems there is a bug (or some required alternate xorg.conf syntax) whereby valid EDID modes set via a modeline are rejected and overridden.
A sample configuration using the previously listed monitor follows. Here are the steps to try to enable “1280x960_60” (the instructions should allow reproduction of the issue for any monitor providing EDID). The mode this particular example monitor normally boots to (when not being forced into another mode) is “1680x1050”. The EDID of this monitor:
# cat `find /sys -name edid`
00 ff ff ff ff ff ff 00 5a 63 1e 59 01 01 01 01
1c 11 01 03 80 2f 1e 78 2e d0 05 a3 55 49 9a 27
13 50 54 bf ef 80 b3 00 a9 40 95 00 90 40 81 80
81 40 71 4f 31 0a 21 39 90 30 62 1a 27 40 68 b0
36 00 da 28 11 00 00 1c 00 00 00 ff 00 51 41 35
30 37 32 38 35 32 39 30 34 0a 00 00 00 fd 00 32
4b 1e 52 0f 00 0a 20 20 20 20 20 20 00 00 00 fc
00 56 58 32 32 33 35 77 6d 0a 20 20 20 20 00 ea
Pasting this into http://www.edidreader.com shows all modes are primary modes and this older monitor has no extension modes (the driver will refuse extensions…whether that is all extensions or just some of them I do not know). Under “standard display modes”, “1280x960_60” is not interlaced, so this mode should work. Note that only the horizontal size of “1280” is listed, and the aspect ratio must be used to find the vertical size: for 1280 at 4:3 aspect, the vertical dimension is “1280 * (1/aspect) == 1280 * (3/4) == 960”. The mode pool verifies that 1280x960_60 is a predefined mode, and the “ModeDebug” log shows the mode as “valid”.
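As a cross-check once a GUI login is available, “xrandr” should list the same set of modes as the mode pool (output names such as “HDMI-0” vary by board and driver release; this is only an illustration):
DISPLAY=:0 xrandr --query
# look for a “1280x960” entry under the connected output, e.g., “HDMI-0”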
However, there is a “catch” here…the mode is logged multiple times, with contradictory results, when this xorg.conf ModeLine is used (the same as previously stated; this is an “NVIDIA Predefined” mode):
ModeLine "1280x960_60" 108.00 1280 1376 1488 1800 960 962 965 1000 -hsync +vsync
This is particularly important because the exact same mode is both rejected and validated. Different parts of the driver are in disagreement as to whether “1280x960_60” is valid. Here is an excerpt from a single log taken prior to any GUI login (the login manager is present, but the window manager has not started) where the mode is shown as both valid and invalid:
[ 9.177] (II) NVIDIA(GPU-0): Validating Mode "1280x960_60":
[ 9.177] (II) NVIDIA(GPU-0): Mode Source: NVIDIA Predefined
[ 9.177] (II) NVIDIA(GPU-0): 1280 x 960 @ 60 Hz
[ 9.177] (II) NVIDIA(GPU-0): Pixel Clock : 108.00 MHz
[ 9.177] (II) NVIDIA(GPU-0): HRes, HSyncStart : 1280, 1376
[ 9.178] (II) NVIDIA(GPU-0): HSyncEnd, HTotal : 1488, 1800
[ 9.178] (II) NVIDIA(GPU-0): VRes, VSyncStart : 960, 962
[ 9.178] (II) NVIDIA(GPU-0): VSyncEnd, VTotal : 965, 1000
[ 9.178] (II) NVIDIA(GPU-0): H/V Polarity : +/+
[ 9.178] (II) NVIDIA(GPU-0): Mode "1280x960_60" is valid.
...
[ 9.178] (WW) NVIDIA(GPU-0): Validating Mode "1280x960_60":
[ 9.178] (WW) NVIDIA(GPU-0): Mode Source: X Configuration file ModeLine
[ 9.178] (WW) NVIDIA(GPU-0): 1280 x 960 @ 60 Hz
[ 9.178] (WW) NVIDIA(GPU-0): Pixel Clock : 108.00 MHz
[ 9.178] (WW) NVIDIA(GPU-0): HRes, HSyncStart : 1280, 1376
[ 9.178] (WW) NVIDIA(GPU-0): HSyncEnd, HTotal : 1488, 1800
[ 9.178] (WW) NVIDIA(GPU-0): VRes, VSyncStart : 960, 962
[ 9.178] (WW) NVIDIA(GPU-0): VSyncEnd, VTotal : 965, 1000
[ 9.178] (WW) NVIDIA(GPU-0): H/V Polarity : -/+
[ 9.178] (WW) NVIDIA(GPU-0): Mode is rejected: Only modes from the NVIDIA X driver's
[ 9.178] (WW) NVIDIA(GPU-0): predefined list and modes from the EDID are allowed
[ 9.178] (WW) NVIDIA(GPU-0): Mode "1280x960_60" is invalid.
...
I do not know why the mode is listed twice with disagreement between “valid” and “invalid”. Perhaps it is just because of the syntax used in xorg.conf (see footnote [3] for the xorg.conf used).
Remember that each time a monitor connect event is seen, EDID will be processed. Starting a new instance of an X11 server could also be considered a connect event. Unfortunately, I do not know why the mode was examined twice prior to any other connect event.
Once a GUI login has occurred some information can be gathered via “xdpyinfo”. Apparently the 1280x960_60 mode is allowed as a virtual desktop, but the actual desktop is forced to the “nvidia-auto-select” size of “1680x1050”. I don’t really believe the virtual desktop is actually in use, because no panning is required to see the whole desktop, nor is there any clipping…the virtual and physical desktops appear to both be “1680x1050” regardless of what the driver thinks (see footnote [4] for an alias to quickly view that information). Example:
DISPLAY=:0 xdpyinfo | egrep "dimensions"
# dimensions: 1680x1050 pixels (445x278 millimeters)
egrep "Virtual screen size" /var/log/Xorg.0.log'
# [ 9.198] (II) NVIDIA(0): Virtual screen size determined to be 1280 x 960
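As a further experiment (non-persistent, and assuming the output name reported by “xrandr --query” is “HDMI-0”…substitute your own), a runtime mode switch can be attempted to see whether the driver will accept the mode at all outside of xorg.conf:
DISPLAY=:0 xrandr --output HDMI-0 --mode 1280x960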
What is the proper method of manually configuring xorg.conf for a mode which is valid in EDID? I am thinking perhaps the “MetaModes” token is being ignored, but I do not know of another way to mark a mode for use.
Footnotes:
[1][2][4]: For convenience you might want to add these bash functions to “~/.bash_aliases” (or “~/.bashrc”):
# Footnote [1]
# Displays the mode pool from the default Xorg.0.log log file. If an argument is
# named, then this instead gives the mode pool of the named log.
function pool () {
    local logname="/var/log/Xorg.0.log";
    if [[ $# -gt 0 ]]; then
        logname="$1";
    fi
    gawk '/Modes in ModePool for/,/End of ModePool for/' "${logname}";
}
# Footnote [2]
# Displays and highlights "ModeDebug" accept/reject comments from the default
# Xorg.0.log log file. If an argument is named, then this instead gives the
# "ModeDebug" accept/reject commands of the named log.
function modes () {
    local logname="/var/log/Xorg.0.log";
    if [[ $# -gt 0 ]]; then
        logname="$1";
    fi
    gawk '/Validating Mode/,/Mode \".+\" is (valid|invalid)/' "${logname}" | less -p 'rejected|invalid|valid';
}
# Footnote [4]
# Displays a logged in session's idea of virtual screen size and actual screen
# size.
alias dim='DISPLAY=:0 xdpyinfo | egrep "dimensions"; egrep "Virtual screen size" /var/log/Xorg.0.log'
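Example usage after the functions and alias are loaded (e.g., via “source ~/.bash_aliases” or a new shell):
pool                        # mode pool from /var/log/Xorg.0.log
modes /var/log/Xorg.1.log   # accept/reject details from an alternate log
dim                         # virtual versus actual screen size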
[3]: The example’s xorg.conf:
Section "Module"
Disable "dri"
SubSection "extmod"
Option "omit xfree86-dga"
EndSubSection
EndSection
Section "Device"
Identifier "Tegra0"
Driver "nvidia"
Option "AllowEmptyInitialConfiguration" "true"
Option "ModeDebug"
EndSection
Section "Monitor"
Identifier "Monitor0"
VendorName "Viewsonic"
ModelName "VX2235wm"
<b>ModeLine "1280x960_60" 108.00 1280 1376 1488 1800 960 962 965 1000 -hsync +vsync</b>
HorizSync 31.0 - 76.0
VertRefresh 56.0 - 76.0
Option "DPMS"
EndSection
Section "Screen"
Identifier "Screen0"
Device "Tegra0"
Monitor "Monitor0"
DefaultDepth 24
<b>Option "MetaModes" "1280x960_60 +0+0"</b>
SubSection "Display"
Depth 24
EndSubSection
EndSection
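For completeness, two other standard xorg.conf mechanisms may be worth testing against this issue. This is only a sketch per the generic xorg.conf(5) manual page…I have not verified that the Jetson driver honors either option, and the mode names must match names appearing in the mode pool:

Section "Monitor"
    Identifier "Monitor0"
    # Ask for an EDID/predefined mode by name instead of a hand-built ModeLine:
    Option "PreferredMode" "1280x960_60"
EndSection

Section "Screen"
    Identifier "Screen0"
    Device     "Tegra0"
    Monitor    "Monitor0"
    DefaultDepth 24
    SubSection "Display"
        Depth 24
        # Generic X mode list, tried in order:
        Modes "1280x960_60" "1680x1050"
    EndSubSection
EndSection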