DSI display

Hi all.

I’m currently trying to make the DSI output work on a custom TK1 board. I’m using a DSI-to-LVDS bridge (sn65dsi84) which is connected to the DSI-0 output. The sn65dsi84 is properly set up and works fine, and I can see the test video image from the chip. To output this test image you have to set both the DSI and LVDS parameters, and it needs the DSI clock, which is divided in order to create the LVDS clock. That means the output clock from the DSI seems to be OK.

To make it work I’ve had to set .dsi2lvds_bridge_enable = 1 in the dsi_p_wuxga_10_1_pdata struct in arch/arm/mach-tegra/panel-p-wuxga-10-1.c. I’ve also force-enabled it in __tegra_dc_dsi_init() in drivers/video/tegra/dc/dsi.c.

In dsi_p_wuxga_10_1_modes in arch/arm/mach-tegra/panel-p-wuxga-10-1.c, I’ve set .pclk to 432000000, which is the 432MHz for the DSI clock. This clock drives the sn65dsi84, and I’ve programmed the divider to 1/6, so the LVDS clock is ~72MHz. That seems to be working.

With my settings, if I enable the test pattern in the sn65dsi84 with the command:

i2cset -y 1 0x2C 0x3C 0x10

I get the pattern.
When I probe with xrandr I get:

# xrandr  
Screen 0: minimum 8 x 8, current 8 x 8, maximum 16384 x 16384
DSI-0 connected primary (normal left inverted right x axis y axis)
   1920x1080     59.97 +

I don’t get any output in the screen though.

If I update the display settings, I get these messages:

xrandr -d :0 --output DSI-0 --mode 1920x1080
[ 1566.131724] tegradc tegradc.0: DSI: initializing panel p_wuxga_10_1
[ 1566.131838] p,wuxga-10-1 panel dt support not available
[ 1566.234193] tegradc tegradc.0: nominal-pclk:135666000 parent:406500000 div:3.0 pclk:135500000 134309340~147875940 type:2
[ 1566.290822] tegradc tegradc.0: DSI pad calibration done
[ 1566.344352] tegradc tegradc.0: dsi entered ULPM!

As you can see there, for some reason the clock is set to 135MHz, although it’s set to 432MHz in the driver.
Also, during kernel boot I get this log:

[    1.322918] tegradc tegradc.0: DSI: HS clock rate is 407500
[    1.323895] tegradc tegradc.0: DSI: initializing panel p_wuxga_10_1
[    1.324222] p,wuxga-10-1 panel dt support not available
[    1.426616] tegradc tegradc.0: nominal-pclk:432000000 parent:432000000 div:1.0 pclk:432000000 427680000~470880000 type:2
[    1.515867] tegradc tegradc.0: DSI pad calibration done
[    1.519821] tegradc tegradc.0: DSI: Enabling...
[    2.125728] tegradc tegradc.0: DSI: Enabled
[    2.128922] tegradc tegradc.0: dc.c probed
[    2.129644] tegradc tegradc.0: nominal-pclk:135666000 parent:406500000 div:3.0 pclk:135500000 134309340~147875940 type:2
[    2.151126] Console: switching to colour frame buffer device 240x67
[    2.171460] tegradc tegradc.0: fb registered

Some debug prints are mine, so you won’t find them in the source. Anyway, it seems that the first time the correct clock is used, which is 432MHz, and then on every tegra_dc_update_mode(), which is called from drivers/video/tegra/dc/mode.c, a different clock is used. I haven’t set that clock and I don’t know where it comes from. In the tegra_dc_program_mode() function, which is called from tegra_dc_update_mode() to update the clocks, there is a terrifying comment which says:

/* TODO: MIPI/CRT/HDMI clock cals */

This is right before the nominal-pclk and pclk are printed. Should I worry about that? Does that mean the drivers are incomplete?

The bottom line is that I can’t get the DSI to work in the OS. Is there something I’m doing wrong? Am I missing something here?

Finally, this is the xorg.conf file I use.

# Copyright (c) 2011-2013 NVIDIA CORPORATION.  All Rights Reserved.

#
# This is the minimal configuration necessary to use the Tegra driver.
# Please refer to the xorg.conf man page for more configuration
# options provided by the X server, including display-related options
# provided by RandR 1.2 and higher.

# Disable extensions not useful on Tegra.
Section "Module"
    Disable     "dri"
    SubSection  "extmod"
        Option  "omit xfree86-dga"
    EndSubSection
EndSection

Section "Device"
    Identifier  "Tegra0"
    Driver      "nvidia"
    Option      "AllowEmptyInitialConfiguration" "true"
EndSection

Section "Monitor"
    Identifier "DSI-0"
EndSection

Section "Monitor"
    Identifier "HDMI-0"
    Option     "Ignore"
EndSection


Section "ServerLayout"
  Identifier "Main Layout"
  Screen     0 "Screen 1"
  Screen     1 "Screen 2" RightOf "Screen 1"
  Screen     "Screen 3" Relative "Screen 1" 2048 0
EndSection

Section "Screen"
    Identifier "Screen0"
    Monitor    "DSI-0"
    Device     "Tegra0"
EndSection

Section "Screen"
    Identifier "Screen1"
    Monitor    "HDMI-0"
    Device     "Tegra0"
    Option     "Ignore"
    SubSection  "Display"
        Modes  "1920x1280"
    EndSubSection
EndSection

Section "ServerLayout"
    Identifier   "X.org Configured"
    Screen       "Screen0"
EndSection

And this is my tegra_dsi_out struct:

static struct tegra_dsi_out dsi_p_wuxga_10_1_pdata = {
	.controller_vs = DSI_VS_1,
	.dsi2lvds_bridge_enable = 1,
	.n_data_lanes = 4,
	.video_burst_mode = TEGRA_DSI_VIDEO_NONE_BURST_MODE_WITH_SYNC_END,

	.pixel_format = TEGRA_DSI_PIXEL_FORMAT_24BIT_P,
	.refresh_rate = 60,
	.virtual_channel = TEGRA_DSI_VIRTUAL_CHANNEL_0,

	.panel_reset = DSI_PANEL_RESET,
	.power_saving_suspend = true,
	.video_data_type = TEGRA_DSI_VIDEO_TYPE_VIDEO_MODE,
	.video_clock_mode = TEGRA_DSI_VIDEO_CLOCK_CONTINUOUS, //TEGRA_DSI_VIDEO_CLOCK_TX_ONLY,
	.dsi_init_cmd = dsi_p_wuxga_10_1_init_cmd,
	.n_init_cmd = ARRAY_SIZE(dsi_p_wuxga_10_1_init_cmd),
	.pkt_seq = panasonic_1920_1200_vnb_syne,
};

Thanks in advance!

I don’t know much about DSI, but this line caught my attention:

tegradc tegradc.0: dsi entered ULPM!

…I’ve seen references suggesting ULPM means “ultra low power mode”. I’m wondering if things worked, but the clock and related functionality dropped back because of some sort of standby/battery-saver mode. If that is the case, then there may also be some sort of file in “/sys” to exit that mode.

Hi linuxdev.

Yes indeed, ULPM means ultra low power mode. It’s actually a state that is also called LP-11 and it’s part of the DSI protocol. Therefore, I believe it’s not really a suspend mode. Also, I’ve put in this debug message myself, in order to see that the DSI goes to the LP-11 state before uploading the configuration to the sn65dsi84 via I2C. I’m in the same boat; I don’t know much about DSI either, I only know that I have to bring up the hardware…

I’ve read several documents that explain this LP-11 state, like this one (http://ti.tuwien.ac.at/cps/teaching/courses/networked-embedded-systems/materials/Renesas%20R61523_101_091225.pdf)

The problem now is that I really don’t know what breaks the chain. I’ve seen the test image from the sn65dsi84, which means that at least the DSI-to-LVDS bridge and the LVDS panel work as expected, as does the DSI clock that feeds the bridge IC. And now I’m struggling with the kernel, as I believe the problem is probably somewhere in there.

I haven’t seen anyone get the DSI working, so I’m a little skeptical about whether it’s fully implemented in the drivers. I’ve only read that it’s supported, but with no proof or examples.

Hi Dimtass,

I think we are on same page.

Can you help me with the DSI driver on the Tegra K1? I have tried to integrate an AUO MIPI LCD B101UAN01.7 into our customized system based on the Tegra K1 CPU. So far we have checked everything around the connection to the CPU and its schematics, and it seems everything is correct. The LCD has an ORISE TECH OCT3108B-HV161 MIPI IC and is different from the usual MIPI LCDs. However, I have contacted the technical team at AUO and they said this is a passive type of MIPI LCD: there is no need for an init code or special sequence. I have enabled the TK1 DSI configuration in the kernel and changed the values in many different ways according to the TK1 Technical Reference Manual, with no success at all.

I hope you can tell us what we’re doing wrong.

Thanks,

TD

Hi TulgaD,

Well, it’s a bit complicated, so I don’t know if I can help, because we may not have the same issue.
I’ve managed to bring up the whole DSI-to-LVDS chain, but the thing is that I haven’t used a DSI monitor, but an LVDS monitor with a DSI-to-LVDS bridge. This made things a bit easier, because I could debug the monitor interface, as the bridge supports a test image without needing the DSI interface fully up and running. Therefore, I first made sure the monitor works right with the bridge and then started debugging the DSI interface.

I have to dig in and remember what I did there. But in the meantime, how did you enable the DSI, what OS are you using, and what’s your xrandr output?

Hi Dimtass,

I think we are on the same page, because the LCD has a bridge IC, which is a MIPI-to-LCD-timing-controller (OCT3108B-HV161). I believe this IC works in a pretty similar way to a MIPI-to-LVDS bridge IC.

Also, I am sure that this LCD works fine from another MIPI source, because it is used in the ASUS MeMO Pad tablet.

I use Android Lollipop (kernel version 3.10.33) and enabled the DSI from the device tree source. Also, the AUO LCD manufacturer says this LCD does not require any initial commands or sequence to light the display.

The current situation is that I have disabled all the code relating to eDP/LVDS in the kernel. Do you think this affects the MIPI video source? I think there is no video data coming from the CPU.

In your log above, are “DSI: Enabling…” and “DSI: Enabled” your debug messages? Where are they coming from? Which function?

Thanks,

Tbh, I’m not familiar with your setup. In the case of DSI-to-LVDS, there’s a file in the kernel, drivers/video/tegra/dc/dsi2lvds.c, that handles the bridging. That file is needed to do the bridge configuration via I2C. I’ve heavily edited this file for my needs, but if you don’t need any configuration then maybe you don’t need that.

The DSI interface is enabled in drivers/video/tegra/dc/dsi.c in function tegra_dc_dsi_enable() by this call: dsi->out_ops->enable(dsi);

Also, you need to set the correct panel and dsi_instance in arch/arm/mach-tegra/board-ardbeg-panel.c, in the ardbeg_panel_configure() function. So, in the ‘switch (board_out->board_id)’ I’ve used the default case to set the panel to dsi_p_wuxga_10_1 and the dsi_instance to DSI_INSTANCE_0.

default:
	panel = &dsi_p_wuxga_10_1;
	dsi_instance = DSI_INSTANCE_0;
	tegra_io_dpd_enable(&dsic_io);
	tegra_io_dpd_enable(&dsid_io);
	break;

In arch/arm/mach-tegra/panel-p-wuxga-10-1.c, I’ve changed the DSI_PANEL_RESET definition to 0. Then,
in the static struct tegra_dsi_out dsi_p_wuxga_10_1_pdata I’ve set .video_clock_mode to TEGRA_DSI_VIDEO_CLOCK_CONTINUOUS, so the DSI is forced to output data continuously and doesn’t go into an inactive state. This might make the difference for you. Also, in the same file you need to set the correct parameters in the static struct tegra_dc_mode dsi_p_wuxga_10_1_modes. There you need to set the .pclk frequency (I’ve set it to 432000000) and the rest of the parameters. You’ll find these in the display datasheet: .h_active and .v_active are your active resolution, and the porch values are in your datasheet.

I don’t remember if the device tree is fully implemented in the Tegra kernel, but I think everything is done in the board files I’ve described above.

For sure, you need to set the correct panel and dsi_instance in arch/arm/mach-tegra/board-ardbeg-panel.c, as I’ve described above.

I hope that helps a bit.

Also, the “DSI: Enabling…” and “DSI: Enabled” are custom messages. I’ve removed them, but I believe they were in the dsi2lvds.c file for the bridge, because I had to enable the bridge with some GPIOs.

Thanks for the quick reply. Surely it helps a lot. What is your kernel version?

I’m using L4T 21.5 for TK1.

That is 3.10.40

Hi Dim,

How did you set the DSI as primary, since HDMI is the primary display on the Jetson TK1? Either way, how did you activate DSI? I think the configuration is correct at the kernel level. The only thing is I can’t switch between the HDMI and DSI screens.

Thanks,

TD

I used the xorg.conf from the L4T kernel, so both HDMI and DSI are registered in X11.
Then I used xrandr to switch between the two monitors, enable mirror mode, or expand to both monitors.

If you run the xrandr command in the terminal you should see two displays.

For example, this sets DSI-0 to 1920x1080:

xrandr -d :0 --output DSI-0 --mode 1920x1080

You may want to also set it as a primary. See the xrandr help for the options you have.

In the command line, first do this:

export DISPLAY=:0

Then, for DSI-0 as primary

xrandr --primary --output DSI-0 --mode 1920x1080 --output HDMI-0 --off

HDMI primary:

xrandr --primary --output HDMI-0 --mode 1920x1080 --output DSI-0 --off

Mirror mode (DSI-0 as primary):

xrandr --primary --output DSI-0 --mode 1920x1080 --output HDMI-0 --mode 1920x1080 --same-as DSI-0

etc.

Hi dimtass,
I used the sn65dsi85 chip on the TX1. How can I bring up the chip and then configure its registers via I2C? How do I write the device tree?
Thanks

Hi wf,

I can’t tell you for sure how to do it on the TX1. On the TK1, to program the bridge over the I2C bus you need to edit the drivers/video/tegra/dc/dsi2lvds.c file, and more specifically the structs below, according to your chip and configuration.

dsi2lvds_config_clk
dsi2lvds_enable_clk
dsi2lvds_config_dsi
dsi2lvds_config_lvds
dsi2lvds_config_video
dsi2lvds_soft_reset

Each of these structs has the I2C commands that will be sent to the DSI bridge.
You also need to use the correct I2C bus in this definition:
#define DSI2LVDS_TEGRA_I2C_BUS 1

The arch/arm/mach-tegra/panel-p-wuxga-10-1.c file also has a few things that you might need to change, like dsi2lvds_bridge_enable in the tegra_dsi_out struct.

In the case of the sn65dsi8x chips, video_clock_mode in tegra_dsi_out should be set to TEGRA_DSI_VIDEO_CLOCK_CONTINUOUS, because without a continuous clock you won’t get any image.

If you’re using a PWM backlight, then make sure you set it up properly in the board files. It’s quite messy as well.

Finally, regarding the device tree: again, in the case of the TK1, you need to have this entry in your main dts file in arch/arm/boot/dts:

dsi {
	status = "okay";
	reg = <0x54300000 0x00040000>,
	      <0x54400000 0x00040000>;
};

Good luck!

If for some reason the TX1 kernel is different and you’re not able to find the drivers/video/tegra/dc/dsi2lvds.c file, then a ‘dirty’ trick is to configure the sn65dsi85 in U-Boot. Write a command that sets up the bridge over I2C and run the script before loading the kernel. Just make sure with a scope that, after setting up the bridge in the bootloader, the kernel doesn’t send other I2C commands to the sn65dsi85.
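
For example, such a U-Boot sequence could look roughly like this; the register addresses and values are placeholders only, the real ones have to come from the SN65DSI85 datasheet:

```
# U-Boot console, sketch only -- register/value pairs are placeholders
i2c dev 1                 # select the I2C bus the bridge sits on
i2c probe                 # should list the bridge address (e.g. 0x2C)
i2c mw 0x2C 0x0D.1 0x01   # placeholder: enable the PLL
i2c mw 0x2C 0x09.1 0x01   # placeholder: soft reset
```

You can put these commands in a bootcmd script so they run before the kernel is loaded.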

Hi dimtass:
Now I have a problem, can you help me?
This is the printed message: “dvdd_lcd regulator get failed” and “dsi regulator get failed”.

[   22.779169] dvdd_lcd regulator get failed
[   22.779171] dsi regulator get failed
[  OK  ] Created slice User Slice of ubuntu.
         Starting User Manager for UID 1000...
[  OK  ] Started Session c1 of user ubuntu.
         Starting RealtimeKit Scheduling Policy Service...

Part of the code in the tegra210-jetson-cv-base-p2597-2180-a01.dts

host1x {
		dc@54200000 {
			status = "okay";
			nvidia,dc-or-node = "/host1x/dsi";
		};

		dc@54240000 {
			nvidia,dc-or-node = "/host1x/sor1";
		};

		dsi {
			status = "okay";
			panel-a-wuxga-8-0 {
				status = "disabled";
			};
			panel-s-wqxga-10-1 {
				status = "okay";
			};
		};
	};

Part of the code in the tegra210-jetson-cv-base-p2597-2180-a00.dts

host1x {
		/* tegradc.0 */
		dc@54200000 {
			status = "okay";
			nvidia,dc-flags = <TEGRA_DC_FLAG_ENABLED>;
			nvidia,emc-clk-rate = <300000000>;
			nvidia,fb-bpp = <32>; /* bits per pixel */
			nvidia,fb-flags = <TEGRA_FB_FLIP_ON_PROBE>;
		};

		/* tegradc.1 */
		dc@54240000 {
			status = "okay";
			nvidia,dc-flags = <TEGRA_DC_FLAG_ENABLED>;
			nvidia,emc-clk-rate = <300000000>;
			nvidia,cmu-enable = <1>;
			nvidia,fb-bpp = <32>; /* bits per pixel */
			nvidia,fb-flags = <TEGRA_FB_FLIP_ON_PROBE>;
		};

		dsi {
			nvidia,dsi-controller-vs = <DSI_VS_1>;
			status = "okay";
			panel-a-wuxga-8-0 {
				status = "okay";
				nvidia,dsi-dpd-pads = <DSIC_DPD_EN DSID_DPD_EN>;
				nvidia,panel-rst-gpio = <&gpio TEGRA_GPIO(V, 2) 0>; /* PV2 */
				nvidia,panel-bl-pwm-gpio = <&gpio TEGRA_GPIO(V, 0) 0>; /* PV0 */
				disp-default-out {
					nvidia,out-flags = <TEGRA_DC_OUT_CONTINUOUS_MODE>;
				};
			};
			panel-s-wqxga-10-1 {
				status = "okay";
				nvidia,dsi-dpd-pads = <DSIC_DPD_EN DSID_DPD_EN>;
				nvidia,panel-rst-gpio = <&gpio TEGRA_GPIO(V, 2) 0>; /* PV2 */
				nvidia,panel-bl-pwm-gpio = <&gpio TEGRA_GPIO(V, 0) 0>; /* PV0 */
				nvidia,panel-en-gpio = <&gpio TEGRA_GPIO(V, 1) 0>; /* PV1 */
				nvidia,dsi-te-gpio = <&gpio TEGRA_GPIO(Y, 2) 0>;
				disp-default-out {
					nvidia,out-flags = <TEGRA_DC_OUT_CONTINUOUS_MODE>;
				};
			};
		}

Part of the code in the panel-s-wqxga-10-1

host1x {
		dsi {
			status = "okay";
			nvidia,dsi-controller-vs = <DSI_VS_1>;
			panel-s-wqxga-10-1 {
			status = "okay";
				nvidia,dsi-lvds-bridge= <TEGRA_DSI_ENABLE>;
				compatible = "s,wqxga-10-1";
				nvidia,dsi-instance = <DSI_INSTANCE_0>;
				nvidia,dsi-n-data-lanes = <4>;
				nvidia,dsi-pixel-format = <TEGRA_DSI_PIXEL_FORMAT_24BIT_P>;
				nvidia,dsi-refresh-rate = <61>;
				nvidia,dsi-rated-refresh-rate = <60>;
				nvidia,dsi-te-polarity-low = <TEGRA_DSI_ENABLE>;
				nvidia,dsi-video-data-type = <TEGRA_DSI_VIDEO_TYPE_COMMAND_MODE>;
				nvidia,dsi-video-clock-mode = <TEGRA_DSI_VIDEO_CLOCK_CONTINUOUS>;
				nvidia,dsi-ganged-type = <TEGRA_DSI_GANGED_SYMMETRIC_LEFT_RIGHT>;
				nvidia,dsi-controller-vs = <DSI_VS_1>;;

Hi wf, for some reason I don’t get e-mail notifications.

Regarding the regulators, I think you should check in the device-tree files whether the regulator chip is enabled for those peripherals. There’s an external regulator IC through which the TK1 controls which peripherals are powered. If you’re using a custom board and you don’t control the power using the regulator IC, then you can ignore these messages. If you are using it, you need to find which regulator needs to be enabled. Also, search for the error strings in the kernel source code and try to debug from there.
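
If the rail is really always powered on your board, one common device-tree pattern is a fixed regulator node, roughly like this (the node name, label, and voltage here are hypothetical; check what the dvdd_lcd consumer in your device tree actually references):

```
/* Sketch only: a fixed always-on regulator; names and values are hypothetical */
regulators {
	dvdd_lcd_fixed: regulator@100 {
		compatible = "regulator-fixed";
		regulator-name = "dvdd_lcd";
		regulator-min-microvolt = <1800000>;
		regulator-max-microvolt = <1800000>;
		regulator-always-on;
	};
};
```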

Hi dimtass,
thank you for your patience.
My sn65dsi85 chip has an external power supply. Is it possible to ignore these printed messages?
The sn65dsi85 has been configured via I2C. Now, with only the code above changed, I found that the Tegra has no DSI clock output. What files need to be configured?
Thanks

Hi dimtass,
Now the chip’s test image has come out. But when I quit test mode, I found the screen was black.
I think the Tegra is outputting through HDMI, not DSI. How can I get a dual-screen display, HDMI and DSI?
Thanks