10-bit Lookup Table

Hi everyone,

I’m using an NVIDIA Quadro 2000 graphics card connected to a 10-bit grayscale EIZO monitor (GX340).
This means the monitor can display 1024 distinct grayscale levels.

The aim of my project is to display an image on this screen using the full range of 1024 grayscale levels.

As you know, Windows handles only 256 grayscale levels, so one solution to extend this range is to work with colors close to grayscale values.
My program is developed in Java, and I want to avoid using CUDA to preserve the portability of my code, and also because the graphics card may be replaced in the future (possibly with a non-NVIDIA card).

On page 16 of this NVIDIA document: http://www.nvidia.com/content/quadro_fx_product_literature/TB-04631-001_v04.pdf
it is explained that an image with a value range of [0, 65535] can be displayed using a 1D RGBA lookup-table texture (provided by the NVIDIA graphics card), and the screen then performs the RGBA → grayscale conversion.

All I want to know is: which “1D RGBA lookup table” is used to map the original image to 1024 colors?

I contacted NVIDIA support, who advised me to post a message on this forum.

Thank you in advance.

Cédric.

In case you haven’t been contacted by our professional solutions business team in the meantime, here’s what I would recommend.

The GX340 monitor needs to be connected via the DisplayPort connector to support the higher color depth output. The display driver doesn’t support that over the DVI connection on that monitor model.

To display 10-bit greyscale images inside your own program, you should select a 30-bit OpenGL pixel format in your application, as described in the document you cited, and then perform the conversion from your RGB color input data to greyscale however you need it, e.g. with a GLSL shader that takes the input colors, applies some mapping (luminance, intensity, an arbitrary lookup table, etc.), and writes the resulting greyscale value into all three RGB channels while blitting the image to the screen. A minimal sketch of such a shader follows below.
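Since your program is in Java, here is a rough sketch of what that shader could look like, kept as a Java string constant for upload through whatever GL binding you use (JOGL is my assumption; the class name and the uniform names are placeholders of my choosing, not something from the whitepaper):

```java
public final class GreyShader {
    // Sketch only: a GLSL 1.20 fragment shader stored as a Java string,
    // to be uploaded with your binding's glShaderSource (e.g. JOGL).
    // It samples the input color, reduces it to a single grey value,
    // optionally remaps it through a 1D lookup texture, and writes the
    // result into all three RGB channels.
    public static final String FRAGMENT_SOURCE =
          "#version 120\n"
        + "uniform sampler2D image; // input color image\n"
        + "uniform sampler1D lut;   // optional remapping table (e.g. 1024 entries)\n"
        + "void main() {\n"
        + "    vec3 c = texture2D(image, gl_TexCoord[0].st).rgb;\n"
        + "    float g = dot(c, vec3(0.299, 0.587, 0.114)); // Rec. 601 luminance\n"
        + "    g = texture1D(lut, g).r;                     // arbitrary LUT mapping\n"
        + "    gl_FragColor = vec4(g, g, g, 1.0);\n"
        + "}\n";
}
```

Rendered into a 30-bit pixel format, the grey value keeps 10 bits of precision per channel on its way to the monitor.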

If there is any lookup table inside the driver which converts the 24-bit Windows desktop colors for the greyscale monitor, it will not be applied to your 30-bit OpenGL pixel-format surfaces.

Hello,

Thank you for your answer.
But I’m not sure I understood what you meant.

Just to be clear (and to be sure): in your view, there is no way to get a table following this model:

[image attachment: diagram of the proposed lookup-table model]

Because with this kind of table, it would be easy to create a LUT of 1024 RGB values that, when converted to greyscale by the screen, gives the same image as the graphics card.

Thank you in advance.

Cédric.

That would mean you rely on some internal driver behavior that you do not control, which runs contrary to your portability goal. To use the deep color formats you need supporting hardware and software anyway.

The more reliable and programmable approach is to display your deep color images with OpenGL natively, on 10-bit-per-channel pixel formats.
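As an illustration only, requesting such a pixel format from Java could look like this (I’m assuming the JOGL binding here; check your binding’s documentation for the exact API):

```java
import com.jogamp.opengl.GLCapabilities;
import com.jogamp.opengl.GLProfile;
import com.jogamp.opengl.awt.GLCanvas;

public final class DeepColorCanvas {
    // Sketch only (assumes JOGL): request a 30-bit (R10G10B10A2) pixel
    // format. Drivers may silently fall back to 8 bits per channel if the
    // GPU/monitor path does not support deep color, so verify the
    // capabilities actually granted at runtime.
    public static GLCanvas create() {
        GLProfile profile = GLProfile.get(GLProfile.GL2);
        GLCapabilities caps = new GLCapabilities(profile);
        caps.setRedBits(10);
        caps.setGreenBits(10);
        caps.setBlueBits(10);
        caps.setAlphaBits(2); // 30 color bits + 2 alpha bits = 32 bpp
        return new GLCanvas(caps);
    }
}
```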

The OpenGL example code for selecting and using 30-bit color formats on 10- or 12-bit grayscale monitors can be found in the “Whitepaper, Sample Code, Demos” section on this site:
[url]http://www.nvidia.com/object/quadro-product-literature.html[/url]