Single Buffering in OptiX?

Hi, while reading through the tutorial in OptiX, I'd like to experiment with single buffering for some research purposes (the original code uses double buffering).

It looks like simply changing the code to glutInitDisplayMode(…GLUT_SINGLE) and replacing the glutSwapBuffers() call with glFlush() doesn't work. (The output still looks like it is double buffered, since there are no flickers or tearing in the image.)
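Roughly what I changed, as a simplified sketch (not the exact tutorial.cpp code; the OptiX launch and buffer display are left as a placeholder):

```cpp
// Simplified sketch of the attempted change (not the exact tutorial.cpp code).
glutInitDisplayMode(GLUT_SINGLE | GLUT_RGB);   // was GLUT_DOUBLE | GLUT_RGB

void display()
{
    // ... launch OptiX and draw the output buffer here ...
    glFlush();                                 // was glutSwapBuffers()
}
```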

Does OptiX allow the user to render with single buffering, or to draw directly to the front buffer and display it without any buffer swapping? Or maybe I haven't done something right yet to make single buffering work; would somebody please point out what needs to be done? Thanks!

Which OS and graphics board?

If this is any of the Windows Vista, 8, or 10 OSes, then the desktop compositor is always active and will present the image as a whole even when rendering front buffered, so the behavior you see is to be expected.

OptiX doesn’t know about any windowing system or front or back buffering. This is all related to the API you use to display the image.

With the compositor active, tearing should only happen if you disable “vertical sync” in the NVIDIA control panel (under “Manage 3D Settings”). For single buffering that might be enough. For double buffering you'd need to make sure to pick a pixelformat with the PFD_SWAP_EXCHANGE flag set.
Under Windows 10 you probably need to run in full-screen exclusive mode to get rid of any Windows OS compositor synchronization. For OpenGL that means a single window with client area == desktop size.
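GLUT doesn't expose the pixelformat selection, but with your own Win32/WGL window setup you could enumerate the available formats and check for that flag. A minimal sketch under those assumptions (the function name and the bit-depth check are just an example, not SDK code):

```cpp
// Sketch: enumerate Windows pixel formats and look for PFD_SWAP_EXCHANGE.
// Assumes a raw Win32/WGL setup; GLUT does not expose this choice.
#include <windows.h>

int findSwapExchangeFormat(HDC hdc)
{
    // Passing NULL for the descriptor returns the number of available pixel formats.
    const int count = DescribePixelFormat(hdc, 1, sizeof(PIXELFORMATDESCRIPTOR), NULL);
    for (int i = 1; i <= count; ++i)
    {
        PIXELFORMATDESCRIPTOR pfd = {};
        DescribePixelFormat(hdc, i, sizeof(pfd), &pfd);

        const DWORD required = PFD_SUPPORT_OPENGL | PFD_DRAW_TO_WINDOW |
                               PFD_DOUBLEBUFFER | PFD_SWAP_EXCHANGE;
        if ((pfd.dwFlags & required) == required &&
            pfd.iPixelType == PFD_TYPE_RGBA && pfd.cColorBits >= 24)
        {
            return i; // use with SetPixelFormat() before creating the GL context
        }
    }
    return 0; // no matching format found
}
```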

Hi, I'm using Windows 10 and a GeForce GTX 950M with OptiX 4.1.1 & CUDA 8.0 (building the project with MSVS 2015).

It looks like (at least on my laptop) disabling “vertical sync” alone doesn't make single buffering work yet. Since tutorial.cpp (and also some of the other examples in the SDK) uses GLUT to create the window, I'm not sure what running in full-screen exclusive mode means exactly, and it seems that glutFullScreen() doesn't do the trick (I'm sure I did something wrong here but am not sure how to fix it).

Double buffering works fine right now (although GLUT hides the details of the PIXELFORMATDESCRIPTOR) without specifically picking a pixelformat with the PFD_SWAP_EXCHANGE flag.

Since I wish to see what happens as soon as the color of a pixel is calculated, single buffering is preferred over double buffering (and I'm going to build some other things on top of this). Could you please point out what might be causing single buffering not to work, or might there be an alternative way to do this? Thank you.

First, single buffering should be working; it's just not being presented the way you intend to view it.

With any desktop compositor in current OSes there is no way to write individual values to the surface you're currently viewing on the monitor. That means the display scan-out does not happen from the surface you're rendering to, not even with single buffered pixelformats.
Instead everything goes into a backing store first and is then displayed during a “present” operation of the compositor, which normally happens during a vertical blank to stay tear free. It would be extremely inefficient to display individual pixels this way.
The last time this worked more directly was under Windows XP which didn’t have a desktop compositor like current OSes beginning with Windows Vista.

Also, OptiX outputs its results into CUDA buffers. Those cannot be displayed directly on screen; they need a detour through some graphics API to make it to the display, for example as a texture blit in OpenGL as done inside the OptiX SDK examples.
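As a rough sketch of that detour, assuming an RGBA8 output buffer and the OptiX 4.x C++ wrapper (the SDK's sutil helper actually goes through a pixel buffer object and a texture blit, which is faster than this readback):

```cpp
#include <optixu/optixpp_namespace.h>   // OptiX 4.x C++ wrapper (optix::Buffer)
#include <GL/gl.h>

// Sketch only: map the OptiX output buffer to the host and hand it to OpenGL.
// Assumes "buffer" is an RT_FORMAT_UNSIGNED_BYTE4 buffer of size width x height.
void displayFrame(optix::Buffer buffer, int width, int height)
{
    void* data = buffer->map();        // read back the CUDA-side buffer contents
    glDrawPixels(width, height, GL_RGBA, GL_UNSIGNED_BYTE, data);
    buffer->unmap();
    glFlush();                         // or glutSwapBuffers() when double buffered
}
```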