Building the AI Denoiser DLL from source

Hi!

I am building a toy path tracer in Unity, and I’d love to test drive the NVIDIA OptiX AI Denoiser as a post-process.
I would need the optixDenoiser* entry points available in a DLL so I can call them from C#.
For instance, I was able to integrate Intel Open Image Denoise very quickly this way.

However, I can’t find clear instructions on how to build a DLL for the NVIDIA AI Denoiser.
The SDK samples seem to statically link all the OptiX libraries, so there is no DLL to extract.

I’m not terribly familiar with C++ toolchains, so could someone explain how to coerce the SDK into building a DLL that I can use externally?

Thanks!

The OptiX 7 denoiser functionality is accessed via the same OptiX 7 API entry point function table as the rest of the API.
The OptiX 7 API is just a set of headers and all function implementations are inside the NVIDIA display driver.
[url]https://raytracing-docs.nvidia.com/optix7/api/html/struct_optix_function_table.html[/url]
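
If it helps, the bare-bones setup in a single C++ translation unit usually looks something like this (header names as shipped with the OptiX 7 SDK):

#include <optix.h>
#include <optix_function_table_definition.h> // defines the global function table; include in exactly one .cpp
#include <optix_stubs.h>                     // inline wrappers that forward every optix* call to that table

// Call once at startup: locates the OptiX library shipped with the display driver
// (nvoptix.dll on Windows) and fills in the function table.
OptixResult result = optixInit();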

Since OptiX 7 applications are written in CUDA, you would need to decide if you want to use the CUDA Runtime API or CUDA Driver API in your host code.
The CUDA Runtime API links against the CUDA runtime library (cudart.lib or cudart_static.lib). With the former (dynamic linking), the respective CUDA Runtime DLL for the matching toolkit version (e.g. cudart64_101.dll for CUDA 10.1) needs to ship with your application.
The CUDA Driver API doesn’t need that because its DLL ships with the driver.
Your application would need to include the cuda.h and/or cuda_runtime.h headers.
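
As a rough sketch, both routes end up providing the CUDA context you hand to optixDeviceContextCreate (passing 0 there means "use the current context"); the function names below are just placeholders:

#include <cuda.h>          // CUDA Driver API
#include <cuda_runtime.h>  // CUDA Runtime API

// Runtime API route: any runtime call lazily creates the primary context on the current device.
void initCudaViaRuntimeApi()
{
    cudaFree(0);
}

// Driver API route: nvcuda.dll ships with the display driver, no cudart dependency.
CUcontext initCudaViaDriverApi()
{
    CUdevice  device  = 0;
    CUcontext context = nullptr;
    cuInit(0);
    cuDeviceGet(&device, 0);
    cuDevicePrimaryCtxRetain(&context, device);
    cuCtxSetCurrent(context);
    return context;
}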

In summary, your DLL must set up the OptiX 7 function table, create a CUDA context and CUDA stream, and implement a few exported functions with whatever interface you like that handle the input and output buffers used inside the OptiX 7 denoiser functions.
Then somehow call that from C# managed code. (Not my expertise.)
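
Just to illustrate the general shape (these are hypothetical names, not an actual API), the exported C interface of such a DLL could look something like this, with the OptixDeviceContext, OptixDenoiser and CUDA stream held in internal state behind it:

// Hypothetical exports; pick whatever interface fits your engine.
extern "C" __declspec(dllexport) int  DenoiserCreate(unsigned int width, unsigned int height);
extern "C" __declspec(dllexport) int  DenoiserRun(const float* noisyRGBA, const float* albedoRGBA, float* denoisedRGBA);
extern "C" __declspec(dllexport) void DenoiserDestroy();

On the managed side these would then be bound with the usual P/Invoke declarations.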

Please refer to the OptiX 7 documentation for an explanation of the OptiX 7 denoiser functions.
[url]https://raytracing-docs.nvidia.com/optix7/index.html[/url]
[url]https://raytracing-docs.nvidia.com/optix7/api/html/group__optix__host__api__denoiser.html[/url]
[url]https://raytracing-docs.nvidia.com/optix7/guide/index.html#ai_denoiser#nvidia-ai-denoiser[/url] (out of date)
(Do not use uchar3 or uchar4 input formats or normal buffers. Neither is implemented, yet. Start with float4 or half4.)
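
For example, a noisy float4 RGBA input would be described roughly like this (d_color, width and height being your own device pointer and image dimensions):

OptixImage2D colorLayer = {};
colorLayer.data               = d_color;            // CUdeviceptr to the float4 beauty buffer
colorLayer.width              = width;
colorLayer.height             = height;
colorLayer.rowStrideInBytes   = width * sizeof(float4);
colorLayer.pixelStrideInBytes = sizeof(float4);
colorLayer.format             = OPTIX_PIXEL_FORMAT_FLOAT4;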

Hi! Thanks for the detailed response.

I was able to get most of the way there, but I’m hitting an issue when initializing the denoiser.

The rough sequence of calls I make is:

cudaFree(0); // initialize the CUDA runtime / primary context
CUcontext cuCtx = 0; // 0 means "use the current CUDA context"

optixInit(); // load the OptiX entry points from the display driver

OptixDeviceContextOptions contextOptions = {};
contextOptions.logCallbackFunction = logCallback;
contextOptions.logCallbackLevel = 4;

OptixDeviceContext context;
optixDeviceContextCreate(cuCtx, &contextOptions, &context);

OptixDenoiserOptions denoiserOptions = {};
denoiserOptions.inputKind = OPTIX_DENOISER_INPUT_RGB_ALBEDO_NORMAL;
denoiserOptions.pixelFormat = OPTIX_PIXEL_FORMAT_FLOAT4;

OptixDenoiser denoiser;
optixDenoiserCreate(context, &denoiserOptions, &denoiser);

// This is the call that fails:
optixDenoiserSetModel(denoiser, OPTIX_DENOISER_MODEL_KIND_HDR, nullptr, 0);

optixDenoiserDestroy(denoiser);
optixDeviceContextDestroy(context);

When calling optixDenoiserSetModel, I get this error from the log callback function:

ERROR - Could not set model with with the supplied kind: 0x2323

0x2323 corresponds to the value of OPTIX_DENOISER_MODEL_KIND_HDR, but somehow it’s not happy about it.
Is there something I’m missing?

Thanks again!

Please read the last line of my post again. :-)

Normal buffer inputs to the denoiser have never been supported, going all the way back to OptiX 5.1.0.
The OptiX 7 API documentation on the denoiser unfortunately does not mention this; it is only listed inside the OptiX 7 release notes.

Instead of OPTIX_DENOISER_INPUT_RGB_ALBEDO_NORMAL, use OPTIX_DENOISER_INPUT_RGB or, better, OPTIX_DENOISER_INPUT_RGB_ALBEDO.
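
Applied to your snippet, that would be:

OptixDenoiserOptions denoiserOptions = {};
denoiserOptions.inputKind = OPTIX_DENOISER_INPUT_RGB_ALBEDO; // beauty + albedo, no normal buffer
denoiserOptions.pixelFormat = OPTIX_PIXEL_FORMAT_FLOAT4;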

Thanks! I totally missed that, sorry…

I managed to get my first OptiX denoised results just now, pretty happy about it :)

However, I find that HDR mode with FLOAT3 buffers results in artifacts for very high values (7.0 in both albedo and color buffers). Is that supported in the current version?

[url]https://i.imgur.com/3wBSaED.png[/url]

>>7.0 in both albedo and color buffers<<

That’s an incorrect input to the albedo buffer. The HDR denoiser handles color values in the range [0, 10000] inside the noisy RGB buffer, but the albedo buffer must not have values higher than 1.0.

Find the paragraph about the albedo buffer contents here:
[url]https://raytracing-docs.nvidia.com/optix7/guide/index.html#ai_denoiser#nvidia-ai-denoiser[/url]

This OptiX 5.1.0-based example does exactly that by clamping the final albedo result (the normal buffer calculations there are just for reference and not actually used):
[url]https://github.com/nvpro-samples/optix_advanced_samples/blob/master/src/optixIntroduction/optixIntro_10/shaders/raygeneration.cu#L161[/url]
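
The same idea in your own ray generation code can be as simple as clamping the albedo into [0, 1] before writing it out; buffer and variable names below are placeholders:

// CUDA device code: __saturatef() clamps a float to [0.0, 1.0].
albedoBuffer[launchIndex] = make_float4(__saturatef(albedo.x),
                                        __saturatef(albedo.y),
                                        __saturatef(albedo.z),
                                        1.0f);
// The noisy color buffer itself may stay in HDR range for the HDR model.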