How can I make a renderer with OptiX?

I’m learning to build a renderer with OptiX, and I found an SDK sample named optixMeshViewer that is quite helpful.
The functionality I have in mind is quite similar to this sample: load an OBJ file and output a picture. But how can I process the materials in the OBJ file? Can you give me some pointers? Thanks a lot.

Specifically, I want to make the cow in this sample transparent. How can I do that?

For an example using a transparent material, please have a look at the sticky thread in this forum with the title and link to the “OptiX advanced samples on github”. Those use a PLY model as input, but that shouldn’t matter.

If you’re actually looking at implementing the OBJ MTL material model completely, that would be a little more involved (and it wouldn’t work for cow.obj anyway, because that model has no accompanying *.mtl file).

Thank you very much. I will check the advanced samples!

Actually, I’m new to graphics programming. I found the advanced samples quite helpful, especially optixProgressivePhotonMap and optixGlass. But I’m still confused about some things, for example: what is a “progressive photon map”? Could you recommend some books on ray tracing and computer graphics? Thanks a lot!

Computer graphics literature is extensive, and this forum is probably not the place to fully explore it.

For ray tracing, you could do worse than “Realistic Ray Tracing”, by Shirley and Morley, which will take you through writing a basic forward ray tracer from scratch, to the point where you can render some interesting images. After that, the PBRT book from Pharr et al is more comprehensive, and has lots of references to research papers.

Googling for “progressive photon map” should turn up a research paper for that, but I’d start with more general material first.

Thank you very much. The book “Realistic Ray Tracing” looks like a very good choice for learning ray tracing!

Sorry to disturb you again.
I read the tutorial program and the corresponding document, titled “NVIDIA OptiX Ray Tracing Engine”. When reading section 1.2, I got quite confused about geometric_normal and shading_normal. I know the definition of these normals, but I can’t find where they are computed.
I know these variables are used on the GPU, so I assumed the normals are computed on the CPU and then transferred to GPU global memory, but I can’t find the code doing that. So I’m quite confused.
Can you give me some pointers? Thanks a lot!

Oh, I mean the “NVIDIA OptiX Ray Tracing Engine Quickstart Guide”, section 1.2.

Please have a look at the actual sources of that optixTutorial example.
If you search for the user-defined attribute semantics “geometric_normal” or “shading_normal” in there, you’ll find the assignments to them inside the intersection programs.

This is true for any attribute used inside OptiX. Attributes must be written between a pair of rtPotentialIntersection() and rtReportIntersection() calls. Other locations are not allowed.
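
For illustration, here is a minimal triangle intersection program sketch in OptiX device code. The buffer names and the use of optix::intersect_triangle() are assumptions made for this sketch (the actual tutorial sources differ in detail); the important part is where the attribute writes sit:

[code]
#include <optix.h>
#include <optixu/optixu_math_namespace.h>

using namespace optix;

// Attribute variables, matched by semantic name with the closest hit / any hit programs.
rtDeclareVariable(float3, geometric_normal, attribute geometric_normal, );
rtDeclareVariable(float3, shading_normal,   attribute shading_normal, );

rtDeclareVariable(optix::Ray, ray, rtCurrentRay, );

// Hypothetical geometry buffers filled by the host application.
rtBuffer<float3> vertex_buffer;
rtBuffer<int3>   index_buffer;

RT_PROGRAM void triangle_intersect(int primIdx)
{
  const int3 vi = index_buffer[primIdx];
  const float3 p0 = vertex_buffer[vi.x];
  const float3 p1 = vertex_buffer[vi.y];
  const float3 p2 = vertex_buffer[vi.z];

  float3 n;
  float t, beta, gamma;
  if (intersect_triangle(ray, p0, p1, p2, n, t, beta, gamma))
  {
    if (rtPotentialIntersection(t))
    {
      // Attributes may only be written between rtPotentialIntersection() and rtReportIntersection().
      geometric_normal = normalize(n);
      shading_normal   = geometric_normal; // no per-vertex normals in this minimal sketch
      rtReportIntersection(0);             // material index 0
    }
  }
}
[/code]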

Here’s my recommendation on how to start learning OptiX:
[url]https://devtalk.nvidia.com/default/topic/915251/?comment=4800794[/url]
Just replace the old versions mentioned there with OptiX 4.1.1 and CUDA 8.0.

Thanks a lot! Actually, I had read the code, but I missed that part before. Sorry!

Hi
I read more of the NVIDIA OptiX Ray Tracing Engine Quickstart Guide, and I have two new questions.

  1. I found that recursive functions are used, so some older graphics cards (for example my GT240) can’t be used with OptiX. Is that right?
  2. Section 1.10 has a figure with the word “Interior” in the middle. I’m quite confused by this figure. Why is the ray a straight line? I mean, if this convex hull is made of some transparent material, then when a ray enters it, the refracted ray should change direction. Am I right?
Thanks a lot! I’m new here and have quite a lot of questions.

1.) The GT240 variants have pre-Kepler GPUs which aren’t supported by OptiX 4 anymore. You could try to use OptiX 3.9.2 instead. Don’t expect performance wonders from that board with GPU ray tracing, though. I would rather recommend upgrading to a recent Pascal GPU board, which would be more fun to work with.

This has nothing to do with recursive functions. OptiX takes care to resolve recursive rtTrace() calls, provided that there is enough stack space reserved.
It’s not recommended to use recursive functions elsewhere. While that might work in some cases in the newest OptiX versions, it would prevent potential optimizations.
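
As a rough host-side sketch (optixu C++ wrapper; the byte count is only a placeholder and needs tuning for your actual recursion depth):

[code]
#include <optixu/optixpp_namespace.h>

void createContext()
{
  optix::Context context = optix::Context::create();
  context->setRayTypeCount(1);
  context->setEntryPointCount(1);
  // Reserve enough stack for the recursion depth of your recursive rtTrace() calls.
  // Too small a value results in stack overflow exceptions at runtime.
  context->setStackSize(2048); // bytes; 2048 is only a placeholder value
}
[/code]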

2.) That picture visualizes the ray intersection behavior of that special “convex hull” geometry example. It’s just about the geometry at that point, not about materials or light transport.

Once the closest hit has been determined, OptiX calls into the respective closest hit program of the material index given as the rtReportIntersection() argument. This is where the refraction calculations for transparent materials would happen, which determine the next ray’s origin and direction, and so forth.
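
Just to illustrate where that happens, here is a heavily simplified closest hit sketch for a purely refractive material (no Fresnel term, no reflection, no absorption). The payload struct and the variables refraction_index, max_depth, scene_epsilon and top_object are assumptions for this sketch; the optixGlass advanced sample shows the complete picture:

[code]
#include <optix.h>
#include <optixu/optixu_math_namespace.h>

using namespace optix;

struct PerRayData_radiance
{
  float3 result;
  int    depth;
};

rtDeclareVariable(rtObject, top_object, , );
rtDeclareVariable(float,    scene_epsilon, , );
rtDeclareVariable(int,      max_depth, , );
rtDeclareVariable(float,    refraction_index, , ); // material variable, e.g. 1.5 for glass

rtDeclareVariable(optix::Ray, ray, rtCurrentRay, );
rtDeclareVariable(float, t_hit, rtIntersectionDistance, );
rtDeclareVariable(PerRayData_radiance, prd, rtPayload, );
rtDeclareVariable(float3, shading_normal, attribute shading_normal, );

RT_PROGRAM void closest_hit_refract()
{
  const float3 hit_point = ray.origin + t_hit * ray.direction;
  const float3 n = normalize(rtTransformNormal(RT_OBJECT_TO_WORLD, shading_normal));

  float3 refracted;
  if (prd.depth < max_depth && refract(refracted, ray.direction, n, refraction_index))
  {
    // Continue the path with a new ray starting at the hit point.
    PerRayData_radiance new_prd;
    new_prd.result = make_float3(0.0f);
    new_prd.depth  = prd.depth + 1;

    optix::Ray new_ray = optix::make_Ray(hit_point, refracted, 0, scene_epsilon, RT_DEFAULT_MAX);
    rtTrace(top_object, new_ray, new_prd);
    prd.result = new_prd.result;
  }
  else
  {
    prd.result = make_float3(0.0f); // recursion limit or total internal reflection (ignored here)
  }
}
[/code]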

Thanks a lot! I will read the code more carefully!

OK, I have some new questions. Are there any examples demonstrating how to map a picture onto an object in OptiX? Thanks a lot!
I really admire you NVIDIA guys. OptiX is quite a good product.

And is there any official API in OptiX which can load Maya FBX files? Thank you!

Sure, the OptiX 3.9.x SDK contains multiple examples using textures.
The “cook” example is a classic one, and “rayDifferentials” is a newer one using bindless textures.

Simply search through the *.cu sources for “tex2D” or “rtTex2D” to find more cases.
The latter is the bindless texture equivalent and offers access to all supported texture types (1D, 2D, 3D, cube, array). Bindless textures are the recommended way to manage textures because they are not limited by the number of hardware texture units. They are available in hardware on Kepler GPUs and above. If you’re still working with a Fermi GPU, I would really recommend updating the GPU.
Also see this explanation: https://devtalk.nvidia.com/default/topic/1020023/?comment=5192770
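
Here is a minimal sketch of what a bindless texture lookup looks like. The names diffuse_map_id and texcoord and the payload struct are just examples for this sketch, not taken from a specific SDK sample:

[code]
#include <optix.h>
#include <optixu/optixu_math_namespace.h>

struct PerRayData_radiance
{
  float3 result;
  int    depth;
};

rtDeclareVariable(int,    diffuse_map_id, , );             // bindless texture sampler id, set by the host
rtDeclareVariable(float3, texcoord, attribute texcoord, ); // written in the intersection program
rtDeclareVariable(PerRayData_radiance, prd, rtPayload, );

RT_PROGRAM void closest_hit_textured()
{
  // Sample the texture through its bindless id; no texture unit binding involved.
  const float4 texel = rtTex2D<float4>(diffuse_map_id, texcoord.x, texcoord.y);
  prd.result = optix::make_float3(texel); // just return the diffuse map color
}

// Host side (optixu C++ wrapper), omitting the wrap/filter mode setup:
//   optix::TextureSampler sampler = context->createTextureSampler();
//   sampler->setBuffer(0u, 0u, texture_buffer);
//   material["diffuse_map_id"]->setInt(sampler->getId());
[/code]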

OptiX is a ray casting SDK, not a renderer implementation. The geometry inside OptiX is freely programmable. The API itself doesn’t even know what kind of geometry you’re ray tracing!

Dedicated file format loaders are the responsibility of the application developer using the OptiX API.
The SDK examples built on top of the OptiX API demonstrate one way to load some simple formats like OBJ or PLY, and that’s also just an example.

PS: There is no need to quote my last answers on additional questions in the thread if there isn’t any post in between. Thanks. (I removed the quotes.)

I bought a new GPU, a GTX 1050 Ti, so I can use OptiX 4.1 now.
Thank you for your reply.
I quoted your last answers because I thought that if I didn’t, you might not notice my questions. Hahaha
I’m working on a renderer, and my boss said it should load FBX data exported from Maya. He said that FBX files contain information about maps and textures, but OBJ files don’t. Quite reasonable. The problem is that the FBX format isn’t human-readable, so I had those questions. LOL

Ok, then you should use the bindless texture calls (rtTex2D and the like) for full support.

When providing version information, especially when reporting issues, the precise number (OptiX major.minor.micro) is required. It’s normally recommended to use the latest available version.

I can’t help you on the FBX importer project other than by pointing here: [url]https://www.autodesk.com/products/fbx/overview[/url]
Get the FBX SDK and maybe its extensions SDK and good luck! Please use Autodesk’s forums in case of questions about it.

OBJ files themselves don’t contain textures, but there can be mtllib and usemtl statements inside them which reference an accompanying *.mtl material library text file, and that file in turn can reference texture maps as well.
FBX is definitely the mightier format.
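
Just as a hypothetical illustration (file and material names made up), the relevant pieces look like this:

[code]
# example.obj (excerpt)
mtllib example.mtl
v 0.0 0.0 0.0
v 1.0 0.0 0.0
v 0.0 1.0 0.0
usemtl glass
f 1 2 3

# example.mtl (excerpt)
newmtl glass
Kd 0.8 0.8 0.8
Ni 1.5                 # index of refraction
d 0.3                  # dissolve (1.0 = fully opaque)
map_Kd glass_diffuse.png
[/code]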