[SOLVED] Constant Medium Sphere

Hi!

I’ve seen @joaovbs96’s post https://devtalk.nvidia.com/default/topic/1045484/optix/examples-of-participating-media-volumes-in-optix-/post/5305710/#5305710 and the related blog post https://joaovbs96.github.io/optix/2018/12/24/next-week.html#ch8

I integrated the volume sphere (constant medium only, from chapter 8) into a path tracer based on Detlef’s OptiX Advanced Introduction samples. Generally the constant medium sphere works fine, but as seen in the attachment:
1. some black artefact pixels remain (even after HDR denoising). What could they possibly be? How could I remove them?
2. RNG noise seems to require long convergence times. What noise type is recommended for this? Blue noise?
I tried DRand48 noise (as presented in the example), but even after 500 iterations there are still visible artefacts.
3. the environment map is not visible through the medium (although the density is 0.0001f); shouldn’t the intersection program simply pass through at the sphere boundary?

UPDATE:
1. the “black dots” were caused by negative depth values, which made my engine skip those pixels; solved now!
2. see question: https://devtalk.nvidia.com/default/topic/1060776/optix/best-noise-for-pathtracer-/
3. after adding this code:

if (hit_distance > (distance_inside_boundary - 0.01f))  return;

in hit_sphere() the env map is visible at lower density

In the closest hit (isotropic) program I use:

thePrd.radiance   = make_float3(0.0f);                          // instead of prd.out.emitted = emitted();
thePrd.pdf        = 1.0f;                                       // to avoid termination of the ray
thePrd.f_over_pdf = const_color;                                // instead of prd.out.attenuation
thePrd.pos        = hit_rec_p;                                  // instead of prd.out.scattered_origin
thePrd.wi         = random_in_unit_sphere(make_float3(rng2(thePrd.seed), rng(thePrd.seed))); // instead of prd.out.scattered_direction
thePrd.distance   = optix::length(thePrd.pos - theRay.origin);  // instead of theIntersectionDistance;
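
For reference, random_in_unit_sphere() receives three uniform random numbers (a float2 from rng2() plus one from rng()). A minimal sketch of how such a helper could map them to a point inside the unit sphere (the actual helper may use rejection sampling instead):

// Sketch only: maps three uniform samples in [0,1) to a point inside the unit sphere.
__forceinline__ __device__ optix::float3 random_in_unit_sphere(const optix::float3& u)
{
  const float r     = cbrtf(u.x);               // cube root keeps the volume density uniform
  const float theta = acosf(1.0f - 2.0f * u.y); // polar angle
  const float phi   = 2.0f * M_PIf * u.z;       // azimuth

  return optix::make_float3(r * sinf(theta) * cosf(phi),
                            r * sinf(theta) * sinf(phi),
                            r * cosf(theta));
}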

changes in intersection programs:

  • hit_rec_u, hit_rec_v removed, since they are not used anywhere for a constant medium (only one color, no texture)
  • hit_rec_normal removed, since it is not used in the closest hit
#define FLT_MAX 1e30

distance_inside_boundary *= optix::length(ray.direction);        // instead of vec3f(ray.direction).length();
float hit_distance = -(1.0f / density) * log(rng(thePrd.seed));   // instead of -(1.0f / density) * log((*prd.in.randState)());
float temp = rec1 + hit_distance / optix::length(ray.direction);  // instead of rec1 + hit_distance / vec3f(ray.direction).length();
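
For context, the pieces above fit into the combined boundary/medium intersection program roughly like this (a simplified sketch; center, radius, density, rng() and the hit_rec_p attribute are declared elsewhere, and the exact code may differ):

// Simplified sketch of the combined boundary/medium intersection program
// (assumes the usual optixu_math includes and "using namespace optix;" of the device code).
RT_PROGRAM void hit_sphere(int pid)
{
  // Boundary entry/exit distances from the standard ray/sphere quadratic.
  const optix::float3 oc = ray.origin - center;
  const float a = optix::dot(ray.direction, ray.direction);
  const float b = optix::dot(oc, ray.direction);
  const float c = optix::dot(oc, oc) - radius * radius;
  const float discriminant = b * b - a * c;
  if (discriminant <= 0.0f)
    return;

  const float root = sqrtf(discriminant);
  float rec1 = (-b - root) / a; // entry distance
  float rec2 = (-b + root) / a; // exit distance

  rec1 = fmaxf(rec1, 0.0f);     // handle rays starting inside the medium
  if (rec1 >= rec2)
    return;

  float distance_inside_boundary = rec2 - rec1;
  distance_inside_boundary *= optix::length(ray.direction);

  // Sample a scattering distance inside the constant medium.
  const float hit_distance = -(1.0f / density) * log(rng(thePrd.seed));

  // The added epsilon check: if the sampled distance reaches (almost) the far
  // boundary, report no hit so the ray continues on to the environment map.
  if (hit_distance > (distance_inside_boundary - 0.01f))
    return;

  const float temp = rec1 + hit_distance / optix::length(ray.direction);
  if (rtPotentialIntersection(temp))
  {
    hit_rec_p = ray.origin + ray.direction * temp; // scatter position attribute for the closest hit
    rtReportIntersection(0);
  }
}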

in ray gen:
Min Path = 50 (so Russian roulette is never applied)
Max Path = 50
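
Roughly, the Russian roulette sits in the ray generation loop like this (a simplified sketch, not the exact sample code), which is why equal minimum and maximum path lengths disable it entirely:

// Simplified sketch of the path loop; with minPathLength == maxPathLength == 50
// the roulette branch below is never reached before the loop ends.
for (int depth = 0; depth < maxPathLength; ++depth)
{
  // ... rtTrace() the next path segment, accumulate radiance, update throughput ...

  if (minPathLength <= depth) // Russian roulette only after the minimum path length
  {
    const float p = fmaxf(throughput.x, fmaxf(throughput.y, throughput.z));
    if (p < rng(prd.seed)) // terminate the path probabilistically
      break;
    throughput /= p;       // compensate the surviving path's throughput
  }
}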

Thanx for any advice!

System: OptiX 6.0.0 SDK CUDA 10.0 GTX 1050 2GB Win10PRO 64bit (version 1809) device driver: 431.36 VS2017/VS2019 (toolkit v140 of VS2015)

The intersection program doesn’t pass through anything; it just calculates the surface intersection.
Passing through something needs to be handled by the closest hit program, which would need to set the proper ray origin and direction for the continuation ray in case of a transmission.
My OptiX Introduction examples handle that for absorption already. For some atmospheric volume you would use a specular_reflection_transmission BSDF with IOR 1.0f to not get any refraction effects.
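
In the simplest case that boils down to moving the ray origin to the hit point and continuing in the same direction. A minimal sketch of the idea (not the actual BSDF implementation in the examples), using the payload fields from above:

// Minimal sketch of the pass-through continuation at a boundary with IOR 1.0:
// there is no refraction and no Fresnel reflection, so the continuation ray
// keeps its direction and only the origin moves to the hit point.
thePrd.pos        = theRay.origin + theRay.direction * theIntersectionDistance;
thePrd.wi         = theRay.direction;         // unchanged direction, no refraction
thePrd.f_over_pdf = optix::make_float3(1.0f); // perfect transmission
thePrd.pdf        = 1.0f;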

Actually, if you already have volume scattering implemented as shown in the magenta cube, the atmospheric effect would need to do exactly the same, just with lower density.

For nested scattering volumes you would need to enhance the material stack by adding another float4 array to hold the scattering coefficient (float3) and phase function bias (float). Both the absorption and scattering coefficients together would build the extinction coefficient affecting the path throughput. The radiance would need to be calculated with in-scattering. You should have both already.
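
As a rough sketch of that data layout (illustrative names, not the actual sample code):

// Illustrative sketch of the extended material stack entry: one float4 already
// holds absorption + IOR, the new float4 holds the scattering coefficient and
// the phase function bias g.
struct MaterialStackEntry
{
  optix::float4 absorption_ior;  // .xyz = absorption coefficient sigma_a, .w = IOR
  optix::float4 scattering_bias; // .xyz = scattering coefficient sigma_s, .w = phase function bias g
};

// The extinction coefficient applied to the path throughput is the sum of both:
// sigma_t = sigma_a + sigma_s
__forceinline__ __device__ optix::float3 extinction(const MaterialStackEntry& e)
{
  return optix::make_float3(e.absorption_ior.x + e.scattering_bias.x,
                            e.absorption_ior.y + e.scattering_bias.y,
                            e.absorption_ior.z + e.scattering_bias.z);
}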

Having the camera position inside that atmospheric volume needs to be handled explicitly or the material stack breaks.

Direct lighting of these volume path vertices is a little involved. If the light is outside the volume, the surface color needs to affect the incoming light, which means its material surface needs to be evaluated. Note that nested materials would also mean geometric lights inside a volume.

Thank you very much for your answer.

My wording was inaccurate concerning “passing through”. Sorry.

Yes, that is true, but I found this method in that post and it seemed a bit simpler than what I did. Obviously it needs additional handling, which then ends up being the same handling I already have.

A big difference anyway is that this one uses a single primitive for the entire sphere. In my other implementation such a big sphere would use a lot of triangles in a GeometryTriangles object. So I thought using this intersection program on one primitive would be much faster than using a highly tessellated sphere mesh.
My volume-scattering method also uses additional rtTrace calls with AnyHit programs, but in the blog @joaovbs96 says:
[…]. As dhart mentions in the OptiX forums, ‘AnyHit shaders give you potentially out of order intersections for surfaces that you register with OptiX, which can be useful for visibility queries and some transparency effects and things like that, but they may not be the ideal mechanism for traversing your volume data.’ Further into the discussion, he suggested me to combine the boundary hit and the volume hit in the same intersection program, and that did the trick![…]

UPDATE:
Since I no longer use the albedo buffer for the denoiser, the result improved a lot (see screenshot).