How to draw a texture on an OBJ model with an OptiX SDK example?

Hi all,
I’m an OptiX/CUDA beginner, and I’m not sure whether this question belongs here.
I’m working on a project that has to draw textures on a 3D model with ray tracing, so I’m modifying the SDK example “progressivePhotonMap”.

I read the other texturing examples, such as “swimmingShark” and “cook”, and tried to find clues there. However, each of those examples seems to draw textures in a different way.

So far, I know I have to load the texture in the .cpp file:

GeometryInstance instance = m_context->createGeometryInstance( mesh, &m_material, &m_material+1 );
instance["diffuse_map"]->setTextureSampler( loadTexture( m_context, ... ) );

and create TextureSampler in cuda file

rtTextureSampler<float4, 2>      diffuse_map; // Corresponds to OBJ mtl params

and give the shader a texcoord to sample with, like this:

float3 Kd = make_float3( tex2D( diffuse_map, texcoord.x*diffuse_map_scale, texcoord.y*diffuse_map_scale ) );

However, I cannot find where texcoord gets the texture coordinate data in the .cu file. It seems there should be some code like this in the .cpp file:

GI["texcoord"]->setBuffer(texcoord);

But I can’t find any code like that. Where does the .cu file get the texture coordinate data?

Also, if I want to draw the texture with ray tracing, is it right to modify the “ppm_ppass.cu” file like this?

float3 Kdd = make_float3( tex2D( diffuse_map, texcoord_buffer.x, texcoord_buffer.y ) );
hit_record.energy = Kd * hit_record.energy * Kdd;

Could anyone give me advice on where texcoord gets its texture coordinate data, and on how to match the coordinate data with the texture so the 3D model is rendered correctly with ray tracing? I can’t find a tutorial on Google or in the programming guide, so I’d really appreciate some help or a pointer in the right direction.

Hello,

First off, to see how to create a texture from an image, please take a look at the file ImageLoader.h/.cpp in sutil. It contains a function called loadTexture:

// Creates a TextureSampler object for the given image file. If filename is
// empty or the image loader fails, a 1x1 texture is created with the provided
// default texture color.
SUTILAPI optix::TextureSampler loadTexture( optix::Context context,
                                            const std::string& filename,
                                            const optix::float3& default_color );

Of course, you can just use this function to make things easy for you.

Second issue: how to get texture coordinates to feed into the tex2D call. This is slightly more complicated, and you might want to read over the OptiX Programming Guide to learn about attributes. In brief, attributes are a mechanism for passing data (e.g., texcoords and normals) from the geometry’s intersection program into the closest-hit program. If you look at sutil’s obj_material.cu you will see texcoord declared at the top of the file, and triangle_mesh.cu has a similar declaration: triangle_mesh.cu writes into the attribute and obj_material.cu reads it.

So you need to declare a texcoord attribute in your ppm_rtpass.cu file (or wherever you want to use the texcoord) and then you should be good to go. The geometry used by the progressive photon map already generates the texcoord – you just need to access it.

Note that ppm_rtpass already contains attributes for shading_normal and geometric_normal.

I hope this helps.


Hello,

Does anyone know of a method to map a JPG or PNG image onto a mesh (e.g., one loaded from an OBJ file)?

Thanks!

OK, I’ve learned that texcoord comes from the intersection function, and I can now map a picture onto a plane.
But I have another problem: I want to map two different pictures onto two planes, while using the same closest-hit function from one .cu file.

rtDeclareVariable(float3, texcoord, attribute texcoord, );
rtTextureSampler<float4, 2> matmap;

I declare the rtTextureSampler before the closest-hit function.
The result is that the second picture covers the first one.
I suspect this call:

context["matmap"]->setTextureSampler(sutil::loadTexture(context, mesh.mat_params[i].Kd_map, default_color));

is what causes this. But how can I assign a different matmap to each mesh when they all use one closest-hit function?

OK, I have solved my problem.
There was a mistake in my last reply; sorry about that. I wrote:

context["matmap"]->setTextureSampler(sutil::loadTexture(context, mesh.mat_params[i].Kd_map, default_color));

The right one should be

material["matmap"]->setTextureSampler(sutil::loadTexture(context, mesh.mat_params[i].Kd_map, default_color));

where each mesh gets its own material, created via

optix::Material material = context->createMaterial();

Take a look at triangle_mesh.cu in the OptiX SDK. The texture coordinates are typically computed in the intersection program and propagated to the hit programs via attributes. Attributes are OptiX rtVariables which relay information between intersection and shading:

Relevant bits from triangle_mesh.cu:

rtBuffer<float2> texcoord_buffer;

rtDeclareVariable(float3, texcoord, attribute texcoord, );

template<bool DO_REFINE>
static __device__
void meshIntersect( int primIdx )
{
  ...
  if( intersect_triangle( ray, p0, p1, p2, n, t, beta, gamma ) ) {

    if( rtPotentialIntersection( t ) ) {

      ...
      if( texcoord_buffer.size() == 0 ) {
        texcoord = make_float3( 0.0f, 0.0f, 0.0f );
      } else {
        float2 t0 = texcoord_buffer[ v_idx.x ];
        float2 t1 = texcoord_buffer[ v_idx.y ];
        float2 t2 = texcoord_buffer[ v_idx.z ];
        texcoord = make_float3( t1*beta + t2*gamma + t0*(1.0f-beta-gamma) );
      }
      ...
    }
  }
}