
Question about calculating tangent space for per fragment lighting (normal mapping)

Started June 20, 2018 11:02 PM
1 comment, last by Zorinthrox 6 years, 7 months ago

I usually do my lighting per fragment, but while learning this I realized I'm not clear on one simple concept. I'm following a tutorial that calculates the tangent-space mat3 in the vertex shader and then passes the newly calculated toCamera and toLight vectors, already in tangent space, to the fragment shader.

But right now I'm calculating the toCamera and toLight vectors in the fragment shader, so does that mean I should calculate the toTangentSpace mat3 in the fragment shader as well? I guess I'm not clear on when lighting counts as per-vertex or per-fragment. Is it decided by where I calculate the toLight and toCamera vectors, i.e. whether I do it in the vertex shader?

I'm almost sure I need to do the tangent-space conversion in the fragment shader in my case, but I'm guessing. Hope this makes sense :S

Update:

I ended up doing this (not sure I'm doing it right): first I calculate the tangent-space mat3 in the vertex shader and multiply the camera and fragment positions by it, since those are the values I pass down to the fragment shader:

// Vertex Shader
void main() {
    // Build the world-to-tangent matrix from the per-vertex tangent/bitangent/normal.
    tangentSpace = TangentSpace(model, tangents, bitangents, normals);
    uvs          = uv;
    nrms         = normalMat * normals;

    // Transform the world-space fragment and camera positions into tangent space here,
    // so the fragment shader receives them already converted.
    fragPos      = tangentSpace * vec3(model * vec4(vertex, 1.0));
    camPos       = tangentSpace * camera;

    gl_Position  = mvp * vec4(vertex, 1.0);
}
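
For reference, TangentSpace() is roughly the usual world-to-tangent construction; a simplified sketch of the idea (assuming the model matrix has no non-uniform scale, otherwise the normal matrix should be used for the vectors):

// Builds the matrix that takes a world-space vector into tangent space.
mat3 TangentSpace(mat4 model, vec3 tangent, vec3 bitangent, vec3 normal) {
    vec3 T = normalize(vec3(model * vec4(tangent,   0.0)));
    vec3 B = normalize(vec3(model * vec4(bitangent, 0.0)));
    vec3 N = normalize(vec3(model * vec4(normal,    0.0)));
    // mat3(T, B, N) maps tangent space to world space; for an orthonormal basis
    // the transpose is its inverse, i.e. world space to tangent space.
    return transpose(mat3(T, B, N));
}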

I left the fragment shader the same, except for the following changes:


// Fragment Shader

// camPos and fragPos were already transformed into tangent space in the vertex shader;
// the light direction gets transformed here instead.
vec3 toLight = tangentSpace * normalize(toLightRAW);

// Decode the normal from the normal map (remap [0,1] to [-1,1]); it is already
// expressed in tangent space, so it can be used directly for the lighting.
vec4 normalValue = 2.0 * texture(normalTex, uvs) - 1.0;
vec3 bumpNormal  = normalize(normalValue.rgb);
vec3 normal      = bumpNormal;
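
The lighting function itself is unchanged from my usual per-fragment setup, since everything is in the same (tangent) space at this point. Just as an illustration, a basic Blinn-Phong in tangent space would look something like this (simplified sketch; lightColor, shininess and albedo are placeholders):

// All vectors below are expressed in tangent space.
vec3  toCamera = normalize(camPos - fragPos);
float diff     = max(dot(normal, toLight), 0.0);
vec3  halfway  = normalize(toLight + toCamera);
float spec     = pow(max(dot(normal, halfway), 0.0), shininess);
vec3  color    = (diff + spec) * lightColor * albedo;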

Zorinthrox:

It's been a while since I implemented this, but that should work so long as you are calculating the tangent basis correctly. However, since the tangent basis is being interpolated over the face for each fragment, you should really normalize it.

I think what I did in my thesis project was to pass the U and V unit vectors of the tangent basis to the fragment shader, normalize them, cross them to get the third axis, then construct the basis matrix from those three unit vectors. That saves a normalization and ensures the basis is very close to orthogonal without having to re-orthogonalize the whole thing. You get better results, especially with normal mapping.
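
Roughly like this (untested, just to illustrate; tangentWS and bitangentWS stand for whatever interpolated tangent/bitangent inputs you pass down from the vertex shader):

// Fragment shader: rebuild the basis from the interpolated tangent and bitangent.
vec3 T = normalize(tangentWS);
vec3 B = normalize(bitangentWS);
vec3 N = normalize(cross(T, B));   // third axis; swap the cross order if your handedness is flipped
mat3 worldToTangent = transpose(mat3(T, B, N));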

Also normalize the normal from the normal map; it's easy to think it won't make a difference, but depending on resolution it really can. 24-bit normal maps are pretty good at preserving the direction of the normal but not its unit length (they can be better, but I don't think it's standard practice to do the crazy Crysis 3 optimal normal mapping thing).

This topic is closed to new replies.
