Cube Mipmap Generation is Ridiculously Slow


JoeJ said:
I wonder if such prefiltering is done for reflection probes too, not only for diffuse. I guess no, since we would need multiple cube maps for multiple cone angles. Likely it's better to take multiple samples instead.

There is this tutorial on specular image-based lighting, but I haven't looked at it in depth yet. It's needed especially for metallic materials where the diffuse is 0.
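
For what it's worth, the way such tutorials usually answer the "multiple cone angles" question is to prefilter the cube map once per roughness level and store each result in one mip of the same cube map (the split-sum approach popularized by Karis). Below is a rough sketch of the per-texel convolution; vec3 is assumed to be a float triple with the usual operators, and sampleEnvCube / rand01 are hypothetical helpers for fetching the source cube map and for uniform random numbers in [0,1).

// Prefilter one output texel at a given roughness (split-sum style).
// N is the texel's direction, also used as the view/reflection vector;
// tangentX and tangentY complete an orthonormal basis around N.
vec3 PrefilterTexel (vec3 N, vec3 tangentX, vec3 tangentY, float roughness, int numSamples)
{
    const float PI = 3.14159265f;
    float a = roughness * roughness;   // common remapping of artist-facing roughness
    vec3 color (0, 0, 0);
    float weight = 0;

    for (int i = 0; i < numSamples; ++i)
    {
        // GGX-distributed half-vector around +Z (Karis' formulation)
        float r0 = rand01(), r1 = rand01();
        float phi = 2 * PI * r0;
        float cosT = sqrt((1 - r1) / (1 + (a * a - 1) * r1));
        float sinT = sqrt(1 - cosT * cosT);

        // rotate into the frame around N, then reflect V = N about H
        vec3 H = tangentX * (sinT * cos(phi)) + tangentY * (sinT * sin(phi)) + N * cosT;
        vec3 L = H * (2 * dot(N, H)) - N;

        float NdotL = dot(N, L);
        if (NdotL > 0)
        {
            color = color + sampleEnvCube(L) * NdotL;   // cosine-weighted accumulation
            weight += NdotL;
        }
    }
    return (weight > 0) ? color * (1 / weight) : color;
}

The runtime shader then picks a mip from roughness and combines that fetch with a precomputed BRDF lookup table, instead of taking many samples per pixel.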


Aressera said:
There is this tutorial on specular image-based lighting, but I haven't looked at it in depth yet. It's needed especially for metallic materials where the diffuse is 0.

Hmm, then I guess many games indeed do this. It's surely higher quality than relying just on mip level selection.
There are, however, some issues I have with attempts to do PBS in real time:

The specular probes we have are usually at sparse locations, so the reflections we fetch and blend from the probes have a large error.
The error becomes most noticeable when we see a reflection of bright stuff which should actually be occluded.
Due to Fresnel, what we get is a bright silhouette of wrong reflections around all objects. We wanted accuracy, but what we got looks like rim lighting. It's wrong and ugly. The dominant artifact of the PS4 generation, imo.
Thus it's questionable whether Fresnel is even worth it if our approximation of GI is too bad, which it usually is.
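
For context, the rim comes straight from the grazing-angle behavior of the Fresnel term: with Schlick's approximation, reflectance approaches 1 near silhouettes regardless of F0, so whatever wrong probe data gets fetched there is boosted. A minimal, self-contained sketch (the F0 values in the comments are just typical examples):

#include <cmath>

// Schlick's approximation: F0 is the reflectance at normal incidence,
// cosTheta the cosine between the view direction and the surface normal.
float fresnelSchlick (float cosTheta, float F0)
{
    return F0 + (1.0f - F0) * std::pow(1.0f - cosTheta, 5.0f);
}

// e.g. a dielectric like concrete with F0 ~= 0.04:
//   facing the camera:   fresnelSchlick(1.0f, 0.04f) ~= 0.04
//   near the silhouette: fresnelSchlick(0.1f, 0.04f) ~= 0.61
// so even a dull material reflects roughly 15x more at grazing angles, which is
// exactly where the sparse-probe reflections are most wrong.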

Personally, I would take PBS just as an inspiration and starting point, but I would end up with something which might not look realistic, yet looks good. Likely I would ignore Fresnel completely for rough materials such as concrete, and use it only for metals or materials where reflections are strongly expected.

But there is another problem with PBS, which seems very fundamental: 'roughness' seems to lack a proper definition, so the whole standard feels more like a matter of 'general agreement' than something based on solid math and facts.
That's not meant as critique. I'm fine with the standard as it is, and maybe improvements such as the extended Disney model fix the roughness flaw. But here is why I say it's a flaw:
I was working on a path tracer to generate reference images, and I adopted PBS as explained on the same site you link to.
So I had to do some research on generating a random reflection vector depending on roughness. My research brought me to a paper by Walter, which was cited as having introduced a definition of roughness. In the paper they used it to model glass with variable surface roughness.
If I have translated the paper to code correctly, this is what I've got:

static vec GlossyDir (float roughness) // Walter
{
    float r0 = rand();   // uniform random numbers in [0,1)
    float r1 = rand();

    // cone angle grows with roughness; even roughness = 1 only reaches atan(1) = 45 degrees
    float theta = atan(roughness * sqrt(r0));
    float phi = float(PI*2) * r1;

    // spherical to Cartesian, with +Z as the lobe axis
    float X = cos(phi) * sin(theta);
    float Y = sin(phi) * sin(theta);
    float Z = cos(theta);
    return vec(X, Y, Z);
}

But it's bad. For two reasons:

I would expect a roughness of 1 to sample the whole half space, ideally cosine weighted so we get Lambert diffuse, and a roughness of 0 to give a perfect mirror with sharp reflections.
But no, that's not what we get. A roughness of 1 gives a much smaller cone angle than 180 degrees, so I can't model a diffuse material from roughness alone. Though maybe my expectation is already wrong for some reason I do not yet understand.

Maybe, but the second issue leaves little room for doubt: if I set a high roughness, the function above generates a distribution of rays which MUST be wrong.
Say we model a cone of 100 degrees. What we get is that most rays end up near the surface of the cone, meaning we have many rays at an angle of 100 degrees but only few rays at an angle of 0. It causes visible circular artifacts in the final image. This can't be correct.
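
For comparison, here is a sketch of the GGX sampling formula as given in Walter et al. 2007, assuming that's the paper in question, using the same vec / rand() / PI conventions as the snippet above. Two details may explain part of the mismatch: the sampled vector there is the microfacet normal m (a half-vector), so the outgoing ray still has to be built as r = 2*dot(v,m)*m - v, and the width parameter alpha is not the same thing as the 0..1 roughness of the usual PBS material model.

static vec SampleGGXNormal (float alpha) // sketch after Walter et al. 2007
{
    float r0 = rand();
    float r1 = rand();

    float theta = atan(alpha * sqrt(r0) / sqrt(1.0f - r0));
    float phi = float(PI*2) * r1;

    return vec(cos(phi) * sin(theta),
               sin(phi) * sin(theta),
               cos(theta));
}

Either way, how the paper's width parameter should map to a 0..1 material roughness is not obvious.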

So I modeled my own function, fulfilling all my expectations and fixing all issues, which was easy:

static vec GlossyDir3 (float roughness)
{
    // simple generalization of glossy and diffuse reflection:
    // pick a point on a disk of radius 'roughness' (cosine weighted via the sqrt),
    // then project it up onto the unit hemisphere around +Z
    float r0 = rand();
    float r1 = rand();
    float r = sqrt(r1) * roughness;
    float a = r0 * float(PI*2);
    vec d (cos(a) * r, sin(a) * r, 0);
    d[2] = sqrt(1 - r*r);
    return d;
}

It works as expected, has no artifacts, and can do both perfect diffuse and a perfect mirror, but sadly I'm no longer conforming to the standard. I would need to figure out some remapping of roughness to 'fix' standard PBS material textures so they appear as generally intended.
And I can not claim my function to be 'correct' either. Real-world materials can have all kinds of distributions. I just made it cosine weighted because it makes sense to me, and it matches Lambert.
But at the end of the day, I realize our 'standard PBS model' is a gross simplification. Of course it is, but some of its assumptions seem to be based on arbitrary choices which were not well thought out at all.

Why do I tell all this?

You know, realtime GI is my primary work, so I have spent many years chasing photorealism, and I still do.
And now we even have realtime PT in games, for those who can afford it. So we are very close to that goal.

But was it all worth it? Is this really what we expect from games? To look real?
I'm no longer sure. We have very detailed games now, but Quake with its low-poly environments played much better, because the low-poly geometry with its stable lighting eases our perception, and it's easier to predict what will happen during the next second this way.
In other words: realistic scenes hurt gameplay.
Besides, and that's really interesting: old games which had impressive visuals back then still feel impressive today. I'm playing Quake 2 right now, the extended version without any RT. They have just increased texture and lightmap resolution and added some shadows, but it looks really good, even compared to the latest UE5 games.
So I wonder: is realism the 'best' way to impress visually? Likely not? Maybe it's not that easy, and we need to be more creative rather than just accurate?

Idk. But I'm sure it's not good to accept 'bad visuals' just because the PBRT book says it's 'correct'. And Fresnel is the best example of that, even if we can eventually fix the artifacts now with RT.

@JoeJ If I were to generate a random ray to sample a PBR BRDF in a path tracer, I would first find a mathematical function that can be efficiently importance sampled. It doesn't have to match the BRDF that well, just the overall shape (of the specular lobe). One example is the Phong distribution, which is not too difficult to sample from. You can easily map roughness to a Phong exponent. You also need the ability to calculate the PDF of the importance-sampling distribution for a given sample. With that machinery you can sample an outgoing ray according to Phong, evaluate your BRDF (whatever that may be, e.g. Cook-Torrance), then divide it by the PDF of the Phong sampling. This will converge to the correct result, given enough samples.

$\int BRDF(\vec{x})\, d\Omega \approx \frac{1}{N} \sum_{i=1}^{N} \frac{BRDF(\vec{x}_i)}{p(\vec{x}_i)}$
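
A minimal sketch of that machinery, reusing the vec / rand() / PI conventions from the snippets above. Mapping roughness to the exponent n and rotating the lobe from +Z onto the mirror reflection direction are left out, since those are separate, convention-dependent choices; evalBRDF in the comment is a stand-in for whatever BRDF is actually used.

// Sample a direction from a cos^n ("Phong") lobe around +Z and return the
// solid-angle PDF of that sample, so the BRDF value can be divided by it.
static vec SamplePhongLobe (float n, float& pdf)
{
    float r0 = rand();
    float r1 = rand();

    float cosA = pow(r0, 1.0f / (n + 1.0f));   // cosine of the angle to the lobe axis
    float sinA = sqrt(1.0f - cosA * cosA);
    float phi = float(PI*2) * r1;

    pdf = (n + 1.0f) / float(PI*2) * pow(cosA, n);
    return vec(cos(phi) * sinA, sin(phi) * sinA, cosA);
}

// Per-sample estimator term, averaged over N samples as in the sum above:
//   contribution = evalBRDF(wi, wo) * radiance * cosTheta / pdf;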

Interesting. I sort of did this, just that I was using a simple cosine-weighted distribution for everything, iirc, so my importance sampling was not that efficient. But for reference images I needed only diffuse materials, and I worked on some PBS only for extra fun.

But well, I have not yet arrived at the glorious path tracing future. Because I work on LOD, current HWRT is useless to me, since the shortsighted API designers forgot to give access to the BVH data structures that any proper LOD solution would require. : (

They may fix this some time, but by then everybody will be using small and affordable APUs on PC too, and the dream of PT running on huge $2000 dGPUs will be long forgotten.
(I do believe PT is useful for realtime, but only optionally, to refine some other lighting approach which is actually fast enough.)

