Voxel Ray Tracing and Physically-Based Rendering


Josh


Light is made up of individual particles called photons. A photon is a discrete quantum of electromagnetic energy. Photons are special because they have properties of both a particle and a wave. Although photons have no rest mass, they carry momentum and can interact with physical matter. The phenomenon of radiation pressure, or "solar pressure," is caused by photons bombarding a surface and exerting force. (This force actually has to be accounted for in orbital mechanics.) However, light also has a wavelength and frequency, similar to sound and other wave phenomena.
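For reference, both behaviors fall out of two simple relations: a photon's energy is set by its frequency, and even without rest mass it carries momentum, which is where radiation pressure comes from:

$$E = h\nu, \qquad p = \frac{E}{c} = \frac{h}{\lambda}$$

where $h$ is Planck's constant, $\nu$ the frequency, and $\lambda$ the wavelength.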

Things are made visible when millions of photons scatter around the environment and eventually land in your eyes, interacting with photoreceptor cells on the retina, the light-sensitive surface at the back of the eye. A grid of receptors connects to the optic nerve, which carries the signal to the rear of your skull, where your brain constructs an image from the stimulus, allowing you to see the world around you.

The point of that explanation is to demonstrate that lighting is a scatter problem. Rendering, on the other hand, is a gather problem. We don't care about what happens to every photon emitted from a light source; we only care about the final lighting on the screen pixels we can see. Physically-based rendering is a set of techniques and equations that attempt to model lighting as a gather problem, which is far more efficient for real-time rendering. The technique allows us to model some behaviors of lighting without calculating the path of every photon.
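This gather formulation is captured by the rendering equation: instead of following photons out of the lights, we integrate, at each visible surface point, the light arriving from every direction over the hemisphere, weighted by the surface's reflectance:

$$L_o(\mathbf{x}, \omega_o) = L_e(\mathbf{x}, \omega_o) + \int_{\Omega} f_r(\mathbf{x}, \omega_i, \omega_o)\, L_i(\mathbf{x}, \omega_i)\, (\mathbf{n} \cdot \omega_i)\, d\omega_i$$

Here $f_r$ is the BRDF discussed below, $L_i$ is the incoming light, and $L_o$ is what ends up at the pixel.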

One important behavior we want to model is the phenomenon of Fresnel reflection. If you have ever been driving on a desert highway and seen a mirage on the road in the distance, you have experienced this. The road in the image below is perfectly dry but appears to be submerged in water.

[Image: a mirage on a desert highway]

What's going on here? Remember when I explained that every bit of light you see is shot directly into your eyeballs? At a glancing angle, the light most likely to reach your eyes is light bouncing off the surface from the opposite direction. Since more light is arriving from one single direction, instead of being scattered in from all around, a reflection becomes visible.
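In real-time rendering this angle-dependent reflectance is almost always approximated with Schlick's formula rather than the full Fresnel equations. A minimal sketch in C++ (f0 is the reflectance at normal incidence, roughly 0.04 for common dielectrics):

```cpp
#include <algorithm>
#include <cmath>

// Schlick's approximation to Fresnel reflectance.
// f0       = reflectance looking straight at the surface (~0.04 for dielectrics)
// cosTheta = dot(normal, viewDir), clamped to [0, 1]
float FresnelSchlick(float f0, float cosTheta)
{
    cosTheta = std::clamp(cosTheta, 0.0f, 1.0f);
    return f0 + (1.0f - f0) * std::pow(1.0f - cosTheta, 5.0f);
}
```

As cosTheta approaches zero (a glancing view), the result approaches 1.0, which is exactly the near-mirror highway reflection above.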

PBR models this behavior using a BRDF (bidirectional reflectance distribution function) image. These are red/green images that act as a look-up table, indexed by the angle between the surface normal and the view direction on one axis and the material roughness on the other. They look something like this:

[Image: brdfLUT.png, a red/green BRDF look-up table]
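In the common split-sum approach those red and green channels store a scale and a bias to apply to f0. A hedged sketch of the lookup (SampleLUT is a hypothetical stand-in for a real bilinear texture fetch, not an actual engine API):

```cpp
struct Vec2 { float x, y; };

// Hypothetical stand-in for a bilinear fetch from the 2D BRDF look-up image;
// u = cos(angle between normal and view vector), v = roughness.
Vec2 SampleLUT(float u, float v)
{
    return { 1.0f, 0.0f }; // placeholder values; a real fetch reads the image
}

// Environment BRDF: the red channel scales f0, the green channel is a bias.
float EnvironmentBRDF(float f0, float nDotV, float roughness)
{
    Vec2 ab = SampleLUT(nDotV, roughness);
    return f0 * ab.x + ab.y;
}
```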

You can have different BRDFs for leather, plastic, and all sorts of other materials. These cannot be calculated; they must be measured from real-world materials with a gonioreflectometer, an instrument that records reflectance across many angles. It's actually incredibly hard to find any collection of this data measured from real-world materials. I was only able to find one lab in Germany that was able to create these. There are also some online databases that provide this data as text tables. I have not tried converting any of these into images.

Now with PBR lighting, the surrounding environment plays a much more important role in the lighting calculation than it does with Blinn-Phong shading. PBR is therefore only as good as the environment lighting data you feed it. For simple demos it's fine to use a skybox for this data, but that approach breaks down for anything more complicated than a single model onscreen. In Leadwerks 4 we used environment probes, which create a small "skybox" for separate areas in the scene. These have two drawbacks. First, they are still 2D projections of the surrounding environment and do not provide accurate 3D reflections. Second, they are tedious to set up, so most of the screenshots you see from Leadwerks are not using them.

[Image: environment probes in a Leadwerks 4 scene]

Voxel ray tracing overcomes these problems. The 3D voxel structure provides better reflections with depth and volume, and it's dynamically constructed from the scene geometry around the camera, so there is no need to manually create anything.

[Image: voxelized scene geometry]
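For the curious, the core idea behind reading reflections out of a voxel structure looks something like the sketch below: march along the reflected ray, sampling coarser and coarser mip levels of a 3D texture so the sample footprint matches a widening cone. Every name here is illustrative; this is not the engine's actual code.

```cpp
#include <algorithm>
#include <cmath>

struct Vec3 { float x, y, z; };
struct Vec4 { float r, g, b, a; };

const float voxelSize = 0.25f; // hypothetical voxel cell size in world units

// Hypothetical stand-in for a trilinear fetch from a mipmapped 3D texture of
// the voxelized scene (rgb = radiance, a = occupancy). A real implementation
// would sample the GPU texture here.
Vec4 SampleVoxels(Vec3 /*position*/, float /*mipLevel*/)
{
    return { 0.1f, 0.1f, 0.1f, 0.2f };
}

// March a cone through the voxel volume, accumulating radiance front to back.
// coneRatio is the cone radius per unit distance: near zero for mirror-like
// reflections, larger for rough surfaces.
Vec4 ConeTrace(Vec3 origin, Vec3 dir, float coneRatio, float maxDistance)
{
    Vec4 accum = { 0.0f, 0.0f, 0.0f, 0.0f };
    float distance = voxelSize; // start off the surface to avoid self-hits
    while (distance < maxDistance && accum.a < 1.0f)
    {
        float radius = distance * coneRatio;        // cone widens with distance
        float mip = std::max(std::log2(std::max(radius, voxelSize) / voxelSize), 0.0f);
        Vec3 p = { origin.x + dir.x * distance,
                   origin.y + dir.y * distance,
                   origin.z + dir.z * distance };
        Vec4 s = SampleVoxels(p, mip);
        float w = (1.0f - accum.a) * s.a;           // front-to-back blending
        accum.r += s.r * w;
        accum.g += s.g * w;
        accum.b += s.b * w;
        accum.a += w;
        distance += std::max(radius, voxelSize);    // step scales with footprint
    }
    return accum;
}
```

A narrow cone gives sharp, mirror-like reflections; widening it with material roughness gives the blurrier reflections of dull surfaces.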

I finally got the voxel reflection data integrated into the PBR equation so that the BRDF is being used for the reflectance. In the screenshot below you can see the column facing the camera appears dull.

[Image: the column facing the camera appears dull]

When we view the same surface at a glancing angle, it becomes much more reflective, almost mirror-like, as it would in real life:

[Image: the same surface at a glancing angle becomes mirror-like]
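In code terms, plugging the traced reflection into the PBR equation amounts to scaling the cone-trace result by the angle-dependent reflectance, so the same surface is dull head-on and mirror-like at grazing angles. A simplified, self-contained sketch (Schlick's approximation stands in for the full BRDF look-up here):

```cpp
#include <algorithm>
#include <cmath>

struct Color { float r, g, b; };

// Hypothetical final combine: diffuse GI (from a wide cone) plus a specular
// reflection (from a narrow cone) weighted by angle-dependent reflectance.
Color CombineLighting(Color diffuseGI, Color specularTrace, float f0, float nDotV)
{
    nDotV = std::clamp(nDotV, 0.0f, 1.0f);
    float f = f0 + (1.0f - f0) * std::pow(1.0f - nDotV, 5.0f); // Schlick
    return { diffuseGI.r + specularTrace.r * f,
             diffuseGI.g + specularTrace.g * f,
             diffuseGI.b + specularTrace.b * f };
}
```

Splitting diffuse and specular into separate cones this way is a common arrangement in voxel GI renderers.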

You can observe this effect on any building with a smooth concrete exterior. Also note that the scene above has no ambient light; the shaded areas would be pure black if it weren't for the global illumination effect. These details will give your environments a lifelike look in our new engine.


Comments

Lethal Raptor Games: I'm interested in getting this to work on a small outdoor scene. Are voxels automatically calculated when we load a model? And does it only work with point lights at the moment?


Josh: At the moment, it will only work in one small area around the origin (0,0,0) and only on entities that have had the MakeStatic() method called, and not on terrain.
