Epic’s Unreal Engine 5

I have been reading all weekend about Epic Games’ Unreal Engine 5 after watching a real-time 3D rendering of a game demo. What struck me most are the technologies they will be offering: they bring a cinematic quality to whatever 3D universe you are creating and allow a truly immersive experience, not only in gaming but in any virtualized medium you might be considering.

The first technology that caught my attention, and got me wowed, is Nanite: a virtualized micro-polygon geometry system that achieves incredible geometric detail by rendering billions of triangles in real time. There has always been a great divide between film-grade assets and game assets. Asset conversion was painful and slow, and the results weren’t all that realistic. Optimization passes were needed to de-res film assets for use in a real-time context, relying on low-resolution meshes, normal maps, and workarounds to emulate the massive geometric complexity.

Nanite virtualized geometry essentially allows film-quality source assets with literally hundreds of billions of polygons to be imported directly into Unreal Engine 5. Assets generated in other programs, such as CAD applications, can all be imported as well. Because Nanite geometry is streamed and scaled in real time, there is no longer any need for polygon count budgets, polygon memory budgets, or draw count budgets. It also removes the need to manually author Levels of Detail (LODs). Since Nanite virtualizes the whole process, the results are simply mind-blowing.
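To make that concrete, here is a minimal editor-side sketch of opting an imported asset into Nanite. It assumes UE5’s UStaticMesh exposes an FMeshNaniteSettings member named NaniteSettings, as it does in early UE5 builds; the function name is my own, and the Build() call signature varies between engine versions, so treat this as a sketch rather than gospel.

```cpp
#include "Engine/StaticMesh.h"

// Editor-side sketch: flag an imported static mesh as Nanite-enabled.
// Assumes UStaticMesh carries an FMeshNaniteSettings member called
// NaniteSettings (as in early UE5 releases); verify against your version.
void EnableNaniteOnMesh(UStaticMesh* Mesh)
{
    if (!Mesh)
    {
        return;
    }

    // Opt the asset into Nanite rendering.
    Mesh->NaniteSettings.bEnabled = true;

    Mesh->Build();            // editor-only rebuild so the Nanite data is generated
    Mesh->MarkPackageDirty(); // flag the asset as needing to be saved
}
```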

The second technology that impressed me is Lumen, a fully dynamic, multi-bounce global illumination solution that reacts to scene and light changes in real time. The system renders diffuse inter-reflection with infinite bounces and indirect specular reflections in huge, detailed environments, at scales ranging from kilometers to millimeters. Designers and developers can now create more dynamic scenes using Lumen: for example, changing the sun angle for the time of day, turning on a flashlight, or blowing a hole in the ceiling, and the indirect lighting will adapt accordingly. Lumen erases the need to wait for lightmap bakes to finish and to author lightmap UVs.
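As a small illustration of that dynamism, here is a sketch of driving the sun angle from game code; with Lumen active, the bounced lighting simply follows along each frame. ADirectionalLight and SetActorRotation are standard UE5 APIs, while the time-of-day mapping and the function name are my own illustrative choices.

```cpp
#include "Engine/DirectionalLight.h"

// Runtime sketch: sweep the sun through the sky by time of day. With Lumen
// enabled, multi-bounce indirect lighting updates dynamically; there is no
// lightmap to rebake when the light moves.
void UpdateSunAngle(ADirectionalLight* Sun, float TimeOfDayHours)
{
    if (!Sun)
    {
        return;
    }

    // Illustrative mapping: sunrise at 6h (pitch 0, horizontal), noon at 12h
    // (pitch -90, directly overhead), sunset at 18h (pitch -180).
    const float Pitch = -((TimeOfDayHours - 6.0f) / 12.0f) * 180.0f;
    Sun->SetActorRotation(FRotator(Pitch, 0.0f, 0.0f));
}
```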

The third technology, which completes the formula, is convolution reverb, an audio technology that sound designers can use to capture the impulse responses (IRs) of any physical location (such as caves, churches, offices, or hallways) and process audio in real time as though it were being played at that location. In effect, the IRs capture the reverberation of real locations, allowing a new level of realism and immersion in environmental audio processing.
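Mathematically, the effect is just the discrete convolution of the dry signal with the IR: y[n] = Σ_k x[k]·h[n−k]. The naive C++ sketch below shows that math only; real-time engines use FFT-based partitioned convolution for performance, and the function name here is mine.

```cpp
#include <cstddef>
#include <vector>

// Naive direct-form convolution of a dry signal with an impulse response:
// y[n] = sum over k of x[k] * h[n - k]. This O(N*M) loop illustrates the
// math; production engines use FFT-based partitioned convolution instead.
std::vector<float> ConvolveWithIR(const std::vector<float>& Dry,
                                  const std::vector<float>& IR)
{
    if (Dry.empty() || IR.empty())
    {
        return {};
    }

    std::vector<float> Wet(Dry.size() + IR.size() - 1, 0.0f);
    for (std::size_t n = 0; n < Dry.size(); ++n)
    {
        for (std::size_t k = 0; k < IR.size(); ++k)
        {
            // Each input sample excites the entire reverb tail described by the IR.
            Wet[n + k] += Dry[n] * IR[k];
        }
    }
    return Wet;
}
```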

In addition to capturing real-life reverberation, IRs can be derived from any other sound source: algorithmic reverb output, foley, VO (such as huffs or breaths), contact mics, and so on. IRs can be edited and processed like any other sound; they can be filtered, attenuated, reversed, chopped up, edited, and more. In this way, the Convolution Reverb feature introduces a whole new dimension of experimentation.

Convolution reverb provides a data-driven alternative to more traditional algorithmic reverb techniques that simulate reverberation using a combination of delay buffers, filters, and various other DSP topologies.
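For contrast, here is the classic delay-buffer building block of such an algorithmic reverb: a single feedback comb filter in the Schroeder style. Real designs chain several combs plus all-pass stages; this stripped-down stage, with a name of my own choosing, is only meant to show the topology.

```cpp
#include <cstddef>
#include <vector>

// One feedback comb filter, the basic delay-buffer stage of a classic
// Schroeder-style algorithmic reverb: y[n] = x[n] + g * y[n - D].
// Keep |Feedback| < 1 so the recirculating echo decays and stays stable.
std::vector<float> FeedbackComb(const std::vector<float>& In,
                                std::size_t DelaySamples, float Feedback)
{
    std::vector<float> Out(In.size(), 0.0f);
    for (std::size_t n = 0; n < In.size(); ++n)
    {
        const float Delayed = (n >= DelaySamples) ? Out[n - DelaySamples] : 0.0f;
        Out[n] = In[n] + Feedback * Delayed; // recirculating echo builds the tail
    }
    return Out;
}
```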