As a graphics programmer, the allure of dynamic global illumination (GI) and its potential to bring unparalleled realism to games has always captivated me. When NVIDIA unveiled their Dynamic Diffuse Global Illumination (DDGI) technology, I recognized an opportunity to augment the Hazel Engine with this state-of-the-art feature.
Before embarking on the integration journey, I dedicated a significant amount of time to comprehending the underlying principles of DDGI, though in practice it's always a mix of wrapping my head around the concepts ahead of the actual implementation and a whole lot of gotchas during the actual work. So, NVIDIA's DDGI is a real-time ray-traced global illumination solution that approximates indirect diffuse lighting. It harnesses the power of contemporary GPUs to calculate lighting conditions dynamically: each frame, a limited number of rays are shot from probes into their surroundings, and the results are accumulated temporally, all cached in these irradiance probes. Thinking about this, the first thing that comes to mind is "How's that even dynamic?" The thing is, diffuse GI can be considered a low-frequency signal, meaning it doesn't need pixel-to-pixel detail like specular reflections do. So instead, DDGI allows some accumulation over frames and lets you fine-tune how fast the lighting can change within a volume. The DDGI documentation provided by NVIDIA was an invaluable resource during this learning process, offering a comprehensive guide to understanding and integrating DDGI.
The first step was to prepare the Hazel Engine for DDGI. This meant adjusting the engine to support the Vulkan ray tracing features, then integrating the DDGI shaders into Hazel's rendering pipeline. Getting the DDGI calculations to play well with Hazel's existing lighting model wasn't easy; it required a deep understanding of both Hazel's rendering pipeline and the DDGI shaders.
After integrating DDGI, the difference was clear: scenes rendered in Hazel gained a new depth and realism thanks to the dynamic global illumination. However, the initial implementation was quite performance-intensive, so I spent time profiling and optimizing the integration, particularly around the ray-tracing pipeline. DDGI itself is already highly optimized, so the goal was to make sure my implementation didn't slow it down.
In the future, I plan to implement light baking for hardware without accelerated ray tracing capabilities. Light baking is a technique where you pre-calculate the lighting in a scene and store it in textures, which can then be used during rendering. This can be a good alternative for hardware that doesn't support real-time ray tracing.
Integrating NVIDIA's DDGI into the Hazel Engine was a challenging but rewarding experience. It not only improved the visual quality of the engine but also gave me a better understanding of modern real-time rendering techniques, and it's so much faster than path tracing. One important lesson from this experience is the importance of planning ahead. As the saying goes, “Failing to plan is planning to fail.” This was particularly true in this case, where a good understanding of the technique and a well-planned strategy were crucial for a successful and feasible integration.
Here are some images that demonstrate the results of my RTXGI implementation.
Click on an image to view it at a larger size.