Fidelity-first Ray Tracing in the Heterogeneous Compute Era

About the Author
Jim (James L.) Jeffers
@jamesljeffers

Sr. Director, Sr. Principal Engineer, Intel Advanced Rendering & Visualization
Board member: Academy Software Foundation, Open 3D Foundation


Recent years have shown that users of visually focused applications, from professionals to consumers, simply prefer higher image fidelity that matches their real-world visual experience. The relatively rapid transition in television quality from SD to HD and 4K is one example. For the past 10+ years, the graphics method of choice for real-world-matching fidelity, especially in visually compelling animated films and VFX, has been ray tracing. That is roughly as long as our Intel® Embree library, which won an Academy Technical Achievement Award this year, has been used in filmmaking.

Note that “ray tracing” is not a singular method that always delivers full realism. It is a continuum of techniques mixing the physics of light, geometry, and other mathematics, ranging from simple hard shadows to complex effects such as subsurface scattering for skin translucency. Naturally, the higher the fidelity, the more computationally intensive the rendering becomes, but optimizations and AI techniques help reduce the time to image.
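
To make the simple end of that continuum concrete, here is a minimal, self-contained sketch (illustrative only, not code from any Intel library; the Vec3, Sphere, and helper names are invented for this example) of the classic hard-shadow test: cast a shadow ray from a shaded point toward the light and report shadow if any occluder blocks it.

```cpp
// Illustrative sketch only: the simplest point on the ray tracing continuum --
// a hard-shadow test. A shadow ray is cast from a surface point toward a point
// light; if any occluder intersects that segment, the point is in shadow.
#include <cmath>
#include <vector>

struct Vec3 { float x, y, z; };
static Vec3  sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

struct Sphere { Vec3 center; float radius; };

// True if the ray (origin, normalized dir) hits the sphere at a distance in (epsilon, maxDist).
static bool hits(const Sphere& s, Vec3 origin, Vec3 dir, float maxDist) {
    Vec3  oc   = sub(origin, s.center);
    float b    = dot(oc, dir);
    float c    = dot(oc, oc) - s.radius * s.radius;
    float disc = b * b - c;
    if (disc < 0.0f) return false;
    float t = -b - std::sqrt(disc);
    return t > 1e-4f && t < maxDist;   // small epsilon avoids self-shadowing artifacts
}

// A point is in hard shadow if anything blocks the segment between it and the light.
static bool inShadow(Vec3 point, Vec3 lightPos, const std::vector<Sphere>& occluders) {
    Vec3  toLight = sub(lightPos, point);
    float dist    = std::sqrt(dot(toLight, toLight));
    Vec3  dir     = {toLight.x / dist, toLight.y / dist, toLight.z / dist};
    for (const Sphere& s : occluders)
        if (hits(s, point, dir, dist)) return true;
    return false;
}
```

Higher-fidelity effects such as soft shadows, global illumination, and subsurface scattering replace this single binary ray with many stochastically sampled rays per pixel, which is where the computational cost grows and where optimizations and AI-based denoising pay off.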

We are now reaching a point where commodity computing platforms, from laptops to workstations to servers, have the compute and AI features to deliver the leap in visual fidelity that ray tracing methods offer. Moving forward, expect a surge in usages that rely on ray tracing, delivering a leap in visual impact that touches everyone more often. Diverse types of applications and products across a range of industry segments already benefit from high-fidelity ray tracing. Creators are challenged to recreate real life in 3D, from alien faces to explosions, using physically based models to ensure photorealism. Scientists working to solve some of the world’s most difficult problems want high-fidelity visualization to understand complex phenomena in large data sets. Product engineers and architects want cost-effective 3D design tools. If it’s visual and realism improves the results, ray tracing will be there!

As ray tracing continues to evolve, we are delivering on what customers ask for and driving innovation in our products. For example, the visualization of the Covid-19 model first created by the University of Chicago in collaboration with others1 used AI-based denoising components built into the Intel® oneAPI Rendering Toolkit to help researchers understand the data sets. And in the world of professional 3D modeling, Maxon’s well-known Cinema 4D product uses the same AI denoising library to achieve film-quality VFX fidelity in a fraction of the rendering time.
 

Fidelity-first Approach

High-fidelity, physically based interactive 3D computing continues to address exploding model sizes and complexity, and our AI-based Intel® Open Image Denoise (part of the Intel oneAPI Rendering Toolkit) delivers the most accurate photorealistic images available. By producing clean images from low sample counts, it dramatically reduces typical Monte Carlo path-traced final-frame render times. This industry-leading image fidelity has led to integration into many of the world’s most popular creator, gaming, scientific, and product design tools, such as Blender, Chaos Group’s Corona and V-Ray, Cinema 4D, Kitware ParaView, Unity, and Unreal Engine, with the most recent being Autodesk’s Arnold renderer, which is widely used in film.
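
As an illustration of how lightweight that integration can be, the sketch below uses the Intel Open Image Denoise C++ API in its 1.x form; the buffer pointers, dimensions, and the denoiseFrame wrapper are application-specific placeholders, not part of the library.

```cpp
#include <OpenImageDenoise/oidn.hpp>
#include <iostream>

// Denoise one HDR frame produced by a Monte Carlo path tracer.
// colorPtr/albedoPtr/normalPtr/outputPtr are placeholder float RGB buffers owned by the app.
void denoiseFrame(float* colorPtr, float* albedoPtr, float* normalPtr,
                  float* outputPtr, size_t width, size_t height)
{
    oidn::DeviceRef device = oidn::newDevice();        // default device
    device.commit();

    oidn::FilterRef filter = device.newFilter("RT");   // generic ray tracing denoising filter
    filter.setImage("color",  colorPtr,  oidn::Format::Float3, width, height);
    filter.setImage("albedo", albedoPtr, oidn::Format::Float3, width, height); // optional aux
    filter.setImage("normal", normalPtr, oidn::Format::Float3, width, height); // optional aux
    filter.setImage("output", outputPtr, oidn::Format::Float3, width, height);
    filter.set("hdr", true);                            // the color image is high dynamic range
    filter.commit();
    filter.execute();

    const char* errorMessage;
    if (device.getError(errorMessage) != oidn::Error::None)
        std::cerr << "OIDN error: " << errorMessage << std::endl;
}
```

The comparisons below additionally enable prefiltering of the auxiliary albedo and normal buffers, an option in the 1.4 release that runs extra filter passes on those buffers before the main denoise; the library documentation describes the exact setup.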

In our own quality comparisons between Intel Open Image Denoise and Nvidia OptiX on the popular industry models below, we concluded that in all these cases AI-driven denoised image fidelity, measured against a “ground truth” photoreal image using the standard SSIM metric, was consistently better with Intel Open Image Denoise.
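
For context, SSIM (Structural Similarity Index) scores a denoised image against the reference over local windows, with 1.0 meaning a perfect match. The standard formulation is

\[
\mathrm{SSIM}(x, y) = \frac{(2\mu_x \mu_y + C_1)\,(2\sigma_{xy} + C_2)}{(\mu_x^2 + \mu_y^2 + C_1)\,(\sigma_x^2 + \sigma_y^2 + C_2)}
\]

where \(\mu_x, \mu_y\) are the local means, \(\sigma_x^2, \sigma_y^2\) the local variances, \(\sigma_{xy}\) the covariance between the two images, and \(C_1, C_2\) small constants that stabilize the division; the per-window scores are averaged to give the numbers reported below.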

In all the following examples, Intel Open Image Denoise delivered superior image quality on well-known, readily available 3D models. Testing was conducted using Intel Open Image Denoise 1.4.1 with prefiltering and the latest publicly available Nvidia OptiX 7.3 with Nvidia driver version 470.57.02 (see notes 2 and 3 for image credits and full configuration details).

Academy Interior

Publicly available model, image courtesy of Chaos Group.

Figure 1: Academy Interior 4 samples per pixel (spp) with no denoising.

 


Figure 2: Academy Interior 4spp denoised using Intel Open Image Denoise, scoring 0.950 SSIM against “ground truth.” Note the clarity of the statue outlines and the crisp, dark shadows on the couch compared to Figure 3.

Figure 3: Academy Interior 4spp denoised using Nvidia OptiX. In the close-up, the statue outline is blurry and the shadows on the couch are not as distinct as in Figure 2.

Cabins Exterior 
Publicly available model, image courtesy of Evermotion “15th Anniversary Collection.”

Figure 4: Cabins Exterior 16spp with no denoising.

Figure 5: Cabins Exterior 16spp denoised using Intel Open Image Denoise, scoring 0.890 SSIM against “ground truth.” Note how the roof tile lines and the grain of the wood are more distinct than in Figure 6.

Figure 6: Cabins Exterior 16spp denoised using Nvidia OptiX.

Where the highest fidelity, the ability to run on virtually any computing platform sold in the last 10 years, or both are paramount, Intel Open Image Denoise is proving to be one of the best solutions available today. Some may say, “Wait, I can get ‘close’ with OptiX on a recent GPU, and it delivers the image much faster than Intel Open Image Denoise.” While in some cases today that is true, when Intel launches its upcoming Xe GPUs you will have even more options for getting Intel Open Image Denoise’s highest AI-based fidelity at interactive and real-time rates.

Next-Gen Sustainable Development Is Open, Heterogeneous/Cross-Architecture, and High Fidelity

The industry has made its preference for open, flexible, cross-architecture approaches clear. Our customers have demanded that workflow integration and code optimization for each subsequent generation of more performant hardware become sustainable and boost productivity instead of becoming a burden on cost and time. As such, artists, developers, and researchers alike have embraced the oneAPI industry initiative, Intel’s oneAPI Rendering Toolkit, and its fidelity-leading, AI-enhanced Intel Open Image Denoise software.

Intel’s CPU + XPU approach, one that embraces all system compute resources, is pushing new boundaries for a wide array of “No transistor left behind” platform applications. Intel’s upcoming Xe family of GPUs, in conjunction with Intel® Xeon® processor and Intel® Optane™ Persistent Memory-based platforms, will best address tomorrow’s most demanding visual and computational workloads. This open and flexible approach will help enable Intel to keep delivering some of the highest-fidelity results without creating undue burdens that result from being trapped in walled gardens. 
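
For a sense of what that cross-architecture approach looks like in code, here is a minimal, hedged DPC++/SYCL sketch, generic and not tied to any particular Intel rendering library, in which the same kernel runs unchanged on whichever CPU or GPU the runtime selects (older compiler versions may expect the <CL/sycl.hpp> header instead).

```cpp
#include <sycl/sycl.hpp>
#include <iostream>
#include <vector>

int main() {
    // One source, many targets: the default selector picks a CPU or GPU at run time,
    // and the same kernel runs on either without code changes.
    sycl::queue q;
    std::cout << "Running on: "
              << q.get_device().get_info<sycl::info::device::name>() << "\n";

    std::vector<float> data(1024, 1.0f);
    {
        sycl::buffer<float, 1> buf(data.data(), sycl::range<1>(data.size()));
        q.submit([&](sycl::handler& h) {
            auto acc = buf.get_access<sycl::access::mode::read_write>(h);
            h.parallel_for(sycl::range<1>(data.size()),
                           [=](sycl::id<1> i) { acc[i] *= 2.0f; });
        });
    } // buffer destruction synchronizes and copies results back to 'data'

    std::cout << "data[0] = " << data[0] << "\n";  // expect 2
    return 0;
}
```

That one-source, many-targets property is the sustainability described above: new hardware generations become new deployment targets rather than new codebases.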

The path ahead for compute-intensive visual applications clearly lies in a combination of high-fidelity ray- and path-traced graphics, open development and standards, and efficient heterogeneous computing as the dominant platform theme going forward. This democratization benefits not only the film industry but also spans many segments, from gaming to 3D product and architectural design to high-performance computing applications that aim to solve some of society’s most pressing problems.

 

Read:

Ray Tracing: One Size Does Not Fit All

TACC Frontera supercomputer optimizes scientific visualization & memory (Render Kit/Optane)

Intel Embree Wins Academy Scientific & Technical Award

Watch:

Intel Activities at SIGGRAPH 2021

Ray Tracing Innovation using Intel® oneAPI Rendering Toolkit

Tangent Studios’ Jeff Bell Shares How Intel Helps Accelerate Rendering

Chaos Group’s Phil Miller Shares How Intel® oneAPI Tools Boost V-Ray & Corona

Bentley’s Paul Chapman Shares How Intel Accelerates its Car Configurator

High Fidelity Rendering Unleashed with Intel

Intel® oneAPI Rendering Toolkit

Download the full Intel® oneAPI Rendering Toolkit package for free, including the Academy Award-winning Intel® Embree, Intel® Open Image Denoise, Intel® OpenSWR, Intel® Open Volume Kernel Library, Intel® OSPRay, Intel® OSPRay Studio, Intel® OSPRay for Hydra, and the Rendering Toolkit Utilities.

Get It Now

See All Tools

Get just the individual component: Intel® Open Image Denoise

Learn more about oneAPI, download more Intel® oneAPI tools, or use them in the Intel® DevCloud.

NOTICES & DISCLAIMERS

1Citation: "A Multiscale Coarse-Grained Model of the SARS-CoV-2 Virion," Yu et al., Biophysical Journal, Jan 5, 2021, https://doi.org/10.1016/j.bpj.2020.10.048. Authors at the University of Chicago: Alvin Yu, Alexander Pak, Peng He, and Viviana Monje-Galvan. Co-authors at the University of California San Diego: Lorenzo Casalino, Zied Gaieb, Abigail Dommer, and Rommie Amaro. Computational resources were provided by the Research Computing Center at the University of Chicago, Frontera at the Texas Advanced Computing Center, and the Pittsburgh Supercomputing Center. oneAPI podcast with Teodora (Dora) Szasz: SciVis Unveils a Billion Cells, Covid-19 & Invisible Monsters.

2Image credits and procurement:

Academy Interior: Scene provided by Chaos Czech a.s., www.corona-renderer.com. Corona rendered results shared for the Intel testing summary.

Cabins Exterior: Scene by Evermotion “15th Anniversary Collection.” Evermotion 3D model available free from CG River: www.cgriver.com/products/15th-anniversary-collection-evermotion-3d-models-architectural-visualizations

Dark Interior: Dark Interior Scene (Classical) by Entity Designer. Blender Market, $26.57 for model and royalty-free license: blendermarket.com/products/dark-interior-scene

Food: Scene by Charles Nandeya Ehouman (Sharlybg). Model download from the LuxCoreRender/LuxCoreTestScenes repository on GitHub.

Junk Shop: The Junk Shop by Alex Treviño. Original concept by Anaïs Maamar. Free Blender scene download: cloud.blender.org/p/gallery/5dd6d7044441651fa3decb56

 

3Testing date: Fidelity test results are based on testing by Intel as of July 2021 and may not reflect all publicly available security updates. Configuration details and workload setup: The Academy Interior scene is a publicly available model provided by Chaos Czech a.s. (www.corona-renderer.com). The Cabins Exterior scene is a publicly available model from the Evermotion “15th Anniversary Collection.” Testing measured quality, not time performance, so the quality measurements relate to software only. Scenes were denoised and optimized using Intel Open Image Denoise 1.4.1 with prefiltering and the latest publicly available Nvidia OptiX 7.3 with Nvidia driver version 470.57.02 on Ubuntu 20.04. Image quality metric: Structural Similarity Index (SSIM); higher is better; measured on the tone-mapped sRGB versions of the images. Hardware used was an Intel® Core™ i9-10980XE and an Nvidia GeForce RTX 3090.

Performance varies by use, configuration, and other factors. Learn more at www.Intel.com/PerformanceIndex. 
Performance results are based on testing as of dates shown in configurations and may not reflect all publicly available updates. 
No product or component can be absolutely secure. 
Your costs and results may vary. 
Intel technologies may require enabled hardware, software or service activation.
Intel does not control or audit third-party data. You should consult other sources to evaluate accuracy. 
© Intel Corporation.  Intel, the Intel logo, and other Intel marks are trademarks of Intel Corporation or its subsidiaries.  Other names and brands may be claimed as the property of others.