
Get Started with Neural Rendering Using NVIDIA RTX Kit


Neural rendering is the next era of computer graphics.  By integrating neural networks into the rendering process, we can take dramatic leaps forward in performance, image quality, and interactivity to deliver new levels of immersion. 

NVIDIA RTX Kit is a suite of neural rendering technologies for ray tracing games with AI, rendering scenes with immense geometry, and creating game characters with photorealistic visuals.

RTX Kit is now available on the /NVIDIA-RTX/RTXkit GitHub repo and brings together several new and familiar RTX and AI components. Though each component can be used individually, RTX Kit provides a unified location to access these essential rendering SDKs.

| Name | Description | Status |
| --- | --- | --- |
| RTX Neural Shaders (new) | Train and deploy neural networks within shaders to unlock new compression and approximation techniques for next-generation asset generation. | Available now |
| RTX Neural Materials (new) | Use AI to compress shader code of complex multi-layered materials for up to 5X faster material processing to bring real-time performance to film-quality assets. | Get notified when RTX Neural Materials is available |
| RTX Neural Texture Compression (new) | Use AI to compress textures with up to 8x VRAM improvement at similar visual fidelity to traditional block compression at runtime. | Available now |
| RTX Texture Filtering (new) | Randomly samples textures after shading and filters difficult volumes, reducing artifacts and improving image quality. | Available now |
| RTX Mega Geometry (new) | Accelerate BVH building for cluster-based geometry systems like Nanite for real-time path tracing of millions of triangles. | SDK available now; get notified when it is available in the NVIDIA RTX Branch of Unreal Engine 5 (NvRTX) |
| RTX Character Rendering (new) | Set of tools to create path-traced hair and skin. | Available now |
| DLSS 4 (new) | Multi Frame Generation has been introduced for GeForce RTX 50 Series GPUs, allowing for generation of up to three frames per rendered frame. | Available now through the NVIDIA Streamline SDK and Unreal Engine 5 plugin |
| Reflex 2 (new) | Reflex technologies optimize the graphics pipeline for ultimate responsiveness, providing faster target acquisition, quicker reaction times, and improved aim precision in competitive games. | Reflex Low Latency available now; get notified for Reflex Frame Warp availability |
| RTX Dynamic Illumination | Library of importance sampling algorithms that sample the most important lights in a scene and render them with physical accuracy. Provides implementations of three techniques: ReSTIR DI, GI, and PT. | Available now; ReSTIR PT coming soon to the SDK and the NVIDIA RTX Branch of Unreal Engine 5 (NvRTX) |
| RTX Global Illumination | Scalable solution to compute multi-bounce indirect lighting. Provides implementations of two techniques: Neural Radiance Cache (NRC) and Spatially Hashed Radiance Cache (SHARC). | Available now (NRC is experimental) |
| RTX Path Tracing | Reference solution for optimized real-time path tracing utilizing several RTX technologies. | Available now |
| NVIDIA Real-Time Denoisers | Library of denoisers designed to work with low ray-per-pixel signals. | Available now |
| NVIDIA Opacity Micro-Map | Efficiently map intricate geometries onto triangles and encode their opacity for better ray tracing performance. | SDK available now; coming soon to the NVIDIA RTX Branch of Unreal Engine 5 (NvRTX) |
| RTX Memory Utility | Compaction and suballocation of acceleration structures to reduce memory consumption. | Available now |
| SpatioTemporal Blue Noise | Utility containing pregenerated blue noise textures and sample code for generating new blue noise textures. | Available now |
| Shader Execution Reordering | Performance optimization that unlocks the potential for better execution and memory coherence in ray tracing shaders. | API available now for NVIDIA RTX 40 Series and later |
Table 1. NVIDIA RTX Kit technologies

In this tutorial, we focus on new SDKs available today through NVIDIA RTX Kit:

  • RTX Neural Shaders
  • RTX Neural Texture Compression
  • RTX Texture Filtering
  • RTX Mega Geometry
  • RTX Character Rendering

While DirectX support is not available yet for neural shading applications, Vulkan support is available today. For more information about the DLSS integration process, see How to Integrate NVIDIA DLSS 4 into Your Game with NVIDIA Streamline.

RTX Neural Shaders

A rendered image  shows a lantern on steps with a cloth underneath.
Figure 1. Example of a scene from Zorah, built in Unreal Engine 5, rendered using RTX Neural Shaders

NVIDIA RTX Neural Shaders bring small neural networks into programmable shaders. This technology framework enables the training and deployment of neural networks directly within shaders, enabling you to compress game data and shader code and approximate film-quality materials, volumes, geometry, and more in real time. 
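
To make the idea concrete, the following is a minimal CPU-side C++ sketch of the kind of small multilayer perceptron (MLP) a neural shader evaluates per sample. The layer sizes, activation, and weight layout are illustrative assumptions, not the RTXNS API; in the SDK, the equivalent forward pass runs inside a Slang shader built from the framework's own primitives.

```cpp
// Conceptual sketch of the tiny MLP a neural shader might evaluate per sample.
// Layer sizes, activation, and weight layout are illustrative assumptions;
// see the RTXNS samples for the actual shader-side building blocks.
#include <algorithm>
#include <array>
#include <cstddef>

constexpr std::size_t kInputs  = 4;   // e.g., encoded UV plus view-direction terms
constexpr std::size_t kHidden  = 16;  // small enough to keep weights close to the shader
constexpr std::size_t kOutputs = 3;   // e.g., an RGB value

struct TinyMlp {
    std::array<float, kHidden * kInputs>  w0;  // hidden-layer weights
    std::array<float, kHidden>            b0;  // hidden-layer biases
    std::array<float, kOutputs * kHidden> w1;  // output-layer weights
    std::array<float, kOutputs>           b1;  // output-layer biases

    // Forward pass: two fully connected layers with a ReLU in between.
    std::array<float, kOutputs> infer(const std::array<float, kInputs>& x) const {
        std::array<float, kHidden> h{};
        for (std::size_t i = 0; i < kHidden; ++i) {
            float acc = b0[i];
            for (std::size_t j = 0; j < kInputs; ++j)
                acc += w0[i * kInputs + j] * x[j];
            h[i] = std::max(acc, 0.0f);  // ReLU
        }
        std::array<float, kOutputs> y{};
        for (std::size_t i = 0; i < kOutputs; ++i) {
            float acc = b1[i];
            for (std::size_t j = 0; j < kHidden; ++j)
                acc += w1[i * kHidden + j] * h[j];
            y[i] = acc;
        }
        return y;
    }
};
```

Because the network is this small, its weights can stay resident close to the shader, which is what makes per-sample inference affordable in real time.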

Here’s how to get started.

Check prerequisites: Verify that your system meets the following requirements (Table 2).

| GPU Architecture | Driver | CMake | Vulkan | Visual Studio | Windows SDK | Slang |
| --- | --- | --- | --- | --- | --- | --- |
| Turing and newer | 572.16 | 3.24.3 | 1.3.296 | 2022+ | 10.0.22621.0 | v2025.3.3 |
Table 2. NVIDIA RTX Neural Shaders requirements

Clone the /NVIDIA-RTX/RTXNS repo: Install Git and Git LFS if you haven’t already, and read through the information on the repo.

Build the samples: Follow the instructions located in the repo to build the solution using Visual Studio. 

Run the samples: There are three samples included with the SDK. Each one demonstrates how to use the SDK for various tasks:

  • Simple Inferencing:  Demonstrates how to implement an inference shader using some of the low-level building blocks from the SDK.
  • Simple Training:  Provides an introduction to training a neural network for use in a shader. 
  • Shader Training:  A more complex sample that shows how to train an MLP model on the Disney BRDF shader.
A screenshot shows a sphere with a neural network approximating a BRDF shader.
Figure 2. Simple Inferencing sample
Three images of the NVIDIA logo in different resolutions, visualizing the process of training a neural network on the logo.
Figure 3. Simple Training sample
An image of three spheres, visualizing training a neural network on a Disney BRDF shader.
Figure 4. Shader Training sample

Create a neural shader: Follow the How to Write Your First Neural Shader tutorial.  Use the sample code and the Library Usage Start Guide as reference.
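
If you want to see the shape of the training side before working through the tutorial, the sketch below fits a tiny one-hidden-layer network to a scalar function with plain stochastic gradient descent on the CPU. It only illustrates the loop structure (forward pass, loss, backward pass, update); the RTXNS samples perform the analogous optimization on the GPU against shader functions such as the Disney BRDF, using the SDK's own training utilities rather than anything shown here.

```cpp
// Conceptual sketch: train a tiny 1-hidden-layer network to approximate a
// scalar function with plain SGD. Only the loop structure mirrors what the
// RTXNS shader-training samples do on the GPU.
#include <array>
#include <cmath>
#include <cstdio>
#include <random>

int main() {
    constexpr int H = 16;            // hidden width
    constexpr float lr = 0.01f;      // learning rate
    auto target = [](float x) { return std::sin(3.0f * x); };  // function to approximate

    std::mt19937 rng(42);
    std::uniform_real_distribution<float> uni(-1.0f, 1.0f);

    std::array<float, H> w0, b0, w1;
    float b1 = 0.0f;
    for (int i = 0; i < H; ++i) { w0[i] = uni(rng); b0[i] = uni(rng); w1[i] = uni(rng); }

    for (int step = 0; step < 20000; ++step) {
        float x = uni(rng);
        float t = target(x);

        // Forward pass: ReLU hidden layer, linear output.
        std::array<float, H> pre, h;
        float y = b1;
        for (int i = 0; i < H; ++i) {
            pre[i] = w0[i] * x + b0[i];
            h[i]   = pre[i] > 0.0f ? pre[i] : 0.0f;
            y     += w1[i] * h[i];
        }

        // Backward pass for the squared error (y - t)^2, then SGD update.
        float dy = 2.0f * (y - t);
        for (int i = 0; i < H; ++i) {
            float dh   = dy * w1[i];                   // uses pre-update w1
            float dpre = pre[i] > 0.0f ? dh : 0.0f;    // ReLU gradient
            w1[i] -= lr * dy * h[i];
            w0[i] -= lr * dpre * x;
            b0[i] -= lr * dpre;
        }
        b1 -= lr * dy;

        if (step % 5000 == 0)
            std::printf("step %d: sample loss %.5f\n", step, (y - t) * (y - t));
    }
    return 0;
}
```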

In addition to the RTX Neural Shaders, NVIDIA provides two applications of neural shading through the following:

  • Neural Texture Compression: Implements compression and decompression of textures using neural networks, and enables transcoding into other compressed texture formats. 
  • Neural Materials: Implements compression of material data using neural networks. To be notified when this SDK is available, see the NVIDIA RTX Kit Notify Me Form.

RTX Neural Texture Compression

GIF compares block compression memory usage with Neural Texture Compression.
Figure 5. Scene from Zorah, built in Unreal Engine 5, showing Neural Texture Compression in use

One application of RTX Neural Shaders available today is RTX Neural Texture Compression. This SDK uses AI to compress textures more efficiently, reducing texture memory consumption by up to 8x compared to traditional block compression.
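
As a rough back-of-the-envelope illustration of what that can mean for a single material, the snippet below compares an assumed block-compressed footprint against the best-case 8x reduction. The texture count and the 1 byte-per-texel block-compression rate are assumptions for the sake of arithmetic, not measured numbers.

```cpp
// Back-of-the-envelope VRAM estimate for a 4K PBR texture set, assuming
// BC7-style block compression at 1 byte per texel and ignoring mipmaps.
// The 8x factor is the quoted upper bound for NTC, not a measured result.
#include <cstdio>

int main() {
    const double texels    = 4096.0 * 4096.0;                 // one 4K texture
    const double textures  = 5.0;                             // e.g., albedo, normal, roughness, metallic, AO
    const double bcBytes   = texels * 1.0 * textures;         // ~1 byte/texel for BC7
    const double ntcBytes  = bcBytes / 8.0;                   // best-case 8x reduction

    std::printf("Block compressed:               %.1f MB\n", bcBytes  / (1024.0 * 1024.0));
    std::printf("Neural compressed (best case):  %.1f MB\n", ntcBytes / (1024.0 * 1024.0));
    return 0;
}
```

The achievable ratio for real content depends on the texture set and the chosen quality settings, so treat the output as an upper bound rather than a prediction.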

Check prerequisites: Verify that your system meets the following requirements (Table 3):

| GPU Architecture | Driver | CMake | Vulkan | Windows SDK |
| --- | --- | --- | --- | --- |
| Turing and newer | 570+ | 3.28 | 1.3 | 10.0.22621.0 |
Table 3. RTX Neural Texture Compression requirements

Clone the /NVIDIA-RTX/RTXNTC repo: Install Git and Git LFS if you haven’t already, and read through the information on the repo.

Build the samples: Follow the instructions located in the repo to build the solution using Visual Studio. 

Run the samples: There are three applications included. Together, these tools show how NTC works and how to use it:

  • NTC Command-Line Tool (ntc-cli): Provides a tool for compression and decompression of material texture sets.
  • NTC Explorer: Enables interactive experimentation with neural texture compression, works as a viewer for NTC files.
  • NTC Renderer: Demonstrates how to render a GLTF model with NTC materials using Inference on Load or Inference on Sample.
A screenshot shows the user interface for the NTC Renderer application, with an image of a pilot helmet and mask on a stand.
Figure 6. NTC Renderer
A screenshot shows the user interface for the NTC Explorer application and an image of  cobblestones with moss.
Figure 7. NTC Explorer 

Integrate the SDK: To add NTC to your application, follow the Integration Guide.

Verify reduced memory utilization: Use NVIDIA Nsight Systems to check memory usage and compare it to traditional textures.

RTX Texture Filtering 

Five images of grass under different texture filters: HW, STF Bilinear, STF Bicubic, STF Gaussian sigma = 0.07, and STF Gaussian sigma = 2.
Figure 8. Filter comparisons under magnification

RTX Texture Filtering improves the quality and efficiency of texture filtering, especially for AI-compressed textures. It does this through stochastic sampling of texture filters, which enables the practical and efficient implementation of filtering after shading. RTX Texture Filtering is designed for easy integration into your shader libraries.
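
The core trick is easiest to see in code. The sketch below implements stochastic bilinear filtering on the CPU: instead of blending four texels, it picks one texel with probability equal to its bilinear weight, so the expected value matches ordinary bilinear filtering while only one texel is ever fetched and decoded. This is a conceptual illustration only, not the RTX Texture Filtering shader library itself.

```cpp
// Minimal CPU sketch of stochastic bilinear filtering. One texel of the 2x2
// footprint is selected with probability equal to its bilinear weight, so the
// expectation equals the ordinary bilinear result.
#include <algorithm>
#include <cmath>
#include <random>
#include <vector>

struct Texel { float r, g, b; };

// Clamp-to-edge texel fetch from a linear array.
static Texel fetchTexel(const std::vector<Texel>& tex, int w, int h, int x, int y) {
    x = std::clamp(x, 0, w - 1);
    y = std::clamp(y, 0, h - 1);
    return tex[y * w + x];
}

Texel stochasticBilinear(const std::vector<Texel>& tex, int w, int h,
                         float u, float v, std::mt19937& rng) {
    std::uniform_real_distribution<float> uni(0.0f, 1.0f);

    // Convert normalized UVs to texel space and find the 2x2 footprint.
    float tx = u * w - 0.5f;
    float ty = v * h - 0.5f;
    int   x0 = static_cast<int>(std::floor(tx));
    int   y0 = static_cast<int>(std::floor(ty));
    float fx = tx - x0;   // fractional offsets double as bilinear weights
    float fy = ty - y0;

    // Pick one texel of the footprint with probability equal to its weight.
    int x = (uni(rng) < fx) ? x0 + 1 : x0;
    int y = (uni(rng) < fy) ? y0 + 1 : y0;
    return fetchTexel(tex, w, h, x, y);
}
```

Filtering after shading in this way means an expensive-to-decode texel, such as a neural-compressed one, is decoded once per sample, and the residual noise is averaged away by temporal accumulation or DLSS.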

Check prerequisites: Verify that your system meets the following requirements (Table 4):

| GPU Architecture | Driver | CMake | Visual Studio | Windows SDK |
| --- | --- | --- | --- | --- |
| DirectX Raytracing 1.1 API-supported GPU and newer | 555.85+ | 3.24.3+ | 2022+ | 10.0.20348.0+ |
Table 4. RTX Texture Filtering requirements

Clone the /NVIDIA-RTX/RTXTF repo: Install Git and Git LFS if you haven’t already, and read through the information on the repo.

Observe the differences in filtering techniques: The GitHub repo provides many examples of filter comparisons to help you better understand the benefit that RTX Texture Filtering brings over other solutions.

Run the sample application: The repo includes an application that can be used to visualize the effects of RTX Texture Filtering.

A screenshot shows the user interface for the Texture Filtering sample, with a robot on a tiled path and objects casting shadows.
Figure 9. Texture Filtering sample

Integrate the RTX Texture Filtering shader library into your shader framework: Follow the instructions in the Integration Guide to add it to your application.

RTX Mega Geometry

An image is split between a dragon and the corresponding polygons.
Figure 10. RTX Mega Geometry comparison 

As hardware becomes more powerful, the amount of geometric detail in real-time computer graphics is increasing rapidly. This growth is occurring in two areas: 

  • Higher instance counts (more objects in a scene)
  • Greater triangle density (more detailed individual objects)

RTX Mega Geometry addresses these challenges by accelerating bounding volume hierarchy (BVH) build speed for cluster-based systems like Nanite and intelligently compressing and caching clusters of triangles over many frames. This enables the streaming of various levels of detail and extreme triangle density all while path tracing.
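
The caching part of that idea can be sketched independently of any particular graphics API. The snippet below keeps per-cluster acceleration-structure data alive across frames so that only newly streamed clusters, or clusters changing level of detail, pay a build cost. All names and types here are hypothetical placeholders and are not the RTX Mega Geometry API.

```cpp
// Conceptual sketch of cross-frame caching for cluster-based BVH builds.
// Names and types are illustrative only; the real work happens on the GPU
// through the RTXMG SDK.
#include <cstddef>
#include <cstdint>
#include <unordered_map>

struct ClusterKey {
    uint64_t clusterId;
    uint32_t lodLevel;
    bool operator==(const ClusterKey& o) const {
        return clusterId == o.clusterId && lodLevel == o.lodLevel;
    }
};

struct ClusterKeyHash {
    std::size_t operator()(const ClusterKey& k) const {
        return std::hash<uint64_t>{}(k.clusterId) ^ (std::hash<uint32_t>{}(k.lodLevel) << 1);
    }
};

struct ClusterAccel { uint64_t gpuAddress; uint32_t lastUsedFrame; };

class ClusterCache {
public:
    // Return a cached acceleration structure, or build one on a miss.
    ClusterAccel& acquire(const ClusterKey& key, uint32_t frame) {
        auto it = cache_.find(key);
        if (it == cache_.end()) {
            ClusterAccel built{/*gpuAddress*/ 0, frame};  // placeholder for the GPU-side build
            it = cache_.emplace(key, built).first;
        }
        it->second.lastUsedFrame = frame;
        return it->second;
    }

    // Evict clusters that have not been referenced for a while.
    void evictOlderThan(uint32_t frame, uint32_t maxAge) {
        for (auto it = cache_.begin(); it != cache_.end();) {
            if (frame - it->second.lastUsedFrame > maxAge) it = cache_.erase(it);
            else ++it;
        }
    }

private:
    std::unordered_map<ClusterKey, ClusterAccel, ClusterKeyHash> cache_;
};
```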

RTX Mega Geometry is available today through an SDK and will be available in the NVIDIA RTX Branch of Unreal Engine 5 (NvRTX). 

Check prerequisites: Verify that your system meets the following requirements (Table 5):

| GPU Architecture | Driver | CMake | Windows SDK | Visual Studio |
| --- | --- | --- | --- | --- |
| Turing and newer | 570+ | 3.28+ | 10.0.20348 | 2019+ |
Table 5. NVIDIA RTX Mega Geometry requirements

Clone the /NVIDIA-RTX/RTXMG repo: Install Git and Git LFS if you haven’t already, and read through the information on the repo.

Build the sample: Follow the instructions located in the repo to build the solution using Visual Studio. 

Run the sample: Observe how Mega Geometry enables high image quality with low memory usage. 

A screenshot shows the user interface for the RTX Mega Geometry sample, with a scene from a game.
Figure 11. RTX Mega Geometry sample

Observe the effect of Mega Geometry: Use the built-in profiler tool to view the impact of Mega Geometry in a variety of preset scenes. 

Review the sample code and integrate the API: To add Mega Geometry to your application, take a look at the sample application and refer to the API calls used. 

RTX Character Rendering 

An image of a person rendered using RTX Character Rendering.
Figure 12. Sample using RTX Character Rendering

RTX Character Rendering consists of four algorithms:

  • Subsurface Scattering (SSS): Renders skin with accurate lighting and translucency, enabling path-traced skin for enhanced realism. The SDK implements a combined SSS solution that extends the state-of-the-art Burley diffusion profile with a single scattering term (see the sketch after this list).
  • Linear Swept Spheres (LSS): Adds the NVIDIA Blackwell-accelerated sphere and curve primitive for strand-based, path-traced hair, which brings added depth and volume. This algorithm is only compatible with NVIDIA RTX 50 Series GPUs.
  • Enhanced analytical Bidirectional Scattering Distribution Function (BSDF): Provides shading for strand-based hair.
  • Disjoint Orthogonal Triangle Strips (DOTS): Provides high-quality strand-based hair for all GPUs.
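
For reference, the Burley (normalized diffusion) profile that the combined SSS solution builds on can be written in a few lines. The snippet below evaluates only that textbook profile; it omits the single-scattering term and any RTXCR-specific details.

```cpp
// Burley normalized diffusion profile approximating diffuse subsurface
// scattering: R(r) = (e^{-r/d} + e^{-r/(3d)}) / (8*pi*d*r), where r is the
// distance from the entry point and d controls how far light spreads.
// The profile integrates to 1 over the plane, so albedo is applied separately.
#include <cmath>

float burleyDiffusionProfile(float r, float d) {
    constexpr float kPi = 3.14159265358979f;
    r = std::fmax(r, 1e-6f);  // guard the 1/r singularity at the entry point
    return (std::exp(-r / d) + std::exp(-r / (3.0f * d))) / (8.0f * kPi * d * r);
}
```

In a path tracer, profiles like this are typically importance sampled rather than evaluated on a grid, which is part of what keeps path-traced skin affordable.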

Check prerequisites: Verify that your system meets the following requirements (Table 6):

| GPU Architecture | Driver | CMake | Vulkan | Windows SDK |
| --- | --- | --- | --- | --- |
| Volta and newer (Blackwell required for LSS) | 570+ | 3.24.3 | 1.3.268+ | 10.0.20348+ |
Table 6. NVIDIA RTX Character Rendering requirements

Clone the /NVIDIA-RTX/RTXCR repo: Install Git and Git LFS if you haven’t already, and read through the information on the repo.

Build the sample: Follow the instructions located in the repo to build the solution using Visual Studio. 

Run the sample: Use this opportunity to try out the different path tracing techniques available within the SDK and see how it fits within the context of a game environment. 

A screenshot shows the user interface for the RTX Character Rendering sample, with a noisy image of a person standing in front of an ocean sunset.
Figure 13. RTX Character Rendering sample 

Integrate the SDK: To add the RTX Character Rendering SDK to your application, follow the Integration Guide.

Summary

Get started developing with NVIDIA RTX Kit today. To see the latest in neural graphics technologies, be sure to check out NVIDIA at GDC.
