3. Rendering Pipeline
Visionary adopts a hybrid rendering pipeline. Because Gaussian Splatting uses a custom compute-shader-based rasterization technique, it cannot be drawn directly with Three.js's standard `renderer.render`.
Visionary therefore introduces the `GaussianThreeJSRenderer` class as a rendering coordinator, specifically responsible for managing depth blending and occlusion relationships between standard 3D objects and Gaussian particles.
3.1 Renderer Initialization
Before using the rendering pipeline, ensure the WebGPU environment is correctly bridged with the Three.js context.
📎 Related Modules:
`initThreeContext` belongs to the `WebGPUContext` inside the 01-app module; the renderer coordination layer is documented in the 12-three-integration module.
```typescript
import { initThreeContext } from 'src/app/three-context';

// Initialize the renderer, including WebGPU adapter configuration.
// Note: internally loads a dummy ONNX model to activate ORT's WebGPU backend.
const renderer = await initThreeContext(canvasElement);

// Default configuration:
// - antialias: true
// - powerPreference: 'high-performance'
// - pixelRatio: Math.min(window.devicePixelRatio, 2)
```
3.2 The Mixed Loop
To ensure standard meshes correctly occlude splats, the APIs must be called strictly in the following order.
📎 Related Modules: This sequence has a complete command flow and diagrams in the Auto Depth section of 12-three-integration.
Loop Logic Detail
```typescript
function animate() {
  const currentTime = (Date.now() - startTime) / 1000.0;

  // 1. Update 4D dynamic models (Update Pass):
  //    Calculates particle deformation at the current time point.
  //    This is an async operation (contains await), but is usually
  //    non-blocking in the loop. Internally handles the camera
  //    view/projection matrix conversion.
  gaussianRenderer.updateDynamicModels(camera, currentTime);

  // 2. Render background & standard meshes (Scene Pass):
  //    a. Renders the standard Three.js scene to sceneDepthRT (color and depth).
  //    b. Captures the depth buffer for use by the Gaussian rasterizer.
  //    c. Blits (copies) the render result to the screen canvas.
  gaussianRenderer.renderThreeScene(camera);

  // 3. Draw Gaussian particles (Splatting Pass):
  //    a. Reads the depth buffer generated in step 2 so meshes occlude splats.
  //    b. Executes Gaussian rasterization, overlaying onto the screen canvas.
  gaussianRenderer.drawSplats(renderer, scene, camera);

  requestAnimationFrame(animate);
}
```
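The required ordering can be made concrete with a small sketch. The classes below are stand-ins, not Visionary's real API: they only record call order to show why the update must precede the scene pass, which must precede the splat pass. The real `updateDynamicModels` is async; it is kept synchronous here for brevity.

```typescript
// Stand-in that records the order of the three per-frame calls.
type Camera = { name: string };

class LoopOrderStub {
  calls: string[] = [];

  // Real method is async; synchronous here for illustration only.
  updateDynamicModels(_camera: Camera, _time: number): void {
    this.calls.push('update'); // 1. deform 4D particles first
  }

  renderThreeScene(_camera: Camera): void {
    this.calls.push('scenePass'); // 2. write mesh color + depth
  }

  drawSplats(_camera: Camera): boolean {
    this.calls.push('splatPass'); // 3. depth-tested splatting last
    return true;
  }
}

function frame(r: LoopOrderStub, camera: Camera, t: number): string[] {
  r.updateDynamicModels(camera, t);
  r.renderThreeScene(camera);
  r.drawSplats(camera);
  return r.calls;
}
```

Swapping steps 2 and 3 would have the splat pass read a stale or empty depth buffer, which is exactly the "see-through" failure described in Section 3.4.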
3.3 Core Class: GaussianThreeJSRenderer
This class manages all logic for interacting with the WebGPU rasterizer.
📎 Related Modules: See the 12-three-integration module API reference for details.
Constructor
```typescript
import { GaussianThreeJSRenderer } from 'src/app/GaussianThreeJSRenderer';
import { GaussianModel } from 'src/app/GaussianModel';

// Signature: (renderer: THREE.WebGPURenderer, scene: THREE.Scene, gaussianModels: GaussianModel[])
const gaussianRenderer = new GaussianThreeJSRenderer(renderer, scene, gaussianModels);
```
Key Methods
| Method Name | Parameters | Description |
|---|---|---|
| `updateDynamicModels` | `(camera: Camera, time?: number)` | Updates vertex states for all dynamic (ONNX) models. |
| `renderThreeScene` | `(camera: Camera)` | Replaces the standard `renderer.render`. Enables `autoDepthMode` to capture full scene depth. |
| `drawSplats` | `(renderer, scene, camera)` | Executes the final rasterization pass. Returns a boolean indicating success. |
| `appendGaussianModel` | `(model: GaussianModel)` | Dynamically adds a newly loaded model to the pipeline. |
| `removeModelById` | `(modelId: string)` | Removes a model and cleans up its resources. |
3.4 Auto Depth Mode
Visionary enables Auto Depth Mode by default.
📎 Related Modules: Depth capture, the Blit pass, and diagnostic tooling are fully described in 12-three-integration/architecture.
- **Principle:** `renderThreeScene` uses `THREE.HalfFloatType` (16-bit float) to create a RenderTarget. When rendering standard meshes, depth information is written to `sceneDepthTexture`.
- **Interaction:** When `drawSplats` starts the WebGPU RenderPass, it loads this depth texture as the `depthStencilAttachment` (`loadOp: 'load'`).
- **Effect:** Gaussian particles undergo depth testing during rasterization, allowing them to be correctly occluded by standard meshes (e.g., a person walking behind a wall generated by a Gaussian model).

Note: If you use the native `renderer.render` instead of `renderThreeScene`, Gaussian models will not perceive meshes in the scene, resulting in "see-through" artifacts.
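The interaction step can be pictured as a WebGPU-style depth attachment descriptor. This is an illustrative sketch of the pattern, not Visionary's actual internals; `makeSplatDepthAttachment` and its exact fields are assumptions based on the description above.

```typescript
// Sketch of a depth-stencil attachment for the splatting pass.
// loadOp 'load' preserves the mesh depth written during the Scene Pass,
// so Gaussian particles are depth-tested against real scene geometry.
function makeSplatDepthAttachment(sceneDepthView: unknown) {
  return {
    view: sceneDepthView,           // GPUTextureView of sceneDepthTexture
    depthLoadOp: 'load' as const,   // reuse mesh depth; 'clear' would break occlusion
    depthStoreOp: 'store' as const, // keep depth available after the pass
  };
}
```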
3.5 Gaussian Model Object (Recap)
`GaussianModel` inherits from `THREE.Object3D`, so you can manipulate it like a normal Mesh.
📎 Related Modules: Additional information on model sync, animation, and AABB handling is available in the 16-models module.
Transform Sync (Auto-Sync)
Visionary implements a transform interception mechanism. When you modify position, rotation, or scale, data is automatically synced to the GPU.
```typescript
import { GaussianModel } from 'src/app/GaussianModel';

const model = gaussianModels[0];

// The following operations automatically trigger syncTransformToGPU;
// no manual update call is needed.
model.position.set(10, 0, 0);
model.rotation.y += 0.1;
model.scale.setScalar(2.0);
```
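Conceptually, the interception works like Three.js's internal onChange callbacks: every mutation of a transform component invokes a hook that uploads the new transform. The sketch below is a simplified stand-in; `ObservedVector3` and the sync counter are illustrative, not Visionary's actual code.

```typescript
// A vector that invokes a callback on every mutation, the way
// Three.js internals use onChange to mark matrices dirty.
class ObservedVector3 {
  private _x: number;
  private _y: number;
  private _z: number;
  private onChange: () => void;

  constructor(x = 0, y = 0, z = 0, onChange: () => void = () => {}) {
    this._x = x; this._y = y; this._z = z;
    this.onChange = onChange;
  }

  set(x: number, y: number, z: number): this {
    this._x = x; this._y = y; this._z = z;
    this.onChange(); // every mutation automatically triggers the sync hook
    return this;
  }

  get x(): number { return this._x; }
}

// Hypothetical stand-in for GaussianModel.syncTransformToGPU.
let gpuSyncs = 0;
const position = new ObservedVector3(0, 0, 0, () => { gpuSyncs += 1; });

position.set(10, 0, 0); // auto-sync fires; no manual update call needed
```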
Common Control APIs
```typescript
import { GaussianModel } from 'src/app/GaussianModel';

// Adjust particle size (changes splat radius only, not model scale)
model.setGaussianScale(1.5);

// Adjust transparency
model.setOpacityScale(0.8);

// Culling threshold (performance optimization)
model.setCutoffScale(0.5);

// Animation control
model.startAnimation(1.0);   // play at 1.0x speed
model.setAnimationTime(2.5); // jump to 2.5 seconds
```