RadarSplat and Automotive Applications
Gaussian Splatting Meets Automotive Radar
Automotive radar presents unique challenges for neural scene representations: the sensor outputs sparse, noisy 3D point clouds (not images), operates with FMCW waveforms that impose specific range-Doppler processing, and must be fused with camera and LiDAR for reliable perception. RadarSplat (Niedermayr et al., 2024) and GSpaRC (Dong et al., 2024) adapt 3DGS to this automotive setting, enabling radar point cloud synthesis, augmentation for autonomous driving, and fast channel reconstruction.
Definition: RadarSplat
RadarSplat represents an automotive scene as a set of 3D Gaussians where each primitive carries radar-specific attributes:

$$\mathcal{G}_i = \{\boldsymbol{\mu}_i, \boldsymbol{\Sigma}_i, \alpha_i, \sigma_i, v_{r,i}, \mathbf{f}_i\}$$

where:
- $\sigma_i$ is the radar cross-section (RCS),
- $v_{r,i}$ is the radial velocity (Doppler),
- $\mathbf{f}_i$ encodes additional features (e.g., material class).
The rendering process models the FMCW radar pipeline: each Gaussian generates a response in the range-Doppler-angle domain, and the rendered "image" is a synthetic radar point cloud.
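As a concrete (hypothetical) data layout, the attribute set above can be held in a small container; the class and field names below are illustrative, not RadarSplat's actual API:

```python
from dataclasses import dataclass, field
import numpy as np

@dataclass
class RadarGaussian:
    """One radar-augmented Gaussian primitive (illustrative field names)."""
    mu: np.ndarray        # 3D centre position (m)
    Sigma: np.ndarray     # 3x3 covariance (shape and scale of the Gaussian)
    alpha: float          # opacity in [0, 1]
    rcs: float            # radar cross-section sigma (m^2)
    v_r: float            # radial (Doppler) velocity (m/s)
    features: np.ndarray = field(default_factory=lambda: np.zeros(4))  # e.g. material logits

g = RadarGaussian(mu=np.array([12.0, 3.0, 0.5]),
                  Sigma=0.25 * np.eye(3),
                  alpha=0.9, rcs=1.5, v_r=-4.2)
```

A scene is then simply a list of such primitives, which the FMCW-aware renderer iterates over.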
Definition: FMCW-Aware Gaussian Rendering
In RadarSplat, the rendering equation is adapted for FMCW radar by incorporating the range-Doppler response of each Gaussian. For a Gaussian at position $\boldsymbol{\mu}_i$, the beat-signal contribution at the radar receiver at time $t$ is:

$$s_i(t) = \sqrt{\sigma_i}\,\alpha_i \exp\!\left(j2\pi\left(\frac{2BR_i}{cT_c}\,t + \frac{2R_i}{\lambda}\right)\right)$$

where $R_i = \|\boldsymbol{\mu}_i - \mathbf{p}_{\mathrm{rx}}\|$ is the range, $T_c$ is the chirp duration, and $B$ is the sweep bandwidth. After range-Doppler FFT processing, the Gaussian appears as a peak in the range-Doppler map at $(R_i, v_{r,i})$ with amplitude proportional to $\sqrt{\sigma_i}\,\alpha_i$.

The total radar response is the coherent sum over all Gaussians:

$$s(t) = \sum_i s_i(t).$$
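The FMCW rendering chain above can be sketched numerically: each Gaussian contributes a complex beat tone, the contributions are summed coherently, and a range FFT turns each into a peak at its range. The parameter values (150 MHz sweep, 77 GHz carrier, two Gaussians at 30 m and 55 m) are illustrative assumptions:

```python
import numpy as np

c = 3e8          # speed of light (m/s)
B = 150e6        # sweep bandwidth (Hz) -> range resolution c/(2B) = 1 m
Tc = 1e-4        # chirp duration (s)
Ns = 1024        # samples per chirp
t = np.arange(Ns) / Ns * Tc

def beat_signal(R, rcs, alpha):
    """Complex beat-tone contribution of one Gaussian at range R (simplified model)."""
    fb = 2 * B * R / (c * Tc)               # beat frequency
    phase = 4 * np.pi * R / (c / 77e9)      # two-way carrier phase, assuming 77 GHz
    return np.sqrt(rcs) * alpha * np.exp(1j * (2 * np.pi * fb * t + phase))

# coherent sum over two Gaussians, then a range FFT
s = beat_signal(30.0, 2.0, 0.9) + beat_signal(55.0, 1.0, 0.8)
spectrum = np.abs(np.fft.fft(s))[: Ns // 2]
freqs = np.fft.fftfreq(Ns, d=Tc / Ns)[: Ns // 2]
range_axis = freqs * c * Tc / (2 * B)       # 1 m per bin with these parameters
print(range_axis[np.argmax(spectrum)])      # strongest peak near 30 m
```

The stronger Gaussian dominates the spectrum; both appear as peaks at their respective range bins.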
Theorem: Gaussian Size and Radar Resolution
For an FMCW radar with bandwidth $B$ and coherent processing interval $T$, the minimum Gaussian scale that can be resolved is:

$$s_{\min} = \max\left(\frac{c}{2B},\; \frac{2R}{N}\right)$$

where the first term is the range resolution and the second is the cross-range extent of the angular resolution for an array of $N$ elements at range $R$. Gaussians smaller than $s_{\min}$ are indistinguishable from point scatterers and should be parameterised as isotropic ($\boldsymbol{\Sigma}_i = s_{\min}^2\mathbf{I}$).
The radar cannot "see" structure finer than its resolution cells. Making Gaussians smaller than the resolution cell adds parameters without adding information, leading to overfitting. This is the RF analog of the optical resolution limit that constrains Gaussian size in optical 3DGS.
Range resolution
The FMCW range resolution is $\Delta R = c/(2B)$ (Chapter 9). A Gaussian with radial extent $s_r < \Delta R$ produces a range response indistinguishable from a point scatterer at its centre $\boldsymbol{\mu}_i$.
Angular resolution
The angular resolution of an $N$-element ULA with half-wavelength spacing is $\theta_{\mathrm{res}} \approx 2/N$ radians (Chapter 7). At range $R$, this corresponds to a cross-range resolution of $R\,\theta_{\mathrm{res}} = 2R/N$. The Gaussian cross-range extent $s_\perp$ must satisfy $s_\perp \geq 2R/N$ to avoid sub-resolution artefacts.
Combined constraint
Taking the maximum of both constraints gives the stated bound.
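A minimal sketch of the combined bound, assuming the $c/(2B)$ range resolution and $2R/N$ cross-range resolution used in the proof:

```python
import numpy as np

c = 3e8  # speed of light (m/s)

def min_gaussian_scale(B, N, R):
    """s_min = max(range resolution c/(2B), cross-range resolution 2R/N)
    for an N-element half-wavelength ULA observing a point at range R."""
    delta_range = c / (2 * B)
    delta_cross = 2 * R / N
    return max(delta_range, delta_cross)

# 1 GHz bandwidth, 8-element array, Gaussian at 40 m:
print(min_gaussian_scale(1e9, 8, 40.0))   # cross-range dominates: 10.0 m
```

Note how quickly the cross-range term dominates with range: with a small array, distant Gaussians should be metres wide, not centimetres.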
Definition: GSpaRC --- Gaussian Splatting with Physics-Augmented Rendering for Compact Scenes
GSpaRC (Dong et al., 2024) extends RadarSplat by incorporating physics-based propagation models into the rendering pipeline:
- Free-space path loss: Each Gaussian's contribution is attenuated by $1/R_i^4$, the two-way radar-equation path loss.
- Multi-bounce rendering: Gaussians can act as secondary sources, modelling double-bounce scattering common in urban environments (e.g., ground-wall dihedral reflections).
- Compact representation: A sparsity-promoting loss encourages removing unnecessary Gaussians, yielding substantially fewer primitives than vanilla RadarSplat.
The physics augmentation improves generalisation to novel viewpoints and reduces the required training data by encoding known propagation laws rather than learning them from data.
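The free-space term can be sketched as a per-Gaussian amplitude weight; the $1/R^4$ law follows the radar equation, while the reference-range normalisation is an assumption of this sketch:

```python
import numpy as np

def path_loss_weight(R, R_ref=1.0):
    """Two-way radar-equation attenuation ~ 1/R^4, normalised to 1 at R_ref.
    Applied multiplicatively to each Gaussian's rendered power."""
    return (R_ref / np.asarray(R)) ** 4

ranges = np.array([10.0, 20.0, 40.0])
print(path_loss_weight(ranges))   # each doubling of range costs ~12 dB
```

Because this law is hard-coded rather than learned, the network does not have to rediscover it from data, which is one source of GSpaRC's improved generalisation.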
Example: Synthetic Radar Point Cloud for Training Data Augmentation
An autonomous driving team has 10 hours of camera + LiDAR data but only 2 hours of co-registered radar data (due to a sensor failure). They need balanced multi-modal training data. Describe how RadarSplat generates synthetic radar point clouds for the missing 8 hours.
Scene reconstruction
For each scene segment (a few hundred metres of driving), fit a RadarSplat model using the 2 hours of co-registered data. The Gaussians learn the radar reflectivity of the environment (buildings, cars, road infrastructure).
Novel viewpoint rendering
For the 8 hours of camera+LiDAR-only data, use the ego-vehicle trajectory (from SLAM or GPS/INS) as the "camera" for RadarSplat. Render synthetic range-Doppler maps and extract point clouds via CFAR detection (Chapter 9).
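The point-cloud extraction step can be illustrated with a minimal 1D cell-averaging CFAR over a range profile (a sketch of the idea, not the production detector; all parameters are illustrative):

```python
import numpy as np

def ca_cfar(power, guard=2, train=8, scale=8.0):
    """Minimal 1D cell-averaging CFAR: flag cells whose power exceeds
    `scale` times the mean of the surrounding training cells."""
    n = len(power)
    hits = []
    for i in range(train + guard, n - train - guard):
        left = power[i - guard - train : i - guard]
        right = power[i + guard + 1 : i + guard + 1 + train]
        noise = np.mean(np.concatenate([left, right]))
        if power[i] > scale * noise:
            hits.append(i)
    return hits

rng = np.random.default_rng(0)
profile = rng.exponential(1.0, 128)   # exponential noise floor
profile[40] = 60.0                    # injected target return
print(ca_cfar(profile))               # detections include bin 40
```

In the full pipeline this runs on the rendered range-Doppler map, and each detection is back-projected to a 3D point with an associated Doppler velocity.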
Quality assessment
Compare the synthetic point clouds with held-out real radar data. Typical metrics:
- Chamfer distance between real and synthetic point clouds (static scenes).
- Velocity accuracy: MAE of the Doppler estimate (m/s) for moving objects.
- Detection rate: the fraction of real detections reproduced in the synthesis.
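For small clouds, the Chamfer metric can be computed by brute force (a real evaluation pipeline would use a KD-tree for the nearest-neighbour searches):

```python
import numpy as np

def chamfer(P, Q):
    """Symmetric Chamfer distance between point clouds P (N,3) and Q (M,3):
    mean nearest-neighbour distance in both directions."""
    d = np.linalg.norm(P[:, None, :] - Q[None, :, :], axis=-1)  # (N, M) pairwise
    return d.min(axis=1).mean() + d.min(axis=0).mean()

P = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
Q = np.array([[0.0, 0.0, 0.0], [1.0, 0.1, 0.0]])
print(chamfer(P, Q))   # ~0.1: only the second point is displaced
```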
Training
Mix real and synthetic radar data for training the perception network (e.g., radar-camera fusion for 3D object detection). The synthetic augmentation typically improves mAP by -- percentage points.
Gaussian RCS and Opacity Visualisation
Visualise a collection of 2D Gaussians with radar cross-section encoded as colour intensity and opacity controlling visibility. Adjust parameters to see how the representation changes.
Common Mistake: Coherent vs Incoherent Summation in Radar Splatting
Mistake:
Treating the radar splatting as an incoherent power summation (as in the optical case) rather than a coherent field summation.
Correction:
In optical 3DGS, the alpha-compositing sums intensities (powers) because camera pixels detect intensity. Radar, however, operates on complex-valued signals: the IF signal is the coherent sum of contributions from all Gaussians. After range-Doppler FFT processing, nearby Gaussians can interfere constructively or destructively depending on their phase relationship.
RadarSplat must track the phase of each Gaussian and perform coherent summation in the complex domain before computing the detected power $|s(t)|^2$. Ignoring phase leads to systematic overestimation of radar returns (by treating all contributions as constructive).
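A two-scatterer example makes the point: placing two unit reflectors a quarter-wavelength apart in range puts their round-trip phases 180° apart, so the coherent return vanishes while the incoherent sum does not (the 77 GHz carrier is an assumed value):

```python
import numpy as np

lam = 3e8 / 77e9   # wavelength at 77 GHz (~3.9 mm)

def field(R):
    """Unit-amplitude complex return with two-way phase 4*pi*R/lambda."""
    return np.exp(1j * 4 * np.pi * R / lam)

R = 10.0
coherent = abs(field(R) + field(R + lam / 4)) ** 2     # fields sum, then detect
incoherent = abs(field(R)) ** 2 + abs(field(R + lam / 4)) ** 2  # powers sum
print(coherent, incoherent)   # ~0 vs 2.0: the incoherent model misses the null
```

The incoherent model reports a solid return where the radar actually sees a null, which is exactly the overestimation described above.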
Timing Constraints for Automotive Radar Synthesis
Automotive perception systems require sensor data at specific rates:
- Radar: 10--20 frames per second (50--100 ms per frame).
- Camera: 10--30 FPS.
- LiDAR: 10--20 FPS.
For RadarSplat to be useful in simulation or hardware-in-the-loop testing, the rendering time must fit within the 50--100 ms radar frame period. At the Gaussian counts typical for a 100 m road segment, the tile-based rasteriser renders a frame in milliseconds on an A100 GPU, well within the timing budget. This fast rasterisation is a key advantage of splatting over ray-tracing-based simulators (e.g., Sionna RT), which take far longer per frame at comparable scene complexity.
- Rendering latency must stay below the 50--100 ms radar frame period for real-time simulation.
- GPU memory scales linearly with the number of Gaussians $N$ (approximately 200 bytes per Gaussian).
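The linear memory model in the second bullet is trivial to sketch (the 200 bytes-per-Gaussian figure is the text's estimate; the function name is illustrative):

```python
def gpu_memory_mb(n_gaussians, bytes_per_gaussian=200):
    """Linear GPU-memory model: ~200 bytes of attributes per Gaussian."""
    return n_gaussians * bytes_per_gaussian / 1e6

print(gpu_memory_mb(1_000_000))   # 200.0 MB for one million Gaussians
```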
RadarSplat vs GSpaRC
| Property | RadarSplat | GSpaRC |
|---|---|---|
| Path loss model | Learned (implicit) | Physics-based ($1/R^4$) |
| Multi-bounce | No (single-bounce only) | Yes (double-bounce rendering) |
| Typical Gaussians | -- | -- |
| Sparsity loss | Standard pruning | Explicit penalty on opacity |
| Novel view generalisation | Good | Better (lower Chamfer distance) |
| Training time | -- min | -- min |
| Rendering speed | -- FPS | -- FPS |
Gaussian Splatting for RF Scene Reconstruction
The CommIT group is investigating the application of 3DGS to the unified forward model developed in Chapter 7. The key research question is whether representing the reflectivity as a set of Gaussian primitives (instead of a voxel grid or neural field) can simultaneously improve reconstruction quality and enable real-time rendering for digital twin applications.
Preliminary results show that the Kronecker structure of the forward operator (Chapter 7) can be exploited to accelerate the Gaussian rendering step: the spatial and frequency components of the forward model factorise, allowing separate 2D splatting operations rather than a single 3D operation. This reduces the per-iteration cost from $\mathcal{O}(N_f N_s)$ to $\mathcal{O}(N_f + N_s)$ per Gaussian, where $N_f$ is the number of subcarriers and $N_s$ the number of spatial samples.
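The separability claim can be checked numerically: applying a Kronecker-structured operator as two small matrix products gives the same result as the dense operator at a fraction of the cost (the matrix sizes and random operator are illustrative, not the Chapter 7 model):

```python
import numpy as np

rng = np.random.default_rng(1)
Nf, Ns = 16, 32                        # subcarriers x spatial samples
A = rng.standard_normal((Nf, Nf))      # frequency-domain factor
Bm = rng.standard_normal((Ns, Ns))     # spatial factor
x = rng.standard_normal(Nf * Ns)

# dense route: materialise the full Kronecker operator, O(Nf*Ns) per output entry
y_dense = np.kron(A, Bm) @ x

# factorised route: reshape, apply the two small factors separately
X = x.reshape(Nf, Ns)                  # row-major: (A kron B) vec(X) = vec(A X B^T)
y_fact = (A @ X @ Bm.T).reshape(-1)

print(np.allclose(y_dense, y_fact))    # True: same result, far fewer flops
```

The identity $(\mathbf{A}\otimes\mathbf{B})\,\mathrm{vec}(\mathbf{X}) = \mathrm{vec}(\mathbf{A}\mathbf{X}\mathbf{B}^\top)$ (row-major vectorisation) is what turns one large 3D operation into two small 2D ones.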
FMCW-Aware Rendering
A rendering process that models the frequency-modulated continuous wave (FMCW) radar signal processing chain: each scene primitive generates a beat signal contribution with range- and Doppler-dependent phase, and the rendered output is the range-Doppler map after FFT processing (rather than an image).
Related: Radio Radiance Field
Radar Cross-Section (RCS)
A measure of how much electromagnetic energy a target reflects back toward the radar, defined as $\sigma = \lim_{R\to\infty} 4\pi R^2\, |E_s|^2 / |E_i|^2$, where $E_s$ and $E_i$ are the scattered and incident field amplitudes. In RadarSplat, each Gaussian primitive carries an RCS attribute that determines its radar reflectivity.
Related: FMCW-Aware Rendering
Key Takeaway
RadarSplat and GSpaRC adapt 3D Gaussian Splatting for automotive radar by augmenting each Gaussian with radar cross-section and Doppler velocity attributes, and replacing the optical rasteriser with an FMCW-aware rendering pipeline that produces range-Doppler maps. Physics-augmented rendering (GSpaRC) improves generalisation and compactness. A critical difference from the optical setting is the need for coherent (complex-valued) summation to correctly model radar interference patterns.