Message-Passing Detection
Exploiting the Sparse Factor Graph
The DD-domain input-output relation is a sparse equation: each received cell couples to at most P transmit cells (one per path). This coupling forms a sparse bipartite factor graph, and message-passing (MP) detection is the natural algorithm for such structures.
The key point is that MP exchanges only sufficient statistics between graph nodes: a Gaussian approximation of the likelihood, summarized by a mean and a variance. Under the sparse factor-graph topology, this Gaussian message passing converges quickly and achieves near-ML performance. The algorithmic structure is exactly that of belief propagation (BP) from the FSI book, applied to the DD factor graph.
Definition: The DD-Domain Factor Graph
The DD factor graph has two sets of nodes:
- Variable nodes: one per transmitted DD cell x[k, l], for k = 0, ..., N - 1 and l = 0, ..., M - 1. Total: NM.
- Factor nodes: one per received DD cell y[k, l], representing the likelihood constraint p(y[k, l] | x). Total: NM.
Each factor node has exactly P incident variable nodes (the shifted cells that contribute via each path). The graph is bipartite and sparse, with degree exactly P on both sides.
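The neighbor structure above can be sketched directly. A minimal sketch assuming integer delay/Doppler taps and cyclic shifts; the function name and the example `paths` list are illustrative, not from the text:

```python
def dd_factor_graph(N, M, paths):
    """Map each received DD cell (k, l) to the P transmit cells it couples to.

    `paths` is a list of integer (doppler_tap, delay_tap) pairs; with integer
    alignment, path (kp, lp) couples y[k, l] to x[(k - kp) % N, (l - lp) % M].
    """
    return {(k, l): [((k - kp) % N, (l - lp) % M) for kp, lp in paths]
            for k in range(N) for l in range(M)}

graph = dd_factor_graph(N=4, M=4, paths=[(0, 0), (1, 2), (2, 1)])

# Every factor node has degree exactly P = 3 (one neighbor per path) ...
assert all(len(nbrs) == 3 for nbrs in graph.values())

# ... and by the cyclic-shift bijection, every variable node does too.
counts = {}
for nbrs in graph.values():
    for v in nbrs:
        counts[v] = counts.get(v, 0) + 1
assert all(c == 3 for c in counts.values())
```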
Theorem: Convergence of Gaussian Message Passing
For a DD channel with P paths and integer-Doppler alignment, the Gaussian sum-product algorithm on the DD factor graph converges within a small number of iterations to a fixed point whose per-cell SNR matches the ML asymptote. In the high-SNR regime, the MP detector achieves BER that scales as SNR^(-P): full diversity order P.
The factor graph is sparse with degree P. Gaussian messages (just a mean and a variance) propagate through the graph; at each iteration, each variable node aggregates messages from its neighbors and updates. Because the graph has no short cycles at typical scales, BP converges. The sparsity guarantees that the per-iteration complexity is O(NMP), so the total cost is O(N_iter · NMP), a modest operation count for typical frame sizes.
Message updates
At iteration i: Variable-to-factor: the message from variable node v to factor node f encodes the posterior marginal belief about x[k, l] from all adjacent factors except f. Factor-to-variable: the message from f to v encodes the likelihood contribution from factor f to variable v, given the messages from f's other neighbors.
Gaussian approximation
Messages are truncated to a Gaussian N(μ, σ²). This gives a tractable update: each message is just two parameters, a mean and a variance.
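As a concrete illustration of the truncation, here is moment matching of a discrete QPSK belief down to the two Gaussian parameters; the belief probabilities are made up for the example:

```python
import numpy as np

# Unit-energy QPSK alphabet and a (made-up) normalized belief over it.
qpsk = np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) / np.sqrt(2)
probs = np.array([0.7, 0.1, 0.1, 0.1])

mean = np.sum(probs * qpsk)                      # first moment
var = np.sum(probs * np.abs(qpsk - mean) ** 2)   # second central moment

# The Gaussian message carries just (mean, var) instead of 4 probabilities.
# Since all symbols have unit energy, var = 1 - |mean|^2 = 0.64 here.
print(mean, var)
```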
Convergence
Under the sparse-graph structure with no short cycles, each iteration improves the mean estimates. Convergence in a small number of iterations is guaranteed by standard BP theory on tree-like graphs (the DD factor graph is tree-like at realistic scales).
Diversity
At convergence, each variable node's posterior aggregates evidence from all P paths. The effective SNR is the sum of the P per-path SNRs, giving chi-squared fading with 2P degrees of freedom: diversity order P.
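A quick Monte Carlo check of this claim, assuming i.i.d. unit-mean exponential path gains (Rayleigh amplitudes): the deep-fade probability of the summed SNR drops sharply with each added path, which is exactly the diversity-order effect.

```python
import numpy as np

rng = np.random.default_rng(0)

def deep_fade_prob(P, n=200_000, threshold=0.1):
    """Fraction of realizations whose summed path SNR falls below `threshold`.

    The sum of P i.i.d. unit-mean exponential gains is (scaled) chi-squared
    with 2P degrees of freedom; its deep-fade probability ~ threshold**P / P!.
    """
    snr_eff = rng.exponential(1.0, size=(n, P)).sum(axis=1)
    return np.mean(snr_eff < threshold)

for P in (1, 2, 3):
    print(P, deep_fade_prob(P))   # each added path sharply cuts the fade probability
```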
MP-OTFS Detector
Complexity. Gaussian simplification: messages are (mean, variance) pairs, and the marginalization integral in step 5 reduces to a closed-form Gaussian update. A modest number of iterations suffices, with no significant gain beyond that. Overall complexity: O(N_iter · NMP), which is feasible for typical frame sizes.
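To make the closed-form update concrete, here is one Gaussian factor-to-variable message for a single received cell y = Σ_p h_p x_p + noise: cancel the mean of every other variable and absorb its variance into the effective noise. The function name and toy numbers are illustrative, not the text's algorithm listing:

```python
def factor_to_variable(y, h, means, vars_, noise_var, q):
    """Gaussian message from factor y = sum_p h[p]*x[p] + noise to variable q.

    Every other variable is summarized by its current Gaussian belief:
    subtract its mean from y, add its variance to the effective noise.
    Returns the (mean, variance) of the message to x[q].
    """
    others = [p for p in range(len(h)) if p != q]
    residual = y - sum(h[p] * means[p] for p in others)
    eff_noise = noise_var + sum(abs(h[p]) ** 2 * vars_[p] for p in others)
    return residual / h[q], eff_noise / abs(h[q]) ** 2

# Toy check: two paths, noiseless, and the belief about x[1] is exact
# (variance 0), so the message to x[0] is exact as well.
m, v = factor_to_variable(y=3.0, h=[1.0, 2.0], means=[0.0, 1.0],
                          vars_=[1.0, 0.0], noise_var=0.0, q=0)
print(m, v)  # -> 1.0 0.0
```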
Key Takeaway
MP detection achieves full DD diversity. Unlike MMSE (diversity 1), message-passing on the sparse DD factor graph recovers the full P-fold path diversity promised by Chapter 9. The BER slope is -P, matching ML. This is the main practical reason OTFS outperforms OFDM in uncoded or lightly-coded operation: MP exploits paths that OFDM's per-subcarrier detector cannot even see.
MP vs MMSE BER Comparison
Plot uncoded BER vs SNR for MMSE (diversity 1) and MP (diversity P) detection in the same OTFS frame. The MP curve is steeper: at low BER, MP can be 5-10 dB ahead of MMSE. Vary P to see the diversity-order effect. This illustrates the qualitative difference between linear and nonlinear detection in OTFS.
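A minimal non-plotting sketch of this comparison, using the standard closed-form BPSK-over-Rayleigh MRC BER as a proxy for each diversity order. The mapping "MMSE behaves as diversity 1, MP as diversity P" is the section's claim; splitting the total SNR evenly across branches is an assumption of this sketch:

```python
from math import comb, sqrt

def ber_diversity(snr_db, d):
    """Exact BPSK BER with d-branch MRC over i.i.d. Rayleigh fading."""
    g = 10 ** (snr_db / 10) / d          # per-branch average SNR (even split)
    mu = sqrt(g / (1 + g))
    return ((1 - mu) / 2) ** d * sum(
        comb(d - 1 + k, k) * ((1 + mu) / 2) ** k for k in range(d))

for s in range(0, 31, 5):
    b1 = ber_diversity(s, 1)   # diversity-1 proxy for MMSE
    b4 = ber_diversity(s, 4)   # diversity-P proxy for MP, P = 4
    print(f"{s:2d} dB  MMSE {b1:.2e}  MP {b4:.2e}")
```

At high SNR the diversity-4 curve falls four decades per 10 dB versus one decade for diversity 1, reproducing the 5-10 dB gap at low BER.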
Example: MP on a 2×2 Frame with 2 Paths
An OTFS frame with N = M = 2 and QPSK passes through a channel with P = 2 paths. Sketch the factor graph and count the number of messages exchanged in one iteration.
Graph structure
NM = 4 variable nodes, 4 factor nodes. Each factor has P = 2 incident variables. Each variable has 2 incident factors (by symmetry). Total edges: 4 × 2 = 8.
Per-iteration messages
Factor-to-variable: one message per edge = 8 messages. Variable-to-factor: one message per edge = 8 messages. Total per iteration: 16 messages.
Complexity
Each message is a Gaussian (2 parameters) or a discrete distribution over the QPSK alphabet (3 free parameters after normalization). Per message: a constant number of arithmetic operations. Per iteration: roughly 16 operations, one per message. For N_iter = 10: about 160 ops total. Fast.
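The counting above can be replayed mechanically; a trivial sketch under the example's assumptions (10 iterations as an illustrative choice):

```python
N = M = 2                  # frame size from the example
P = 2                      # number of paths
variables = N * M          # 4 variable nodes
factors = N * M            # 4 factor nodes
edges = factors * P        # each factor touches P variables -> 8 edges
msgs_per_iter = 2 * edges  # one message each way per edge -> 16
total_msgs = 10 * msgs_per_iter   # e.g. 10 iterations -> 160 message updates
print(variables, factors, edges, msgs_per_iter, total_msgs)  # -> 4 4 8 16 160
```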
Message Passing on a 4×4 DD Factor Graph
MP Variants: Sum-Product, Max-Product, and AMP
Several flavors of message-passing are used in the OTFS literature:
- Sum-product (exact BP): messages are full posteriors over the symbol alphabet; gives symbol-wise MAP decisions. High memory: O(Q) values per message for a size-Q constellation.
- Gaussian BP (MP-OTFS): truncates messages to a Gaussian N(μ, σ²). Most efficient, close to ML at high SNR. This is the version in the MP-OTFS Detector algorithm above.
- AMP (approximate message passing): uses matched-filter plus soft-threshold updates. Asymptotically optimal for i.i.d. Gaussian matrices; less studied for structured matrices like the OTFS channel matrix.
- Max-product (min-sum): takes a max rather than a sum over each factor. Gives the most-likely sequence (sequence-wise MAP). Slightly less accurate symbol-wise than sum-product but numerically simpler.
For OTFS practice, Gaussian MP is the default. AMP has been explored (Chu, Yuan 2022) and shows ~1 dB gain over Gaussian MP in certain regimes, but at the cost of more sophisticated damping schemes to ensure stability.
Practical MP Implementation
Key implementation decisions:
- Damping factor Δ ∈ (0, 1]: soft-weight the message update as μ_new = Δ · μ_computed + (1 − Δ) · μ_old. Prevents oscillation in difficult channels. Δ in the range 0.6-0.8 is a typical default.
- Early termination: check convergence by computing the largest change in message means between consecutive iterations. Stop when it falls below a small threshold. Saves iterations for easy channels.
- Parallelization: variable and factor updates can be fully parallelized within an iteration. GPU implementations gain 2-3 orders of magnitude in throughput over sequential code.
- Numerical stability: work in log-domain to prevent underflow of message magnitudes at low SNR.
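The first two bullets combine into a short loop skeleton; a sketch under assumed names (`damped_mp`, `update_fn`, the 1e-4 tolerance), not any particular library's API:

```python
import numpy as np

def damped_mp(update_fn, means0, delta=0.7, tol=1e-4, max_iter=30):
    """Run message-mean updates with damping and early termination.

    `update_fn` maps the current mean vector to the freshly computed one;
    the damped value delta*new + (1 - delta)*old is what propagates.
    Returns (final means, iterations used).
    """
    means = np.asarray(means0, dtype=float)
    for it in range(max_iter):
        new = update_fn(means)
        damped = delta * new + (1 - delta) * means
        if np.max(np.abs(damped - means)) < tol:   # early termination
            return damped, it + 1
        means = damped
    return means, max_iter

# Toy stand-in for the MP update: a contraction with fixed point 2.0.
x, iters = damped_mp(lambda m: 0.5 * m + 1.0, [0.0])
print(x, iters)   # x approaches 2.0 well before max_iter
```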
With these optimizations, MP-OTFS is readily real-time at 5G NR frame rates. For LEO satellite channels with larger frame sizes, MP remains feasible with modern GPU hardware.
- Damping factor 0.6-0.8 for stability
- Early termination saves iterations
- GPU parallelization: 2-3 orders of magnitude over sequential
Common Mistake: Short Cycles Break MP at Extreme Parameters
Mistake:
Deploying MP-OTFS at small frame sizes expecting it to converge. At small NM, the factor graph has short cycles (the same data symbol appears as a neighbor of multiple factors), and BP can oscillate or converge to wrong fixed points.
Correction:
For small frames, use sphere decoding or exhaustive ML (the search space is small anyway). MP is designed for large NM, where the graph is locally tree-like. At typical 5G NR-aligned OTFS frame sizes, cycle effects are negligible and BP converges cleanly.
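The cycle hazard can be checked directly: two factor nodes that share two or more variable nodes form a length-4 cycle. A sketch assuming integer taps and cyclic shifts (helper names and the two-path channel are illustrative):

```python
from itertools import combinations

def factor_neighbors(N, M, paths):
    """Set of transmit cells feeding each received cell (integer taps, cyclic)."""
    return {(k, l): frozenset(((k - kp) % N, (l - lp) % M) for kp, lp in paths)
            for k in range(N) for l in range(M)}

def has_4_cycle(neigh):
    # Two factors sharing >= 2 variable nodes close a length-4 cycle.
    return any(len(a & b) >= 2 for a, b in combinations(neigh.values(), 2))

paths = [(0, 0), (1, 1)]                              # illustrative 2-path channel
print(has_4_cycle(factor_neighbors(2, 2, paths)))     # -> True  (2x2: 4-cycle)
print(has_4_cycle(factor_neighbors(16, 16, paths)))   # -> False (larger frame)
```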