Chapter Summary
Key Points
1. Complex-valued networks preserve phase structure. Using complex weights halves the parameter count compared to unconstrained real networks and provides phase equivariance, which is natural for wireless signals.
2. Real/imaginary stacking is the simplest approach. Concatenate real and imaginary parts as channels: (B, 2C, H, W). This is compatible with all standard layers. Use complex-structured layers only when phase equivariance matters.
3. Wirtinger calculus enables complex optimisation. PyTorch computes the conjugate Wirtinger derivative automatically. The loss must be real-valued; use |z|^2 for MSE on complex outputs.
4. Differentiable forward models enable end-to-end learning. Implement the physics (FFT, channel, convolution) as PyTorch modules so gradients flow through the entire chain. This allows joint optimisation of transmitter and receiver.
5. Choose activations carefully. CReLU (split ReLU) is simple but not phase-aware. modReLU preserves phase and is better for phase-sensitive tasks. Never apply standard ReLU directly to complex tensors.
Looking Ahead
Chapter 29 introduces recurrent networks for sequential data. Chapter 30 covers attention mechanisms that can also operate on complex-valued sequences for wireless applications.