Complex-Valued Activation Functions
Definition: CReLU and Split Activations
Apply real-valued activations independently to real and imaginary parts:
```python
import torch
import torch.nn.functional as F

def crelu(z):
    # Apply ReLU independently to the real and imaginary parts.
    return torch.complex(F.relu(z.real), F.relu(z.imag))
```
CReLU is simple but not phase-equivariant. It treats real and imaginary parts independently.
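The lack of phase equivariance can be checked numerically: rotating the input by an angle θ and then applying CReLU does not give the same result as applying CReLU first and then rotating. A minimal sketch (the sample points and angle are arbitrary choices):

```python
import math
import torch
import torch.nn.functional as F

def crelu(z):
    return torch.complex(F.relu(z.real), F.relu(z.imag))

z = torch.tensor([1.0 + 1.0j, -0.5 + 2.0j])
theta = math.pi / 3
rot = torch.exp(torch.tensor(1j * theta))  # rotation by theta in the complex plane

rotated_then_activated = crelu(rot * z)
activated_then_rotated = rot * crelu(z)

# The two disagree, so CReLU is not phase-equivariant.
print(torch.allclose(rotated_then_activated, activated_then_rotated))  # prints False
```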
Definition: modReLU: Phase-Preserving Activation
modReLU applies a threshold to the magnitude while preserving phase:
```python
def mod_relu(z, bias):
    # Threshold the magnitude, then rescale z so its phase is unchanged.
    # The small epsilon avoids division by zero at the origin.
    magnitude = z.abs()
    return z * F.relu(magnitude - bias) / (magnitude + 1e-8)
```
Example: Comparing Complex Activations
Apply CReLU and modReLU to a set of complex points and visualise how they transform the complex plane.
Solution
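One possible solution sketch, using the `crelu` and `mod_relu` definitions above. The ring of sample points, its radius, and the bias value are arbitrary choices made for illustration:

```python
import math
import torch
import torch.nn.functional as F

def crelu(z):
    return torch.complex(F.relu(z.real), F.relu(z.imag))

def mod_relu(z, bias):
    magnitude = z.abs()
    return z * F.relu(magnitude - bias) / (magnitude + 1e-8)

# Sample points on a ring of radius 1.5, covering all four quadrants.
angles = torch.linspace(0, 2 * math.pi, 8)
z = 1.5 * torch.exp(1j * angles)

for name, out in [("CReLU", crelu(z)), ("modReLU", mod_relu(z, bias=1.0))]:
    print(name)
    print("  magnitudes:", out.abs())
    print("  phases:    ", out.angle())
```

With these inputs, modReLU maps every point to magnitude 0.5 (the original 1.5 minus the bias 1.0, up to epsilon) with the input phase intact, while CReLU zeroes whichever components are negative, so its output depends on the quadrant of the input.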
Observation
CReLU maps negative real/imaginary parts to zero, creating quadrant-dependent behaviour. modReLU shrinks magnitudes below the threshold to zero while preserving direction.
Interactive figure: Complex Activation Function Comparison. See how different complex activations transform the complex plane.