Complex-Valued Activation Functions

Definition:

CReLU and Split Activations

Apply real-valued activations independently to real and imaginary parts:

\text{CReLU}(z) = \text{ReLU}(\Re(z)) + j \cdot \text{ReLU}(\Im(z))

import torch
import torch.nn.functional as F

def crelu(z):
    # Apply ReLU to the real and imaginary parts independently
    return torch.complex(F.relu(z.real), F.relu(z.imag))

CReLU is simple but not phase-equivariant: because the real and imaginary parts are thresholded independently, rotating the input does not rotate the output by the same angle.
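The lack of phase equivariance can be checked directly: rotating the input by 90° and then applying CReLU does not give the same result as applying CReLU first and rotating afterwards. A minimal sketch using plain Python complex numbers (rather than torch tensors); the name `crelu_scalar` is illustrative, not part of any library:

```python
def crelu_scalar(z: complex) -> complex:
    # CReLU on a single complex number: ReLU each part independently
    return complex(max(z.real, 0.0), max(z.imag, 0.0))

z = 1 + 1j
rot = 1j  # multiplication by 1j rotates by 90 degrees

activated_then_rotated = rot * crelu_scalar(z)   # rotate the output
rotated_then_activated = crelu_scalar(rot * z)   # rotate the input

# The two orders disagree, so CReLU is not phase-equivariant
print(activated_then_rotated, rotated_then_activated)
```

Here `crelu_scalar(1+1j)` is `1+1j`, so rotating the output gives `-1+1j`; but rotating the input first gives `-1+1j`, whose CReLU is `1j`.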

Definition:

modReLU: Phase-Preserving Activation

modReLU applies a threshold to the magnitude while preserving phase:

\text{modReLU}(z; b) = \begin{cases} z \cdot \frac{|z| - b}{|z|} & |z| \ge b \\ 0 & |z| < b \end{cases}

import torch
import torch.nn.functional as F

def mod_relu(z, bias):
    # Shrink the magnitude by bias, zeroing anything below the threshold;
    # the phase is untouched because z is only scaled by a real factor.
    # The epsilon guards against division by zero at the origin.
    magnitude = z.abs()
    return z * F.relu(magnitude - bias) / (magnitude + 1e-8)
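As a sanity check on the phase-preserving property, the same rule can be evaluated on a single complex number with Python's `cmath`; the names below (`mod_relu_scalar`, the test point `3+4j`, threshold `b=2`) are illustrative assumptions, not library API:

```python
import cmath

def mod_relu_scalar(z: complex, b: float) -> complex:
    # Shrink the magnitude by b, zeroing anything below the threshold;
    # scaling by a real factor leaves the phase of z unchanged
    magnitude = abs(z)
    scale = max(magnitude - b, 0.0)
    return z * scale / magnitude if magnitude > 0 else 0j

z = 3 + 4j                     # |z| = 5
out = mod_relu_scalar(z, b=2.0)

print(abs(out))                # magnitude shrinks from 5 toward 3
print(cmath.phase(out), cmath.phase(z))  # phases agree
```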

Example: Comparing Complex Activations

Apply CReLU and modReLU to a set of complex points and visualise how they transform the complex plane.
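One way to sketch that comparison without torch is to apply scalar versions of both activations to points on the unit circle and print the results; a real visualisation would scatter-plot inputs and outputs with matplotlib. The names `crelu_c` and `mod_relu_c` and the threshold `b=0.5` are illustrative choices:

```python
import cmath

def crelu_c(z: complex) -> complex:
    # ReLU applied to real and imaginary parts independently
    return complex(max(z.real, 0.0), max(z.imag, 0.0))

def mod_relu_c(z: complex, b: float = 0.5) -> complex:
    # Magnitude thresholding with phase preserved
    m = abs(z)
    return z * max(m - b, 0.0) / m if m > 0 else 0j

# Eight points on the unit circle
points = [cmath.exp(1j * cmath.pi * k / 4) for k in range(8)]

for z in points:
    print(f"z={z:.2f}  CReLU={crelu_c(z):.2f}  modReLU={mod_relu_c(z):.2f}")
```

CReLU zeroes any negative component, distorting points outside the first quadrant, while modReLU uniformly shrinks every point toward the origin along its own ray.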

[Interactive demo: Complex Activation Function Comparison. Adjustable parameters show how each activation transforms the complex plane.]