Transformers for Scientific Data
Definition: Transformers for Unordered Sets
For data without inherent ordering (point clouds, users in a wireless cell), transformers handle sets naturally because self-attention is permutation-equivariant: permuting the input tokens permutes the outputs in the same way. Where position matters, positional encodings can encode spatial coordinates instead of sequence position.
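The equivariance claim is easy to verify numerically. The sketch below implements single-head scaled dot-product self-attention from scratch in numpy (the function and weight names are illustrative, not from any library) and checks that permuting the input tokens permutes the output rows identically:

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Single-head scaled dot-product self-attention over a set of tokens."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # row-wise softmax
    return weights @ V

rng = np.random.default_rng(0)
d = 8
X = rng.normal(size=(5, d))                 # 5 unordered tokens (e.g. points)
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))

perm = rng.permutation(5)
out = self_attention(X, Wq, Wk, Wv)
out_perm = self_attention(X[perm], Wq, Wk, Wv)

# f(PX) = P f(X): reordering the set reorders the outputs, nothing else.
assert np.allclose(out[perm], out_perm)
```

The check holds exactly because the attention weights become P·A·Pᵀ under a permutation P, and the row-wise softmax commutes with that reordering.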
Example: Transformer for OFDM Channel Processing
Apply attention across subcarriers for channel estimation.
Solution
Key idea
Treat each subcarrier as a token. Self-attention then learns the frequency-domain correlation structure of the channel, replacing handcrafted interpolation between pilot subcarriers (e.g., linear or DFT-based) with learned global mixing across the whole band.
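A minimal structural sketch of this tokenization, with untrained random weights (all dimensions, the pilot spacing, and the token layout are illustrative assumptions): each subcarrier becomes a token carrying its least-squares pilot estimate and a pilot flag, a positional encoding marks its frequency index, and one attention layer lets every subcarrier read from every pilot.

```python
import numpy as np

rng = np.random.default_rng(1)
n_sc, d_model = 64, 16                  # 64 OFDM subcarriers, toy model width

# Toy frequency-domain channel; LS estimates observed at pilot positions only.
h_true = rng.normal(size=n_sc) + 1j * rng.normal(size=n_sc)
pilot_mask = np.zeros(n_sc)
pilot_mask[::8] = 1.0                   # a pilot every 8th subcarrier
h_ls = np.where(pilot_mask.astype(bool), h_true, 0.0)

# One token per subcarrier: [Re(h_ls), Im(h_ls), pilot flag] -> d_model dims.
tokens = np.stack([h_ls.real, h_ls.imag, pilot_mask], axis=-1)   # (64, 3)
W_in = rng.normal(size=(3, d_model)) * 0.1
X = tokens @ W_in

# Positional encoding of the subcarrier index (frequency position).
pos = np.arange(n_sc)[:, None] / n_sc
X = X + np.sin(pos * np.pi * np.arange(1, d_model + 1))

# Untrained single-head self-attention: a learnable global interpolator
# in which every subcarrier attends to every pilot across frequency.
Wq, Wk, Wv = (rng.normal(size=(d_model, d_model)) * 0.1 for _ in range(3))
Q, K, V = X @ Wq, X @ Wk, X @ Wv
S = Q @ K.T / np.sqrt(d_model)
A = np.exp(S - S.max(-1, keepdims=True))
A /= A.sum(-1, keepdims=True)
W_out = rng.normal(size=(d_model, 2)) * 0.1
h_hat = (A @ V) @ W_out                 # (64, 2): predicted Re/Im per subcarrier
```

Training would fit the projection and attention weights to minimize estimation error against known channels; the sketch only shows the data flow and shapes.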
Why This Matters: Transformers for Resource Allocation
In multi-user systems, a transformer can process all users' channel states simultaneously, with attention modeling inter-user interference. Here permutation equivariance is a natural fit: the allocation a user receives should depend on the channel conditions of everyone in the cell, but not on the arbitrary order in which users are indexed.
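The same equivariance check applies at the system level. In this sketch (untrained random weights; the feature dimensions and the softmax-to-power-shares readout are illustrative assumptions), each user's channel-state vector is a token, attention mixes information across users, and relabeling the users provably relabels the resulting allocation without changing it:

```python
import numpy as np

def user_attention_scores(H, Wq, Wk, Wv, w_out):
    """Map each user's channel-state token to an allocation score via
    self-attention over all users (untrained weights; a structural sketch)."""
    Q, K, V = H @ Wq, H @ Wk, H @ Wv
    S = Q @ K.T / np.sqrt(K.shape[-1])
    A = np.exp(S - S.max(-1, keepdims=True))
    A /= A.sum(-1, keepdims=True)
    return (A @ V) @ w_out              # one scalar score per user

rng = np.random.default_rng(2)
n_users, d = 6, 12
H = rng.normal(size=(n_users, d))       # per-user CSI feature vectors
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
w_out = rng.normal(size=d)

scores = user_attention_scores(H, Wq, Wk, Wv, w_out)
p = np.exp(scores - scores.max())
p /= p.sum()                            # power-allocation shares, sum to 1

# Relabeling users permutes the allocation identically: no order bias.
perm = rng.permutation(n_users)
scores_perm = user_attention_scores(H[perm], Wq, Wk, Wv, w_out)
assert np.allclose(scores[perm], scores_perm)
```

Because the property is built into the architecture, it holds for any trained weights as well, so the network never has to learn order-invariance from data.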