Research | 2026-04-22
Sessa: Selective State Space Attention
Source: arXiv cs.AI
arXiv:2604.18580v2 (announce type: replace-cross)

Abstract: Modern sequence modeling is dominated by two families: Transformers, whose self-attention can access arbitrary elements of the visible sequence, and structured state-space models, which propagate information through an explicit recurrent...
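The abstract contrasts the two families only in passing. As a rough illustration of that contrast (a generic sketch, not the paper's Sessa method, whose details are not given here), the NumPy snippet below shows causal self-attention reading arbitrary visible positions directly, versus a structured state-space model pushing all information through a single recurrent state. All shapes and parameter names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
T, d = 6, 4                      # toy sequence length and model width
x = rng.normal(size=(T, d))      # toy input sequence

# --- Self-attention: each position mixes information from every visible position.
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
q, k, v = x @ Wq, x @ Wk, x @ Wv
scores = q @ k.T / np.sqrt(d)
mask = np.tril(np.ones((T, T), dtype=bool))           # causal mask: look backward only
scores = np.where(mask, scores, -np.inf)
attn = np.exp(scores - scores.max(axis=-1, keepdims=True))
attn /= attn.sum(axis=-1, keepdims=True)
y_attention = attn @ v           # (T, d): direct access to arbitrary past tokens

# --- State-space model: information flows only through an explicit recurrent state.
A = 0.9 * np.eye(d)              # state transition, kept stable for the demo
B = 0.1 * rng.normal(size=(d, d))
C = 0.1 * rng.normal(size=(d, d))
h = np.zeros(d)
y_ssm = np.zeros((T, d))
for t in range(T):               # h_t = A h_{t-1} + B x_t ;  y_t = C h_t
    h = A @ h + B @ x[t]
    y_ssm[t] = C @ h

print(y_attention.shape, y_ssm.shape)
```

The point of the sketch is the access pattern: the attention output at step t is a weighted sum over all visible inputs, while the SSM output at step t depends on the past only through the fixed-size state h.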