Research · 2026-05-08
The Structural Origin of Attention Sink: Variance Discrepancy, Super Neurons, and Dimension Disparity
Source: Arxiv CS.AI
arXiv:2605.06611v1 Announce Type: cross
Abstract: Despite the prevalence of the attention sink phenomenon in Large Language Models (LLMs), where initial tokens disproportionately monopolize attention scores, its structural origins remain elusive. This work provides a mechanistic...