Research 2026-05-08
Budgeted Attention Allocation: Cost-Conditioned Compute Control for Efficient Transformers
Source: Arxiv CS.AI
arXiv:2605.05697v1 Announce Type: cross Abstract: Transformers usually expose one inference cost per trained model, while deployed systems often need multiple cost-quality operating points. We study Budgeted Attention Allocation, a monotone head-gating mechanism conditioned on a requested attention...
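The abstract describes a monotone head-gating mechanism conditioned on a requested attention budget. As a minimal illustrative sketch (not the paper's implementation; the function name, threshold scheme, and priority ordering are all assumptions), one way to make head gates monotone in the budget is to assign each head a fixed activation threshold and enable a head once the budget reaches it, so the active set only grows as the budget increases:

```python
def head_gates(budget, priorities):
    """Return 0/1 gates for each attention head given a budget in [0, 1].

    Illustrative sketch: `priorities` are per-head activation thresholds.
    A head is enabled once the budget reaches its threshold, so the set of
    active heads grows monotonically with the budget, i.e. for b1 <= b2,
    head_gates(b1, p) <= head_gates(b2, p) elementwise.
    """
    assert 0.0 <= budget <= 1.0, "budget must lie in [0, 1]"
    return [1.0 if budget >= p else 0.0 for p in priorities]


# Example: three heads with activation thresholds 0.1, 0.4, 0.7.
# A budget of 0.5 enables the first two heads but not the third.
gates = head_gates(0.5, [0.1, 0.4, 0.7])  # -> [1.0, 1.0, 0.0]
```

This gives multiple cost-quality operating points from one set of gates: lowering the budget deactivates heads in a fixed priority order rather than retraining a separate model per cost target.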