Research · 2026-04-20
CoMeT: Collaborative Memory Transformer for Efficient Long Context Modeling
Source: arXiv cs.AI
arXiv:2602.01766v2 Announce Type: replace-cross
Abstract: The quadratic complexity and indefinitely growing key-value (KV) cache of standard Transformers pose a major barrier to long-context processing. To overcome this, we introduce the Collaborative Memory Transformer (CoMeT), a novel...
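The abstract's motivation is easy to make concrete. The sketch below (plain Python, not from the paper) shows why long contexts are costly for a standard Transformer: KV-cache memory grows linearly with sequence length, while self-attention compute grows quadratically. The model shape (n_layers=32, n_heads=32, head_dim=128, a LLaMA-7B-like configuration) is an illustrative assumption, not a detail of CoMeT.

```python
# Illustrative sketch (not from the paper): cost of long contexts in a
# standard Transformer with a full, ever-growing KV cache.

def kv_cache_bytes(seq_len: int, n_layers: int = 32, n_heads: int = 32,
                   head_dim: int = 128, bytes_per_elem: int = 2) -> int:
    """Bytes needed to cache keys and values for one sequence (fp16/bf16)."""
    # Factor of 2 accounts for storing both K and V at every layer;
    # the total grows linearly with seq_len.
    return 2 * n_layers * n_heads * head_dim * bytes_per_elem * seq_len

def attention_flops(seq_len: int, n_layers: int = 32, n_heads: int = 32,
                    head_dim: int = 128) -> int:
    """Approximate FLOPs for the QK^T and attention-times-V matmuls."""
    # Each of the two matmuls costs ~2 * seq_len^2 * head_dim FLOPs per
    # head per layer, so the total grows quadratically with seq_len.
    return 2 * 2 * n_layers * n_heads * head_dim * seq_len ** 2

for n in (4_096, 32_768, 262_144):
    gib = kv_cache_bytes(n) / 2**30
    tflops = attention_flops(n) / 1e12
    print(f"context {n:>7,}: KV cache {gib:8.2f} GiB, "
          f"attention ~{tflops:12.1f} TFLOPs")
```

At a 4K context this configuration already needs about 2 GiB of KV cache per sequence, and both costs keep climbing with context length, which is the barrier the abstract describes CoMeT as addressing.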