Research 2026-05-12
Continuous Latent Contexts Enable Efficient Online Learning in Transformers
Source: Arxiv CS.AI
arXiv:2605.09867v1 Announce Type: cross
Abstract: Large language models (LLMs) exhibit a strong capacity for in-context learning: given labeled examples, they can generate good predictions without parameter updates. However, many interactive settings go beyond static prediction to online...
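As a minimal sketch of the in-context learning setup the abstract refers to (the prompt template and function name below are illustrative assumptions, not from the paper): labeled examples are placed directly in the prompt, and the model is asked to complete the next label with no parameter update.

```python
# Illustrative sketch of in-context (few-shot) prompting: labeled
# examples go into the prompt text; no gradient update takes place.
# The "Input:/Label:" template is an assumption for illustration only.

def build_fewshot_prompt(examples, query):
    """Format (input, label) pairs followed by an unlabeled query."""
    blocks = [f"Input: {x}\nLabel: {y}" for x, y in examples]
    blocks.append(f"Input: {query}\nLabel:")
    return "\n\n".join(blocks)

examples = [
    ("great movie", "positive"),
    ("terrible plot", "negative"),
]
prompt = build_fewshot_prompt(examples, "loved the acting")
print(prompt)
```

The resulting string would be passed to an LLM as-is; the interactive, online settings the abstract contrasts with this would instead update the prompt (or a latent context) as new feedback arrives.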