Research · 2026-05-01
When Continual Learning Moves to Memory: A Study of Experience Reuse in LLM Agents
Source: arXiv cs.AI
arXiv:2604.27003v1 (announce type: cross)

Abstract: Memory-augmented LLM agents offer an appealing shortcut to continual learning: rather than updating model parameters, they accumulate experience in external memory, seemingly sidestepping the stability-plasticity dilemma of parametric learning. We...
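To make the idea concrete, here is a minimal sketch of experience reuse through an external memory: past (task, outcome) records are appended to a store and retrieved by similarity for a new task, with no parameter updates anywhere. All names (e.g. `ExperienceMemory`, the word-overlap retrieval) are illustrative assumptions, not the paper's actual implementation.

```python
from dataclasses import dataclass, field

@dataclass
class ExperienceMemory:
    """Append-only store of (task, outcome) records; learning happens
    by accumulating and retrieving experience, not by weight updates."""
    records: list = field(default_factory=list)

    def add(self, task: str, outcome: str) -> None:
        self.records.append((task, outcome))

    def retrieve(self, task: str, k: int = 2) -> list:
        """Rank stored experiences by word overlap with the new task
        (a stand-in for embedding-based retrieval)."""
        query = set(task.lower().split())
        scored = sorted(
            self.records,
            key=lambda r: len(query & set(r[0].lower().split())),
            reverse=True,
        )
        return scored[:k]

memory = ExperienceMemory()
memory.add("sort a list of numbers", "used sorted() builtin")
memory.add("parse a date string", "used datetime.strptime")

# Retrieved experiences would be prepended to the agent's prompt.
hits = memory.retrieve("sort numbers in a list")
print(hits[0][1])  # prints "used sorted() builtin"
```

Because the store only ever grows, old experiences are never overwritten, which is the sense in which such agents appear to sidestep the stability-plasticity dilemma.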