Research 2026-04-30

AdaFRUGAL: Adaptive Memory-Efficient Training with Dynamic Control

Source: arXiv cs.AI

arXiv:2601.11568v2 (replace-cross)

Abstract: Training Large Language Models (LLMs) is highly memory-intensive due to optimizer state overhead. The FRUGAL framework mitigates this with gradient splitting, but its static hyperparameters -- the subspace ratio ($\rho$) and update frequency...
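The gradient-splitting idea the abstract refers to can be illustrated with a minimal sketch: a fraction $\rho$ of the gradient coordinates receives a stateful Adam-style update (so first/second-moment state is stored only for that subspace), while the remaining coordinates get a stateless SGD step. The function names, the magnitude-based subspace selection, and the single-tensor setup below are illustrative assumptions, not the paper's actual implementation; in FRUGAL the subspace is also refreshed periodically according to the update frequency.

```python
import numpy as np

def pick_subspace(grad, rho):
    """Illustrative subspace choice: indices of the top-rho fraction
    of coordinates by gradient magnitude (an assumption, not
    necessarily FRUGAL's projection)."""
    k = max(1, int(rho * grad.size))
    return np.argpartition(np.abs(grad), -k)[-k:]

def split_step(params, grad, idx, m, v, t, lr=1e-3,
               beta1=0.9, beta2=0.999, eps=1e-8):
    """One split update step (sketch).

    Coordinates in `idx` get a stateful Adam update; `m` and `v`
    are sized to the subspace, so optimizer memory scales with
    rho * n instead of n. All other coordinates get plain SGD,
    which needs no optimizer state.
    """
    mask = np.zeros(params.size, dtype=bool)
    mask[idx] = True

    # Stateful Adam update on the rho-subspace.
    m = beta1 * m + (1 - beta1) * grad[mask]
    v = beta2 * v + (1 - beta2) * grad[mask] ** 2
    m_hat = m / (1 - beta1 ** t)
    v_hat = v / (1 - beta2 ** t)
    params[mask] -= lr * m_hat / (np.sqrt(v_hat) + eps)

    # Stateless SGD on the complement: zero extra memory.
    params[~mask] -= lr * grad[~mask]
    return params, m, v
```

With $\rho = 0.25$ on a 16-coordinate parameter vector, only 4 coordinates carry Adam state, so the `m`/`v` buffers shrink fourfold relative to full Adam.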

arxivpapers