BeClaude Research
2026-05-12

CERSA: Cumulative Energy-Retaining Subspace Adaptation for Memory-Efficient Fine-Tuning

Source: Arxiv CS.AI

arXiv:2605.08174v1 | Announce Type: cross

Abstract: To mitigate the memory constraints associated with fine-tuning large pre-trained models, existing parameter-efficient fine-tuning (PEFT) methods, such as LoRA, rely on low-rank updates. However, such updates fail to fully capture the rank...
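The title suggests selecting an adaptation subspace by cumulative singular-value energy rather than by a fixed low rank, as LoRA does. The paper's actual algorithm is not given in this excerpt; the sketch below only illustrates the general cumulative-energy idea with a hypothetical helper (the function name, the 95% threshold, and the SVD-based rank selection are assumptions, not CERSA itself):

```python
import numpy as np

def energy_retaining_rank(W: np.ndarray, energy: float = 0.95) -> int:
    """Smallest rank whose singular values retain `energy` of total energy."""
    s = np.linalg.svd(W, compute_uv=False)          # singular values, descending
    cum = np.cumsum(s**2) / np.sum(s**2)            # cumulative energy fraction
    return int(np.searchsorted(cum, energy) + 1)    # first index reaching threshold

rng = np.random.default_rng(0)
# Synthetic weight matrix: rank-8 signal plus small noise
W = rng.standard_normal((256, 8)) @ rng.standard_normal((8, 256))
W += 0.01 * rng.standard_normal((256, 256))
r = energy_retaining_rank(W, energy=0.95)  # small rank, since energy concentrates in 8 directions
```

Under this reading, the retained rank adapts to the spectrum of each weight matrix instead of being fixed in advance, which is one way a method could "fully capture the rank" structure the abstract says low-rank updates miss.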

Tags: arxiv, papers, fine-tuning