Research 2026-05-12

AdaPreLoRA: Adafactor Preconditioned Low-Rank Adaptation

Source: Arxiv CS.AI

arXiv:2605.08734v1 Announce Type: cross

Abstract: Low-Rank Adaptation (LoRA) reparameterizes a weight update as a product of two low-rank factors, but the Jacobian $J_G$ of the generator mapping the factors to the weight matrix is rank-deficient, so the factor-space preconditioner $J_G^* F_t...$
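For readers less familiar with the setup the abstract assumes, a minimal sketch of the LoRA reparameterization it refers to is below. The shapes, names, and random initialization are illustrative choices, not details from the paper; the point is only that the update $\Delta W = BA$ has rank at most $r$, which is why the generator's Jacobian $J_G$ is rank-deficient.

```python
import numpy as np

# Illustrative sketch (not from the paper): LoRA factors a weight update
# Delta W (d_out x d_in) as B @ A, where A is (r x d_in) and B is (d_out x r).
rng = np.random.default_rng(0)
d_out, d_in, r = 8, 6, 2

A = rng.standard_normal((r, d_in)) * 0.01   # down-projection factor
B = rng.standard_normal((d_out, r)) * 0.01  # up-projection (zero-initialized
                                            # in standard LoRA; random here so
                                            # the rank bound is visible)
W0 = rng.standard_normal((d_out, d_in))     # frozen pretrained weight

delta_W = B @ A        # low-rank update
W = W0 + delta_W       # effective weight used in the forward pass

# The generator G(A, B) = B @ A maps the factor parameters into the space of
# d_out x d_in matrices, but its image consists of matrices with
# rank(B @ A) <= r < min(d_out, d_in). This rank deficiency of G (and hence
# of its Jacobian J_G) is what the factor-space preconditioner in the
# abstract has to contend with.
assert np.linalg.matrix_rank(delta_W) <= r
print(delta_W.shape)
```

Because $r \ll \min(d_{\text{out}}, d_{\text{in}})$ in practice, the pullback of a full-space preconditioner through $J_G$ cannot be a straightforward inverse, which motivates the Adafactor-style construction named in the title.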
