Research | 2026-05-07
Ortho-Hydra: Orthogonalized Experts for DiT LoRA
Source: arXiv cs.AI
arXiv:2605.03252v1 | Announce Type: cross
Abstract: LoRA fine-tuning of diffusion transformers (DiT) on multi-style data suffers from "style bleed": a single low-rank residual cannot represent several distinct artist fingerprints, and the optimizer converges to their average. Mixture-of-experts...
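The abstract is truncated before the method details, but the title suggests per-style LoRA experts kept in mutually orthogonal subspaces. A minimal sketch of that general idea, assuming the common formulation in which each expert contributes a low-rank residual B_i A_i and an orthogonality penalty discourages overlap between the experts' row spaces (the names `lora_forward` and `ortho_penalty` are illustrative, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
d, r = 64, 4  # hidden width and LoRA rank (illustrative sizes)

# Frozen base weight plus two per-style LoRA experts, each a low-rank
# residual B_i @ A_i. Standard LoRA init: B starts at zero.
W0 = rng.normal(size=(d, d))
A = [rng.normal(size=(r, d)) for _ in range(2)]
B = [np.zeros((d, r)) for _ in range(2)]

def lora_forward(x, expert):
    """Apply the frozen base weight plus the chosen expert's residual."""
    return x @ (W0 + B[expert] @ A[expert]).T

def ortho_penalty(A1, A2):
    """Squared Frobenius overlap between the row spaces of two experts.
    Driving this toward zero pushes the experts into orthogonal
    subspaces, so one style's update cannot average into another's."""
    return np.linalg.norm(A1 @ A2.T, "fro") ** 2

x = rng.normal(size=(1, d))
y = lora_forward(x, expert=0)   # shape (1, 64)
p = ortho_penalty(A[0], A[1])   # non-negative scalar, added to the loss
```

In training, this penalty would be added (scaled by a hyperparameter) to the diffusion loss, with only the active expert's A and B updated per sample.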