BeClaude Research
2026-05-12

Model Merging Scaling Laws in Large Language Models

Source: Arxiv CS.AI

arXiv:2509.24244v4 (replacement)

Abstract: We study empirical scaling laws for language model merging, measured by cross-entropy. Despite its wide practical use, merging lacks a quantitative rule that predicts returns as we add experts or scale the model size. We identify a compact power...
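The abstract is truncated before the paper's actual functional form, so the sketch below is only a hedged illustration of what "a compact power law" for cross-entropy could look like: it assumes a hypothetical form L(k) = a * k^(-b) in the number of merged experts k (the names `a_true`, `b_true`, and this form itself are assumptions, not the paper's law) and recovers the exponent by a log-log linear fit.

```python
import numpy as np

# Hypothetical power law: the paper's exact form is truncated in the
# snippet above, so we assume L(k) = a * k^(-b) purely for illustration.
a_true, b_true = 2.0, 0.35
k = np.arange(1, 17)                        # number of merged experts
loss = a_true * k.astype(float) ** -b_true  # synthetic cross-entropy values

# Fit in log-log space: log L = log a - b * log k is linear in log k.
slope, intercept = np.polyfit(np.log(k), np.log(loss), deg=1)
a_fit, b_fit = np.exp(intercept), -slope
print(f"fitted a={a_fit:.3f}, b={b_fit:.3f}")
```

On noiseless synthetic data the fit recovers the generating parameters exactly; with real merging measurements one would instead fit the paper's reported functional form to observed cross-entropy values.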

arxivpapers