BeClaude
Research · 2026-04-30

Awakening Dormant Experts: Counterfactual Routing to Mitigate MoE Hallucinations

Source: Arxiv CS.AI

arXiv:2604.14246v2 Announce Type: replace-cross

Abstract: Sparse Mixture-of-Experts (MoE) models have achieved remarkable scalability, yet they remain vulnerable to hallucinations, particularly when processing long-tail knowledge. We identify that this fragility stems from static Top-$k$ routing: ...
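For context on the static Top-$k$ routing the abstract refers to: in a sparse MoE layer, a gating network scores every expert per token, but only the $k$ highest-scoring experts are activated and their outputs combined with renormalized gate weights. The sketch below is a generic NumPy illustration of that standard mechanism, not code from the paper; the function name and tensor shapes are assumptions.

```python
import numpy as np

def top_k_route(gate_logits, k=2):
    """Standard static Top-k routing: pick the k highest-scoring experts
    per token and renormalize their gate probabilities over that subset.

    gate_logits: array of shape (num_tokens, num_experts)
    returns: (topk_idx, topk_w), each of shape (num_tokens, k)
    """
    # Softmax over experts (shifted for numerical stability).
    probs = np.exp(gate_logits - gate_logits.max(axis=-1, keepdims=True))
    probs /= probs.sum(axis=-1, keepdims=True)
    # Indices of the k largest gate probabilities, in descending order.
    topk_idx = np.argsort(probs, axis=-1)[:, ::-1][:, :k]
    topk_w = np.take_along_axis(probs, topk_idx, axis=-1)
    # Renormalize so the selected experts' weights sum to 1 per token.
    topk_w /= topk_w.sum(axis=-1, keepdims=True)
    return topk_idx, topk_w

# One token routed among 4 experts; experts 0 and 1 win.
logits = np.array([[2.0, 1.0, 0.1, -1.0]])
idx, w = top_k_route(logits, k=2)
```

Because the selection is a hard, input-independent rule (always exactly the top $k$), rarely chosen "dormant" experts receive no signal at inference time, which is the fragility on long-tail knowledge that the abstract attributes to static routing.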
