Research · 2026-05-06
SMoE: An Algorithm-System Co-Design for Pushing MoE to the Edge via Expert Substitution
Source: arXiv cs.AI
arXiv:2508.18983v3 (announce type: replace)

Abstract: The Mixture of Experts (MoE) architecture has emerged as a key technique for scaling Large Language Models by activating only a subset of experts per query. Deploying MoE on consumer-grade edge hardware, however, is constrained by limited device...
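To make the "activating only a subset of experts per query" mechanism concrete, here is a minimal sketch of standard top-k MoE routing. This is generic background, not the paper's SMoE method or its expert-substitution scheme; the class name `TopKMoE`, the feed-forward expert shape, and `k=2` are illustrative assumptions. Note that although only k experts run per token, all experts must stay resident in memory, which is exactly the edge-deployment constraint the abstract points to.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TopKMoE(nn.Module):
    """Minimal sparse MoE layer: each token is routed to its top-k experts."""

    def __init__(self, d_model: int, n_experts: int, k: int = 2):
        super().__init__()
        self.k = k
        # Learned gate scores one logit per expert for every token.
        self.gate = nn.Linear(d_model, n_experts, bias=False)
        # Each expert is an independent feed-forward block (shape assumed).
        self.experts = nn.ModuleList(
            nn.Sequential(
                nn.Linear(d_model, 4 * d_model),
                nn.GELU(),
                nn.Linear(4 * d_model, d_model),
            )
            for _ in range(n_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (tokens, d_model)
        scores = self.gate(x)                        # (tokens, n_experts)
        weights, idx = scores.topk(self.k, dim=-1)   # top-k experts per token
        weights = F.softmax(weights, dim=-1)         # renormalise over chosen k
        out = torch.zeros_like(x)
        # Only the selected experts execute: compute is sparse per query,
        # but every expert's weights must still be held in device memory.
        for e, expert in enumerate(self.experts):
            rows, slots = (idx == e).nonzero(as_tuple=True)
            if rows.numel() == 0:
                continue  # this expert received no tokens in the batch
            out[rows] += weights[rows, slots, None] * expert(x[rows])
        return out

if __name__ == "__main__":
    moe = TopKMoE(d_model=64, n_experts=8, k=2)
    tokens = torch.randn(16, 64)
    print(moe(tokens).shape)  # torch.Size([16, 64])
```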