Research 2026-05-11
Dooly: Configuration-Agnostic, Redundancy-Aware Profiling for LLM Inference Simulation
Source: arXiv cs.AI
arXiv:2605.07985v1 Announce Type: cross

Abstract: Selecting the optimal LLM inference configuration requires evaluation across hardware, serving engines, attention backends, and model architectures, since no single choice performs best across all workloads. Profile-based simulators are the standard...