Research · 2026-05-11
Structural Rationale Distillation via Reasoning Space Compression
Source: Arxiv CS.AI
arXiv:2605.07139v1 (Announce Type: cross)

Abstract: When distilling reasoning from large language models (LLMs) into smaller ones, teacher rationales for similar problems often vary wildly in structure and strategy. Like a chef who makes the same dish differently each time, this inconsistency burdens...