BeClaude
Research · 2026-04-22

One Step Forward and K Steps Back: Better Reasoning with Denoising Recursion Models

Source: Arxiv CS.AI

arXiv:2604.18839v1 (announce type: cross)

Abstract: Looped transformers scale computational depth without increasing parameter count by repeatedly applying a shared transformer block, and can be used for iterative refinement, where each loop rewrites a full fixed-size prediction in parallel. On...
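The core looped-transformer idea in the abstract — one shared block reused across loops, with every pass rewriting the whole fixed-size prediction in parallel — can be sketched minimally. This is a toy illustration only: `shared_block` is a hypothetical stand-in (a simple per-position update), not the paper's transformer block or its denoising recursion.

```python
def shared_block(state):
    # Toy stand-in for the shared transformer block: nudges every
    # position of the prediction toward a fixed target in parallel.
    # (Illustrative only; not the paper's actual block.)
    target = [1.0] * len(state)
    return [s + 0.5 * (t - s) for s, t in zip(state, target)]

def looped_refine(init_state, num_loops):
    # Reapply the SAME block each loop: effective depth grows with
    # num_loops while the parameter count stays constant.
    state = init_state
    for _ in range(num_loops):
        state = shared_block(state)  # rewrites the full prediction
    return state

prediction = looped_refine([0.0, 0.0, 0.0], num_loops=4)
```

Each loop halves the remaining gap to the target, so more loops mean a more refined prediction at no extra parameter cost — the property the abstract highlights.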

Tags: arxiv, papers, reasoning