BeClaude
Research · 2026-04-28

Loop Corrections to the Training Error and Generalization Gap of Random Feature Models

Source: Arxiv CS.AI

arXiv:2604.12827v2 (announce type: replace-cross)

Abstract: We investigate random feature models in which neural networks sampled from a prescribed initialization ensemble are frozen and used as random features, with only the readout weights optimized. Adopting a statistical-physics viewpoint, we...
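The setup the abstract describes can be sketched as follows: a hidden layer is sampled once from an initialization ensemble and frozen, and only the linear readout is fit. This is a minimal illustrative sketch, not the paper's construction; the Gaussian ensemble, `tanh` nonlinearity, ridge readout, and all dimensions are assumptions chosen for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: n samples, d-dimensional inputs, scalar targets
# (purely synthetic, for illustration only).
n, d, width = 200, 10, 500
X = rng.standard_normal((n, d))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(n)

# "Frozen network": hidden weights sampled from a Gaussian
# initialization ensemble and never trained.
W = rng.standard_normal((d, width)) / np.sqrt(d)
features = np.tanh(X @ W)  # random features phi(X)

# Only the readout weights are optimized; here via ridge regression.
lam = 1e-3
a = np.linalg.solve(features.T @ features + lam * np.eye(width),
                    features.T @ y)

train_error = np.mean((features @ a - y) ** 2)
```

In this overparameterized regime (width > n) the training error is driven close to zero, which is the kind of quantity whose finite-size corrections the paper analyzes alongside the generalization gap.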
