Research 2026-05-01

Why Self-Supervised Encoders Want to Be Normal

Source: arXiv cs.AI

arXiv:2604.27743v1 (Announce Type: cross)

Abstract: We develop a geometric and information-theoretic framework for encoder-decoder learning built on the Information Bottleneck (IB) principle. Recasting IB as a rate-distortion problem with Kullback-Leibler (KL) divergence as distortion, we show that...
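As background for the abstract's framing (this is the standard IB formulation from the literature, not text recovered from the truncated abstract), the IB objective and its rate-distortion reading are usually written as:

```latex
% Standard Information Bottleneck Lagrangian:
% compress X into a representation Z while keeping Z predictive of Y.
\min_{p(z \mid x)} \; I(X;Z) \;-\; \beta \, I(Z;Y)

% Equivalent rate-distortion view, with KL divergence as the distortion
% measure, which appears to be the recasting the abstract refers to:
R(D) \;=\; \min_{\substack{p(z \mid x) \,:\, \\ \mathbb{E}\left[ D_{\mathrm{KL}}\!\left( p(y \mid x) \,\|\, p(y \mid z) \right) \right] \,\le\, D}} I(X;Z)
```

Here the rate is the mutual information I(X;Z) between input and representation, and the distortion is the expected KL divergence between the true predictive distribution p(y|x) and the one induced by the compressed code, p(y|z).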

arxivpapers