Research | 2026-04-20
(1D) Ordered Tokens Enable Efficient Test-Time Search
Source: arXiv cs.AI
arXiv:2604.15453v1 | Announce Type: cross
Abstract: Tokenization is a key component of autoregressive (AR) generative models, converting raw data into more manageable units for modeling. Commonly, tokens describe local information, such as regions of pixels in images or word pieces in text, and AR...
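To make the "local tokens" notion from the abstract concrete, here is a minimal sketch of a generic patch tokenizer that splits an image into non-overlapping pixel regions and flattens them into a 1D raster-ordered sequence. This is a common illustration of local tokenization for AR image models, not the paper's method; the function name `patch_tokenize` and all parameters are hypothetical.

```python
# Illustrative sketch only: a generic "local" tokenizer that turns an image
# into a 1D sequence of patch tokens in raster order. Not the paper's method.
import numpy as np

def patch_tokenize(image: np.ndarray, patch: int = 8) -> np.ndarray:
    """Split an (H, W, C) image into a raster-ordered sequence of patch tokens.

    Returns an array of shape (num_patches, patch * patch * C), where each
    row is one flattened patch, i.e. a token describing local pixel content.
    """
    h, w, c = image.shape
    assert h % patch == 0 and w % patch == 0, "image must tile evenly"
    # Reshape into a grid of patches, then flatten each patch into one token.
    grid = image.reshape(h // patch, patch, w // patch, patch, c)
    grid = grid.transpose(0, 2, 1, 3, 4)        # (gh, gw, patch, patch, c)
    return grid.reshape(-1, patch * patch * c)  # raster-ordered token sequence

if __name__ == "__main__":
    img = np.random.randint(0, 256, size=(32, 32, 3), dtype=np.uint8)
    tokens = patch_tokenize(img, patch=8)
    print(tokens.shape)  # (16, 192): a 4x4 grid of 8x8x3 patches
```

An AR model would then consume such a sequence one token at a time; the paper's contribution concerns what happens when the tokens carry a different, ordered structure rather than this purely local one.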