Research · 2026-05-12
Rethinking Random Transformers as Adaptive Sequence Smoothers for Sleep Staging
Source: Arxiv CS.AI
arXiv:2605.09905v1 (Announce Type: cross)

Abstract: Automatic sleep staging commonly adopts Transformers under the assumption that they learn complex long-range dependencies. We challenge this view by revealing a neglected property of sleep sequences: strong local temporal continuity. We show that a...