Research 2026-04-27

MambaCSP: Hybrid-Attention State Space Models for Hardware-Efficient Channel State Prediction

Source: arXiv cs.AI

arXiv:2604.21957v1 (announce type: cross)

Abstract: Recent works have demonstrated that attention-based transformer and large language model (LLM) architectures can achieve strong channel state prediction (CSP) performance by capturing long-range temporal dependencies across channel state information...
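The abstract frames channel state prediction as sequence modeling with state space models. As a toy illustration only (this assumes nothing about MambaCSP's actual architecture, dimensions, or parameterization), a minimal linear SSM scan over a synthetic channel sequence might look like:

```python
import numpy as np

def ssm_scan(x, A, B, C):
    """Run a discrete linear state space model over a 1-D input sequence.

    Recurrence (hypothetical toy model, not the paper's):
        h[t] = A @ h[t-1] + B * x[t]
        y[t] = C @ h[t]
    """
    h = np.zeros(A.shape[0])
    ys = []
    for xt in x:
        h = A @ h + B * xt      # state update from previous hidden state
        ys.append(C @ h)        # scalar readout of the hidden state
    return np.array(ys)

rng = np.random.default_rng(0)
d = 4                                # assumed hidden state dimension
A = 0.9 * np.eye(d)                  # stable (contracting) state transition
B = rng.standard_normal(d)           # input projection
C = rng.standard_normal(d)           # output projection

# x: a short synthetic CSI magnitude sequence (stand-in for real channel data)
x = np.sin(0.3 * np.arange(32))
y = ssm_scan(x, A, B, C)
print(y.shape)  # → (32,)
```

The linear recurrence is what lets SSM-style models process long channel histories with cost linear in sequence length, in contrast to the quadratic attention the abstract mentions.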
