BeClaude
Research 2026-05-12

Positional Encoding via Token-Aware Phase Attention

Source: arXiv cs.AI

arXiv:2509.12635v3 Abstract: We prove, under practical assumptions, that Rotary Positional Embedding (RoPE) introduces an intrinsic distance-dependent bias in attention scores that limits RoPE's ability to model long contexts. RoPE extension methods may alleviate this...
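To make the distance-dependent bias concrete, here is a minimal NumPy sketch of standard RoPE (split-half pairing, base 10000; the probe with an identical query and key is illustrative and not from the paper, whose exact setup may differ). With content held fixed, any change in the attention logit across offsets is purely positional:

```python
import numpy as np

def rope(x, pos, base=10000.0):
    """Rotate feature pairs of x by position-dependent angles.

    Split-half pairing is one common RoPE convention; interleaved
    pairing is also used in practice.
    """
    d = x.shape[-1]
    half = d // 2
    theta = base ** (-2.0 * np.arange(half) / d)   # per-pair frequencies
    ang = pos * theta
    cos, sin = np.cos(ang), np.sin(ang)
    x1, x2 = x[:half], x[half:]
    return np.concatenate([x1 * cos - x2 * sin, x1 * sin + x2 * cos])

rng = np.random.default_rng(0)
d = 64
q = rng.standard_normal(d)
k = q.copy()  # identical content: score differences are purely positional

# The RoPE logit between a query at position m and a key at position n
# depends only on the offset m - n, so sweeping the offset exposes the bias.
for dist in [0, 1, 4, 16, 64, 256, 1024]:
    score = rope(q, dist) @ rope(k, 0)
    print(f"offset {dist:5d}: logit {score:8.2f}")
```

Under this setup the logit reduces to a sum of cosines, one per feature pair, whose envelope shrinks as the offset grows; content-identical tokens therefore score lower at larger distances, which is the kind of intrinsic bias the abstract describes.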

arxivpapers