Research · 2026-04-22
Where Fake Citations Are Made: Tracing Field-Level Hallucination to Specific Neurons in LLMs
Source: arXiv cs.AI
arXiv:2604.18880v1 Announce Type: cross Abstract: LLMs frequently generate fictitious yet convincing citations, often expressing high confidence even when the underlying reference is wrong. We study this failure across 9 models and 108,000 generated references, and find that author names fail far...