Research 2026-04-22

SEAT: Sparse Entity-Aware Tuning for Knowledge Adaptation while Preserving Epistemic Abstention

Source: Arxiv CS.AI

arXiv:2506.14387v3 (announce type: replace)

Abstract: Adapting LLMs to new knowledge is increasingly important, but standard fine-tuning often erodes aligned epistemic abstention: the model's ability to acknowledge when it does not know. This failure mode is especially concerning in high-stakes...
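The abstract does not describe SEAT's mechanics, but "sparse entity-aware tuning" suggests restricting updates to a small, entity-relevant subset of parameters so that the rest of the aligned model stays untouched. The sketch below is purely illustrative of that general idea, not the paper's method; the names `sparse_update` and `entity_mask` are hypothetical.

```python
# Illustrative sketch (NOT the paper's algorithm): a gradient step that
# touches only parameters selected by an entity-relevance mask, freezing
# the rest. Freezing most weights is one plausible way to limit how much
# adaptation disturbs prior aligned behavior such as abstention.

def sparse_update(params, grads, mask, lr=0.1):
    """Apply lr * grad only where mask is True; freeze the rest."""
    return [p - lr * g if m else p
            for p, g, m in zip(params, grads, mask)]

params = [1.0, 2.0, 3.0, 4.0]
grads = [0.5, 0.5, 0.5, 0.5]
# Suppose only the first two parameters were flagged as entity-relevant.
entity_mask = [True, True, False, False]

updated = sparse_update(params, grads, entity_mask)
print(updated)  # masked-out parameters remain exactly unchanged
```

In a real LLM setting the mask would live over model weight tensors (e.g. selected rows of embedding or MLP matrices), but the principle is the same: the update's support is sparse and chosen per entity.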
