Research 2026-04-23
Membership Inference for Contrastive Pre-training Models with Text-only PII Queries
Source: arXiv cs.AI
arXiv:2603.14222v2 Announce Type: replace-cross Abstract: Contrastive pre-training models such as CLIP and CLAP serve as the ubiquitous perceptual backbones for modern multimodal large models, yet their reliance on web-scale data raises growing concerns about memorizing Personally Identifiable...
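The truncated abstract does not show the paper's actual attack, but membership inference against a contrastive embedding space is commonly framed as a similarity-threshold test: embed the query, compare it against candidate embeddings, and flag high similarity as likely memorization. The sketch below is a generic, hypothetical illustration of that idea in pure Python (the function names, toy vectors, and threshold are all assumptions, not the paper's method):

```python
import math

def cosine(u, v):
    """Cosine similarity between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def membership_score(query_emb, candidate_embs):
    """Max similarity of a text-query embedding against candidate embeddings
    from the model's shared contrastive space."""
    return max(cosine(query_emb, e) for e in candidate_embs)

def infer_membership(score, threshold=0.8):
    """Hypothetical decision rule: a high similarity score suggests the
    queried record was seen during pre-training."""
    return score >= threshold

# Toy example: one query embedding that nearly matches a "training" embedding,
# and one that matches nothing.
train_embs = [[0.99, 0.1, 0.0], [0.0, 1.0, 0.0]]
print(infer_membership(membership_score([1.0, 0.0, 0.0], train_embs)))  # True
print(infer_membership(membership_score([0.0, 0.0, 1.0], train_embs)))  # False
```

In a real attack the embeddings would come from the target model's text and image/audio encoders; the threshold would typically be calibrated on reference data rather than fixed.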
arxivpapers