Research · 2026-05-07

DMGD: Train-Free Dataset Distillation with Semantic-Distribution Matching in Diffusion Models

Source: arXiv cs.AI

arXiv:2605.03877v1 Announce Type: cross

Abstract: Dataset distillation enables efficient training by distilling the information of large-scale datasets into significantly smaller synthetic datasets. Diffusion-based paradigms have emerged in recent years, offering novel perspectives for dataset...
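The abstract's core idea, matching the distribution of a small synthetic set to that of the full dataset, can be illustrated with a classic training-free heuristic: greedily selecting samples whose running feature mean tracks the full-set feature mean (kernel-herding style). This is a minimal sketch of generic distribution matching, not the paper's DMGD method; `herding_select`, the toy features, and all parameters are illustrative assumptions.

```python
import numpy as np

def herding_select(features, k):
    """Greedily pick k samples whose running mean best matches the
    full-set mean. A generic distribution-matching heuristic
    (herding-style); NOT the DMGD method from the paper."""
    target = features.mean(axis=0)          # distribution statistic to match
    selected = []
    running_sum = np.zeros_like(target)
    remaining = list(range(len(features)))
    for t in range(k):
        # choose the candidate that pulls the running mean closest to target
        best, best_dist = None, np.inf
        for i in remaining:
            cand_mean = (running_sum + features[i]) / (t + 1)
            d = np.linalg.norm(cand_mean - target)
            if d < best_dist:
                best, best_dist = i, d
        selected.append(best)
        running_sum += features[best]
        remaining.remove(best)
    return selected

# Toy "real" dataset: 100 samples with 16-dim features (stand-in for
# features from a pretrained encoder or diffusion model).
rng = np.random.default_rng(0)
feats = rng.normal(size=(100, 16))

idx = herding_select(feats, 10)             # distilled subset of 10 samples
gap = np.linalg.norm(feats[idx].mean(axis=0) - feats.mean(axis=0))
```

In diffusion-based distillation the same matching objective is typically applied in a semantic feature space, and the distilled set is generated rather than selected; the selection variant above just makes the "match the statistics" objective concrete without any training.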

Tags: arxiv, papers, image-generation