What is it about?
This paper introduces OFedED, a framework for one-shot federated learning (FL). Instead of traditional FL's multi-round exchanges, OFedED combines dataset distillation, which compresses each client's local data into a small synthetic dataset that captures key features via layer-wise matching, with server-side model ensembling. Clients train local models, distill their data under the guidance of a global ensemble, and upload only the synthetic data. The server aggregates these data, generates additional samples using a weighted ensemble (to handle class imbalance) and a class-center loss (to align features), and then trains a global model whose performance approaches that of centralized training. The paper also provides theoretical proofs that bound the gap to ideal centralized training under mild assumptions.
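The one-shot flow above can be illustrated with a toy simulation. This is a simplified sketch, not the paper's method: the layer-wise matching loss is replaced by matching per-class feature means, and the server's ensemble-guided training is replaced by a nearest-centroid classifier; all function names and data here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def distill(X, y, n_per_class=2, steps=200, lr=0.5):
    # Crude stand-in for distillation: move randomly initialized
    # synthetic points until their mean matches each class's
    # feature mean in the real local data.
    syn_X, syn_y = [], []
    for c in np.unique(y):
        target = X[y == c].mean(axis=0)
        s = rng.normal(size=(n_per_class, X.shape[1]))
        for _ in range(steps):
            # gradient step on || mean(s) - target ||^2
            s -= lr * 2.0 * (s.mean(axis=0) - target) / n_per_class
        syn_X.append(s)
        syn_y.extend([c] * n_per_class)
    return np.vstack(syn_X), np.array(syn_y)

def make_client(shift, n=200, d=4):
    # Two Gaussian classes; 'shift' injects mild client heterogeneity.
    X = np.vstack([rng.normal(-2.0, 1.0, (n, d)),
                   rng.normal(+2.0, 1.0, (n, d))]) + shift
    y = np.array([0] * n + [1] * n)
    return X, y

# One communication round: each client uploads only its tiny synthetic set.
clients = [make_client(s) for s in (-0.3, 0.0, 0.3)]
syn_sets = [distill(X, y) for X, y in clients]
pool_X = np.vstack([s for s, _ in syn_sets])
pool_y = np.concatenate([t for _, t in syn_sets])

# Server side: train on the pooled synthetic data (here a simple
# nearest-centroid classifier stands in for ensemble-guided training).
classes = np.unique(pool_y)
centroids = np.stack([pool_X[pool_y == c].mean(axis=0) for c in classes])

def predict(X):
    dists = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
    return classes[np.argmin(dists, axis=1)]

X_test, y_test = make_client(0.0)
acc = (predict(X_test) == y_test).mean()
```

Even with only a handful of synthetic points per client, the server-side model separates the classes well on this toy problem, which is the intuition behind trading repeated model exchanges for a single upload of distilled data.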
Why is it important?
One-shot FL addresses real-world bottlenecks in standard FL, such as high communication overhead and the security risks of repeated exchanges, which limit scalability on edge devices (e.g., mobile phones and IoT hardware). OFedED achieves near-centralized accuracy while handling data heterogeneity and ensuring differential privacy via noise addition. This enables efficient, robust AI training in privacy-sensitive domains such as healthcare and finance, without sharing raw data, and scales across diverse model architectures, making collaborative ML more practical and secure.
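The summary says privacy is ensured "via noise addition" without specifying the mechanism; a standard way to do this is the Gaussian mechanism, sketched below purely as an illustration (the sensitivity, epsilon, and delta values are hypothetical, not taken from the paper).

```python
import numpy as np

def gaussian_mechanism(x, sensitivity, epsilon, delta, rng):
    # Classic Gaussian mechanism: noise scale calibrated so the
    # release is (epsilon, delta)-differentially private, assuming
    # the stated L2 sensitivity of x.
    sigma = sensitivity * np.sqrt(2.0 * np.log(1.25 / delta)) / epsilon
    return x + rng.normal(scale=sigma, size=x.shape)

rng = np.random.default_rng(0)
syn = np.ones((4, 8))  # toy synthetic data a client is about to upload
noisy = gaussian_mechanism(syn, sensitivity=1.0, epsilon=1.0, delta=1e-5, rng=rng)
```

Adding calibrated noise before upload means the server never sees the exact distilled points, at the cost of some accuracy; tighter privacy budgets (smaller epsilon) mean larger noise.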
Perspectives
From a technical viewpoint, OFedED bridges the gap between FL and dataset distillation by proving that distilled data can match centralized performance even within a single communication round, opening the door to hybrid methods that blend data synthesis with foundation models. More broadly, it could help democratize AI access in low-bandwidth settings. Future work could extend the approach to continual learning or integrate it with blockchain for trust, potentially accelerating edge AI adoption while raising questions about synthetic data's long-term fidelity under real-world drift.
Xuhui Li
Mohamed bin Zayed University of Artificial Intelligence
Read the Original
This page is a summary of: OFedED: One-shot Federated Learning with Model Ensemble and Dataset Distillation, November 2025, ACM (Association for Computing Machinery). DOI: 10.1145/3746252.3761309.