Beyond Aggregation: Efficient Federated Model Consolidation with Heterogeneity-Adaptive Weights Diffusion

The FedDiff framework

Abstract

As the Internet of Things (IoT) evolves, the need to share data across edge devices while preserving privacy has driven the adoption of Federated Learning (FL). However, communication cost remains a significant challenge in FL. Traditional methods focus on client-side enhancements but overlook server-side aggregation, potentially increasing clients' computation load. In response, we introduce FedDiff, a novel method that uses diffusion models to generate model weights on the FL server, replacing traditional aggregation. Tailored for heterogeneous environments, our approach significantly improves communication efficiency, achieving faster convergence and robust performance against weight noise in rigorous tests.
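To make the core idea concrete, here is a minimal, hypothetical sketch of server-side weight generation versus standard FedAvg aggregation. The names `fedavg` and `feddiff_consolidate` are illustrative, and the "denoiser" below is a toy linear stand-in (a noisy sample iteratively pulled toward the clients' mean), not the trained diffusion model described in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def fedavg(client_weights, client_sizes):
    """Baseline: dataset-size-weighted average of client weight vectors (FedAvg)."""
    sizes = np.asarray(client_sizes, dtype=float)
    return np.average(client_weights, axis=0, weights=sizes)

def feddiff_consolidate(client_weights, steps=50, noise_scale=0.1):
    """Toy stand-in for diffusion-style consolidation: start from a noisy
    weight sample and iteratively denoise it, conditioning each step on the
    client weights. The real method would use a trained diffusion model;
    here the "denoiser" is a simple drift toward the clients' mean with
    decaying injected noise."""
    target = np.mean(client_weights, axis=0)  # conditioning signal
    w = target + noise_scale * rng.standard_normal(target.shape)
    for t in range(steps):
        alpha = (t + 1) / steps                  # denoising schedule
        w = (1 - alpha) * w + alpha * target     # drift toward the condition
        w += (1 - alpha) * noise_scale * 0.1 * rng.standard_normal(w.shape)
    return w

# Three heterogeneous clients, each holding a 4-parameter "model"
clients = np.array([[1.0,  0.0, 0.5, 2.0],
                    [0.8,  0.2, 0.4, 1.8],
                    [1.2, -0.1, 0.6, 2.2]])
avg = fedavg(clients, [100, 50, 150])
gen = feddiff_consolidate(clients)
```

Because the toy schedule fully denoises on the last step, `gen` lands exactly on the clients' mean here; the point is only the control flow (sample, then iteratively refine on the server) rather than any claim about the paper's architecture.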

Type
Publication
In 33rd ACM International Conference on Information and Knowledge Management