Communication-Efficient Federated Learning: A Systematic Review of Model Compression and Aggregation Techniques
DOI: https://doi.org/10.47852/bonviewAIA52026275

Keywords: federated learning, communication efficiency, model compression, gradient aggregation, edge computing

Abstract
Federated Learning (FL) has emerged as a transformative paradigm for decentralized machine learning, enabling collaborative model training across distributed devices while preserving data privacy. However, the high communication overhead of exchanging model updates remains a major obstacle to real-world deployment. This paper presents a systematic literature review that identifies and evaluates communication-efficient techniques in FL, focusing on model compression and aggregation methods. Using a PRISMA-based framework, we reviewed 65 peer-reviewed studies published between 2018 and 2024 from IEEE Xplore, the ACM Digital Library, and Scopus. The analysis reveals key contrasts between research contexts: 92% of industry-led studies emphasize reproducibility and deployment readiness, compared with 61% of academic publications. Among the most effective methods, quantization techniques offer 40–70% bandwidth savings, while advanced aggregation strategies such as FedProx and SCAFFOLD improve performance under data heterogeneity. Despite these advances, fundamental trade-offs remain among compression ratio, convergence speed, and model accuracy. In addition, reproducibility is hindered by inconsistent benchmarks and limited open-source tooling. Gaps in energy-efficient protocols, sustainability metrics, and cross-domain adaptability further constrain deployment in edge and IoT environments. This review provides a foundation for developing scalable, reproducible, efficient, and communication-aware FL systems for practical applications.
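To make the bandwidth claim concrete, the following is a minimal illustrative sketch (not drawn from the reviewed studies) of 8-bit min–max quantization, the kind of update compression the abstract credits with 40–70% uplink savings: casting a float32 model update to uint8 shrinks the payload fourfold at the cost of a bounded reconstruction error. All function names here are hypothetical.

```python
import numpy as np

def quantize(update: np.ndarray, bits: int = 8):
    """Map float32 values onto 2**bits evenly spaced levels (illustrative)."""
    lo, hi = update.min(), update.max()
    scale = (hi - lo) / (2**bits - 1) or 1.0  # guard against a constant update
    q = np.round((update - lo) / scale).astype(np.uint8)
    return q, float(lo), float(scale)

def dequantize(q: np.ndarray, lo: float, scale: float) -> np.ndarray:
    """Reconstruct an approximate float32 update on the server side."""
    return q.astype(np.float32) * scale + lo

# Simulate one client's model update and its compressed transmission.
update = np.random.randn(10_000).astype(np.float32)
q, lo, scale = quantize(update)
restored = dequantize(q, lo, scale)

# float32 -> uint8 sends 4x fewer bytes (75% savings for the payload,
# ignoring the two float metadata values), with error bounded by scale/2.
savings = 1 - q.nbytes / update.nbytes
print(f"bandwidth saved: {savings:.0%}")  # -> bandwidth saved: 75%
print(f"max reconstruction error: {np.abs(update - restored).max():.4f}")
```

In practice the per-tensor `lo`/`scale` pair is sent alongside the quantized payload, and coarser bit widths (or sparsification on top) trade further savings against convergence speed, the trade-off the review highlights.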
Received: 27 May 2025 | Revised: 8 August 2025 | Accepted: 17 October 2025
Conflicts of Interest
The authors declare that they have no conflicts of interest in this work.
Data Availability Statement
Data sharing is not applicable to this article as no new data were created or analyzed in this study.
Author Contribution Statement
Femi Temitope Johnson: Conceptualization, Methodology, Software, Validation, Formal analysis, Investigation, Resources, Data curation, Writing – original draft, Writing – review & editing, Visualization, Supervision, Project administration. Elugbadebo Oladapo: Methodology, Validation, Formal analysis, Investigation, Resources, Data curation, Writing – original draft, Visualization, Supervision. Olukumoro Olugbenga: Software, Formal analysis, Resources, Data curation, Visualization, Supervision, Project administration. Alomaja Victor: Validation, Investigation, Resources, Data curation, Writing – review & editing, Visualization, Supervision. Akande Adenike: Methodology, Software, Resources, Data curation, Writing – original draft, Writing – review & editing, Visualization, Supervision, Project administration.
License
Copyright (c) 2025 Authors

This work is licensed under a Creative Commons Attribution 4.0 International License.