Communication-Efficient Federated Learning: A Systematic Review of Model Compression and Aggregation Techniques

Authors

  • Femi Temitope Johnson, Department of Computer Science, Federal University of Agriculture, Nigeria https://orcid.org/0000-0002-5467-855X
  • Elugbadebo Oladapo, Computer Science Department, Federal College of Education, Nigeria
  • Olukumoro Olugbenga, Department of Computer Science, Yaba College of Technology, Nigeria https://orcid.org/0000-0002-0060-2613
  • Alomaja Victor, Department of Computer Science, Yaba College of Technology, Nigeria
  • Akande Adenike, Computer Science Department, Federal College of Education, Nigeria

DOI:

https://doi.org/10.47852/bonviewAIA52026275

Keywords:

federated learning, communication efficiency, model compression, gradient aggregation, edge computing

Abstract

Federated Learning (FL) has emerged as a transformative paradigm for decentralized machine learning, enabling collaborative model training across distributed devices while preserving data privacy. However, high communication overhead during model updates remains a major obstacle to real-world deployment. This paper presents a systematic literature review that identifies and evaluates communication-efficient techniques in FL, with a focus on model compression and aggregation methods. Using a PRISMA-based framework, we reviewed 65 peer-reviewed studies published between 2018 and 2024 from IEEE Xplore, ACM Digital Library, and Scopus. The analysis reveals key contrasts between research contexts: 92% of industry-led studies emphasize reproducibility and deployment readiness, compared to 61% of academic publications. Among the most effective methods, quantization techniques offer 40–70% bandwidth savings, while advanced aggregation strategies such as FedProx and SCAFFOLD enhance performance under data heterogeneity. Despite these advances, fundamental trade-offs remain between compression ratio, convergence speed, and model accuracy. Additionally, reproducibility is hindered by inconsistent benchmarks and limited open-source tools. Gaps in energy-efficient protocols, sustainability metrics, and cross-domain adaptability further constrain deployment in edge and IoT environments. This review provides a foundation for developing scalable, reproducible, efficient, and communication-aware FL systems for practical applications.
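To make the bandwidth argument concrete, the following is a minimal, illustrative sketch (not the reviewed papers' implementations) of uniform quantization applied to a client's model update before upload. Sending 8-bit integer codes plus two float parameters instead of raw float32 values shrinks the payload by roughly 75%, consistent in spirit with the 40–70% savings the review attributes to quantization methods. All function names here are hypothetical.

```python
import numpy as np

def quantize(update, num_bits=8):
    """Uniformly quantize a float32 update vector to num_bits-integer codes.

    Returns the codes plus the (scale, offset) pair a server needs to
    dequantize. Illustrative sketch only; real FL systems add details
    such as stochastic rounding or error feedback.
    """
    lo, hi = float(update.min()), float(update.max())
    levels = 2 ** num_bits - 1
    scale = (hi - lo) / levels if hi > lo else 1.0
    codes = np.round((update - lo) / scale).astype(np.uint8)
    return codes, scale, lo

def dequantize(codes, scale, lo):
    """Reconstruct an approximate float32 update from the integer codes."""
    return codes.astype(np.float32) * scale + lo

# A client-side update (e.g., gradients) before upload.
update = np.random.randn(1000).astype(np.float32)
codes, scale, lo = quantize(update)

# Reconstruction error is bounded by half a quantization step.
recon = dequantize(codes, scale, lo)
assert np.max(np.abs(recon - update)) <= scale / 2 + 1e-6

# Payload: 1000 bytes of uint8 codes vs 4000 bytes of float32 values.
print(codes.nbytes, update.nbytes)
```

The trade-off the abstract highlights is visible here: fewer bits mean a smaller payload but a coarser reconstruction, which can slow convergence or cost accuracy.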



Received: 27 May 2025 | Revised: 8 August 2025 | Accepted: 17 October 2025

 

Conflicts of Interest

The authors declare that they have no conflicts of interest regarding this work.

 

Data Availability Statement

Data sharing is not applicable to this article as no new data were created or analyzed in this study.

 

Author Contribution Statement

Femi Temitope Johnson: Conceptualization, Methodology, Software, Validation, Formal analysis, Investigation, Resources, Data curation, Writing – original draft, Writing – review & editing, Visualization, Supervision, Project administration. Elugbadebo Oladapo: Methodology, Validation, Formal analysis, Investigation, Resources, Data curation, Writing – original draft, Visualization, Supervision. Olukumoro Olugbenga: Software, Formal analysis, Resources, Data curation, Visualization, Supervision, Project administration. Alomaja Victor: Validation, Investigation, Resources, Data curation, Writing – review & editing, Visualization, Supervision. Akande Adenike: Methodology, Software, Resources, Data curation, Writing – original draft, Writing – review & editing, Visualization, Supervision, Project administration.



Published

2025-11-15

Section

Review

How to Cite

Johnson, F. T., Oladapo, E., Olugbenga, O., Victor, A., & Adenike, A. (2025). Communication-Efficient Federated Learning: A Systematic Review of Model Compression and Aggregation Techniques. Artificial Intelligence and Applications. https://doi.org/10.47852/bonviewAIA52026275