Publication:
FedBWO: Enhancing Communication Efficiency in Federated Learning

dc.conference.location: Abu Dhabi; Marriott Downtown Abu Dhabi
dc.contributor.coauthor: Hayyolalam, Vahideh (57201271055)
dc.contributor.coauthor: Özkasap, Öznur (6602394621)
dc.date.accessioned: 2025-12-31T08:21:21Z
dc.date.available: 2025-12-31
dc.date.issued: 2025
dc.description.abstract: Federated Learning (FL) is a distributed Machine Learning (ML) setup in which a shared model is collaboratively trained by various clients on their local datasets while keeping the data private. On resource-constrained devices, FL clients often suffer from restricted transmission capacity, so the communication between clients and the server needs to be reduced to enhance system performance. Current FL strategies transmit a large amount of data (model weights) during the FL process, which requires high communication bandwidth. Under such resource constraints, increasing the number of clients, and consequently the amount of transmitted data (model weights), can lead to a bottleneck. In this paper, we introduce the Federated Black Widow Optimization (FedBWO) technique, which decreases the amount of transmitted data by having clients send only a performance score rather than their local model weights. FedBWO employs the BWO algorithm to improve local model updates. The conducted experiments show that FedBWO remarkably improves the performance of the global model and the communication efficiency of the overall system. According to the experimental outcomes, FedBWO enhances the global model accuracy by an average of 21% over FedAvg and 12% over FedGWO. Furthermore, FedBWO dramatically decreases the communication cost compared to other methods. © 2025 Elsevier B.V., All rights reserved.
dc.description.fulltext: Yes
dc.description.harvestedfrom: Manual
dc.description.indexedby: Scopus
dc.description.publisherscope: International
dc.description.readpublish: N/A
dc.description.sponsoredbyTubitakEu: TÜBİTAK
dc.description.sponsorship: Türkiye Bilimsel ve Teknolojik Araştırma Kurumu, TUBITAK; (121C338)
dc.identifier.doi: 10.1109/ICHMS65439.2025.11154363
dc.identifier.embargo: No
dc.identifier.endpage: 268
dc.identifier.isbn: 9798331521646
dc.identifier.quartile: N/A
dc.identifier.scopus: 2-s2.0-105017769592
dc.identifier.startpage: 263
dc.identifier.uri: https://doi.org/10.1109/ICHMS65439.2025.11154363
dc.identifier.uri: https://hdl.handle.net/20.500.14288/31578
dc.keywords: Artificial Intelligence
dc.keywords: Distributed Learning
dc.keywords: Meta-Heuristic
dc.keywords: Node selection
dc.keywords: Optimization
dc.language.iso: eng
dc.publisher: Institute of Electrical and Electronics Engineers Inc.
dc.relation.affiliation: Koç University
dc.relation.collection: Koç University Institutional Repository
dc.relation.ispartof: 5th IEEE International Conference on Human-Machine Systems, ICHMS 2025
dc.relation.openaccess: Yes
dc.rights: CC BY-NC-ND (Attribution-NonCommercial-NoDerivs)
dc.rights.uri: https://creativecommons.org/licenses/by-nc-nd/4.0/
dc.title: FedBWO: Enhancing Communication Efficiency in Federated Learning
dc.type: Conference Proceeding
dspace.entity.type: Publication
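To make the mechanism described in the abstract concrete, the following is a minimal Python sketch of one FedBWO-style round under stated assumptions: each client refines the shared model locally with a population-based metaheuristic search and uploads only a scalar performance score, and the server then pulls full weights from the best-scoring client only. The helper names (bwo_refine, local_score), the simplified survivor/offspring operators standing in for Black Widow Optimization, and the server-side selection rule are illustrative assumptions, not the paper's actual implementation.

    import numpy as np

    rng = np.random.default_rng(0)

    def local_score(weights, data, labels):
        # Client-side fitness: accuracy of a simple linear classifier on local data.
        preds = (data @ weights > 0).astype(int)
        return float((preds == labels).mean())

    def bwo_refine(weights, data, labels, pop_size=8, steps=5, sigma=0.1):
        # Toy stand-in for Black Widow Optimization of the local update: keep a
        # population of perturbed candidates, drop the unfit half ("cannibalism"),
        # and recombine survivors ("procreation"). The real BWO operators differ.
        best_w, best_s = weights, local_score(weights, data, labels)
        pop = [weights + rng.normal(0, sigma, weights.shape) for _ in range(pop_size)]
        for _ in range(steps):
            pop.sort(key=lambda w: local_score(w, data, labels), reverse=True)
            if local_score(pop[0], data, labels) > best_s:
                best_w, best_s = pop[0], local_score(pop[0], data, labels)
            survivors = pop[: pop_size // 2]
            children = [0.5 * (a + b) + rng.normal(0, sigma, a.shape)
                        for a, b in zip(survivors, survivors[1:] + survivors[:1])]
            pop = survivors + children
        return best_w, best_s

    # Simulated federation: 3 clients with private data, one FL round.
    dim = 5
    global_w = rng.normal(size=dim)
    clients = []
    for _ in range(3):
        X = rng.normal(size=(100, dim))
        true_w = rng.normal(size=dim)
        y = (X @ true_w > 0).astype(int)
        clients.append((X, y))

    # Each client refines the global model locally and uploads ONLY its score
    # (one float per client instead of a full weight vector).
    local_results = [bwo_refine(global_w.copy(), X, y) for X, y in clients]
    scores = [score for _, score in local_results]

    # Server selects the best client by score and fetches that client's weights
    # only (an assumption about how the global model is formed).
    best = int(np.argmax(scores))
    global_w = local_results[best][0]
    print(f"round scores={np.round(scores, 3)}, selected client={best}")

The communication saving in this sketch comes from each client's per-round upload being reduced to a single scalar rather than a full weight vector; whatever selection or aggregation rule FedBWO actually applies on the server side, that reduction is what the abstract attributes the lower communication cost to.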
