Publication:
Hybrid federated and centralized learning

dc.contributor.coauthorMishra, K.V.
dc.contributor.departmentDepartment of Electrical and Electronics Engineering
dc.contributor.kuauthorElbir, Ahmet Musab
dc.contributor.kuauthorErgen, Sinem Çöleri
dc.contributor.schoolcollegeinstituteCollege of Engineering
dc.date.accessioned2024-11-09T13:51:32Z
dc.date.issued2021
dc.description.abstractMany machine learning tasks rely on centralized learning (CL), which requires the transmission of local datasets from the clients to a parameter server (PS), leading to a huge communication overhead. Federated learning (FL) overcomes this issue by allowing the clients to send only the model updates to the PS instead of the whole dataset. In this way, FL brings learning to the edge, where powerful computational resources are required on the client side. This requirement may not always be satisfied because of the diverse computational capabilities of edge devices. We address this through a novel hybrid federated and centralized learning (HFCL) framework that effectively trains a learning model by exploiting the computational capabilities of the clients. In HFCL, only the clients with sufficient resources employ FL; the remaining clients resort to CL by transmitting their local datasets to the PS. This allows all the clients to collaborate on the learning process regardless of their computational resources. We also propose a sequential data transmission approach with HFCL (HFCL-SDT) to reduce the training duration. The proposed HFCL frameworks outperform previously proposed non-hybrid FL (CL) schemes in terms of learning accuracy (communication overhead), since all the clients contribute to the learning process with their datasets regardless of their computational resources.
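
The abstract describes how HFCL splits training between FL-capable clients and resource-limited clients. The toy sketch below (a hypothetical illustration with a simple linear model and gradient averaging, not the authors' implementation) shows one way such a hybrid round could look: FL clients compute updates locally on their own data, the PS computes updates directly from the datasets uploaded by the CL clients, and all updates are aggregated into a single model.

    import numpy as np

    # Minimal HFCL sketch (hypothetical, not the paper's code): a shared linear
    # model theta is trained jointly. Resource-rich (FL) clients keep their data
    # and send gradients; resource-poor (CL) clients upload their datasets once,
    # and the PS computes their gradients on their behalf.

    rng = np.random.default_rng(0)

    def make_client(n=64, d=5):
        # Synthetic local dataset: y = X @ [1, 2, 3, 4, 5] + noise.
        X = rng.normal(size=(n, d))
        y = X @ np.arange(1, d + 1) + 0.1 * rng.normal(size=n)
        return X, y

    def grad(theta, X, y):
        # Gradient of the mean-squared error of the linear model X @ theta.
        return 2.0 * X.T @ (X @ theta - y) / len(y)

    fl_clients = [make_client() for _ in range(3)]  # sufficient resources: FL
    cl_clients = [make_client() for _ in range(2)]  # limited resources: CL (data sent to PS)

    theta = np.zeros(5)
    lr = 0.05
    for _ in range(200):
        # FL side: each capable client computes a model update on its own data.
        fl_grads = [grad(theta, X, y) for X, y in fl_clients]
        # CL side: the PS computes updates from the uploaded datasets.
        cl_grads = [grad(theta, X, y) for X, y in cl_clients]
        # PS aggregates all updates, so every client contributes to the model.
        theta -= lr * np.mean(fl_grads + cl_grads, axis=0)

    print("learned weights:", np.round(theta, 2))  # approx. [1, 2, 3, 4, 5]

In the HFCL-SDT variant mentioned in the abstract, the dataset uploads from the CL clients would be interleaved with training rather than completed up front, which is what reduces the overall training duration.
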
dc.description.fulltextYES
dc.description.indexedbyWOS
dc.description.indexedbyScopus
dc.description.openaccessYES
dc.description.publisherscopeInternational
dc.description.sponsoredbyTubitakEuTÜBİTAK
dc.description.sponsorshipEuropean Union (EU)
dc.description.sponsorshipCHIST-ERA
dc.description.sponsorshipScientific and Technological Research Council of Turkey (TÜBİTAK)
dc.description.versionAuthor's final manuscript
dc.identifier.doi10.23919/EUSIPCO54536.2021.9616120
dc.identifier.embargoNO
dc.identifier.filenameinventorynoIR03478
dc.identifier.isbn9789082797060
dc.identifier.issn2219-5491
dc.identifier.quartileN/A
dc.identifier.scopus2-s2.0-85123189353
dc.identifier.urihttps://hdl.handle.net/20.500.14288/3951
dc.identifier.wos764066600307
dc.keywordsCentralized learning
dc.keywordsEdge efficiency
dc.keywordsEdge intelligence
dc.keywordsFederated learning
dc.keywordsMachine learning
dc.language.isoeng
dc.publisherInstitute of Electrical and Electronics Engineers (IEEE)
dc.relation.grantnoCHIST-ERA-18-SDCDN-001
dc.relation.grantno119E350
dc.relation.ispartofEuropean Signal Processing Conference
dc.relation.urihttp://cdm21054.contentdm.oclc.org/cdm/ref/collection/IR/id/10271
dc.subjectDistributed machine learning
dc.subjectFunction computation
dc.subjectFederated learning
dc.titleHybrid federated and centralized learning
dc.typeConference Proceeding
dspace.entity.typePublication
local.contributor.kuauthorErgen, Sinem Çöleri
local.contributor.kuauthorElbir, Ahmet Musab
local.publication.orgunit1College of Engineering
local.publication.orgunit2Department of Electrical and Electronics Engineering
relation.isOrgUnitOfPublication21598063-a7c5-420d-91ba-0cc9b2db0ea0
relation.isOrgUnitOfPublication.latestForDiscovery21598063-a7c5-420d-91ba-0cc9b2db0ea0
relation.isParentOrgUnitOfPublication8e756b23-2d4a-4ce8-b1b3-62c794a8c164
relation.isParentOrgUnitOfPublication.latestForDiscovery8e756b23-2d4a-4ce8-b1b3-62c794a8c164

Files

Original bundle

Name: 10271.pdf
Size: 342.71 KB
Format: Adobe Portable Document Format