Ajou University repository

Federated Learning with Pareto Optimality for Resource Efficiency and Fast Model Convergence in Mobile Environments †
SCOPUS Citations: 2
DC Field: Value
dc.contributor.author: Jung, June Pyo
dc.contributor.author: Ko, Young Bae
dc.contributor.author: Lim, Sung Hwa
dc.date.issued: 2024-04-01
dc.identifier.issn: 1424-8220
dc.identifier.uri: https://dspace.ajou.ac.kr/dev/handle/2018.oak/34162
dc.description.abstract: Federated learning (FL) is an emerging distributed learning technique through which models can be trained on data collected by user devices in resource-constrained situations while protecting user privacy. However, FL has three main limitations. First, the parameter server (PS), which aggregates the local models trained on local user data, is typically far from users; the large distance may burden the path links between the PS and local nodes, increasing network and computing resource consumption. Second, user device resources are limited, yet local model training and the transmission of model parameters do not account for this constraint. Third, the PS-side links tend to become highly loaded as the number of participating clients increases, and they become congested owing to the large size of the model parameters. In this study, we propose a resource-efficient FL scheme. We follow the Pareto optimality concept with biased client selection to limit client participation, thereby ensuring efficient resource consumption and rapid model convergence. In addition, we propose a hierarchical structure with location-based clustering for device-to-device communication using k-means clustering. Simulation results show that, with prate set to 0.75, the proposed scheme reduced transmitted and received network traffic by 75.89% and 78.77%, respectively, compared to the FedAvg method. It also achieves faster model convergence than other FL mechanisms such as FedAvg and D2D-FedAvg.
dc.description.sponsorship: This research was funded by the National Research Foundation of Korea (NRF) grant funded by the Ministry of Science and ICT (MSIT) (NRF-2020R1A2C1102284 & NRF-2021R1A2C1012776).
dc.language.iso: eng
dc.publisher: Multidisciplinary Digital Publishing Institute (MDPI)
dc.subject.mesh: Efficiency models
dc.subject.mesh: FAST model
dc.subject.mesh: Federated learning
dc.subject.mesh: Local model
dc.subject.mesh: Mobile communications
dc.subject.mesh: Model convergence
dc.subject.mesh: Modeling parameters
dc.subject.mesh: Pareto-optimality
dc.subject.mesh: Resource efficiencies
dc.subject.mesh: User devices
dc.title: Federated Learning with Pareto Optimality for Resource Efficiency and Fast Model Convergence in Mobile Environments †
dc.type: Article
dc.citation.title: Sensors
dc.citation.volume: 24
dc.identifier.bibliographicCitation: Sensors, Vol.24
dc.identifier.doi: 10.3390/s24082476
dc.identifier.pmid: 38676094
dc.identifier.scopusid: 2-s2.0-85191409670
dc.identifier.url: http://www.mdpi.com/journal/sensors
dc.subject.keyword: federated learning
dc.subject.keyword: mobile communication
dc.subject.keyword: Pareto optimality
dc.description.isoa: true
dc.subject.subarea: Analytical Chemistry
dc.subject.subarea: Information Systems
dc.subject.subarea: Atomic and Molecular Physics, and Optics
dc.subject.subarea: Biochemistry
dc.subject.subarea: Instrumentation
dc.subject.subarea: Electrical and Electronic Engineering
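The abstract's location-based clustering step — grouping mobile clients with k-means so that each group communicates device-to-device and only a representative talks to the distant parameter server — can be illustrated with a minimal sketch. This is not the paper's implementation: the client coordinates, the number of clusters, the deterministic initialization, and the nearest-to-centroid head-selection rule are all illustrative assumptions.

```python
import math

def kmeans(points, k, iters=20):
    """Plain k-means over 2-D client coordinates.

    Deterministic initialization (first and last point) keeps this sketch
    reproducible; a practical implementation would use k-means++ or
    random restarts instead.
    """
    centroids = [points[0], points[-1]] if k == 2 else list(points[:k])
    assign = [0] * len(points)
    for _ in range(iters):
        # Assignment step: each client joins its nearest centroid.
        assign = [min(range(k), key=lambda c: math.dist(p, centroids[c]))
                  for p in points]
        # Update step: move each centroid to the mean of its members.
        for c in range(k):
            members = [p for p, a in zip(points, assign) if a == c]
            if members:
                centroids[c] = (sum(x for x, _ in members) / len(members),
                                sum(y for _, y in members) / len(members))
    return centroids, assign

# Hypothetical client coordinates (not from the paper's experiments).
clients = [(0.1, 0.2), (0.2, 0.1), (0.15, 0.25),
           (5.0, 5.1), (5.2, 4.9), (4.8, 5.0)]
centroids, assign = kmeans(clients, k=2)

# Within each cluster, the client nearest the centroid could act as the
# D2D group head that relays aggregated updates toward the parameter
# server, shortening the path links the abstract describes.
heads = [min((p for p, a in zip(clients, assign) if a == c),
             key=lambda p: math.dist(p, centroids[c]))
         for c in range(len(centroids))]
```

Under this sketch, only the group heads exchange model parameters with the parameter server, which is one way the hierarchical D2D structure can cut PS-side traffic.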

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.

Related Researcher

Ko, Young-Bae (고영배)
Department of Software and Computer Engineering

File Download

  • There are no files associated with this item.