
AMBLE: Adjusting Mini-Batch and Local Epoch Adaptively for Federated Learning with Serverless Parameter Server
Author
박주원 (Juwon Park)
Citations (SCOPUS)
0

Advisor
오상윤 (Sangyoon Oh)
Affiliation
Graduate School, Ajou University
Department
Department of Artificial Intelligence, Graduate School
Publication Year
2021-08
Publisher
The Graduate School, Ajou University
Keyword
Federated Learning; Local SGD; Serverless Computing; System Heterogeneity
Description
Master's thesis -- Graduate School, Ajou University: Department of Artificial Intelligence, August 2021
Alternative Abstract
As data privacy becomes increasingly important, federated learning has emerged as a technique for training deep learning models while preserving the privacy of on-device data. Because federated learning updates the global model through a centralized server, reducing communication overhead is challenging, as is accounting for the system heterogeneity of devices. In this paper, we present a new architecture for federated learning with a serverless parameter server. Because gradient and global model updates in federated learning are event-driven, a serverless environment can be used to decouple the parameter server's synchronization process from the devices' model communication process. In addition, we propose AMBLE, which adaptively adjusts the local mini-batch and local epoch sizes for heterogeneous devices in federated learning while updating the parameters synchronously. AMBLE increases the computation performed during the waiting time caused by stragglers and scales the local learning rate to improve the model's convergence rate and accuracy. We confirm that federated learning with AMBLE trains stably and achieves faster convergence and higher accuracy than FedAvg in both the IID and non-IID cases.
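
To make the adjustment idea concrete, below is a minimal Python sketch of the kind of per-round workload assignment the abstract describes. All names here (amble_plan, LocalPlan, the profiled throughput input) are illustrative assumptions, not the thesis's actual algorithm or API: faster devices are given proportionally more local epochs and a larger mini-batch so they finish together with the straggler, and each device's local learning rate is scaled linearly with its batch size.

from dataclasses import dataclass

@dataclass
class LocalPlan:
    epochs: int       # local epochs assigned to the device for this round
    batch_size: int   # local mini-batch size for this round
    lr: float         # scaled local learning rate

def amble_plan(base_epochs, base_batch, base_lr, throughput, data_size):
    """Per-device workload plans for one synchronous round (illustrative).

    throughput[k] is the profiled training speed of device k in samples/sec
    (assumed to be measured online); data_size[k] is its local sample count.
    """
    # Wall-clock time each device would need for the baseline workload.
    times = [base_epochs * n / s for n, s in zip(data_size, throughput)]
    round_time = max(times)  # a synchronous round is gated by the straggler

    plans = []
    for t in times:
        scale = round_time / t  # >= 1: extra work that fits into the wait time
        epochs = max(1, round(base_epochs * scale))
        batch = max(1, round(base_batch * scale))
        # Growing batch and epochs together keeps the number of local SGD
        # updates (epochs * n / batch) roughly constant across devices.
        lr = base_lr * batch / base_batch  # linear learning-rate scaling
        plans.append(LocalPlan(epochs, batch, lr))
    return plans

# Example: three devices with equal data but 1x/2x/4x training speed.
print(amble_plan(2, 32, 0.01, throughput=[100, 200, 400], data_size=[1000] * 3))

Scaling the batch size together with the epoch count is one simple way to keep the number of local SGD updates per round roughly equal across heterogeneous devices, so synchronous aggregation stays consistent while otherwise idle straggler wait time is put to use.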
Language
eng
URI
https://dspace.ajou.ac.kr/handle/2018.oak/20400
Type
Thesis
File Download
There are no files associated with this item.