Ajou University repository

Crossover-SGD: A gossip-based communication in distributed deep learning for alleviating large mini-batch problem and enhancing scalability
  • Yeo, Sangho ;
  • Bae, Minho ;
  • Jeong, Minjoong ;
  • Kwon, Oh Kyoung ;
  • Oh, Sangyoon
Citations (SCOPUS): 1

DC Field / Value

dc.contributor.author: Yeo, Sangho
dc.contributor.author: Bae, Minho
dc.contributor.author: Jeong, Minjoong
dc.contributor.author: Kwon, Oh Kyoung
dc.contributor.author: Oh, Sangyoon
dc.date.issued: 2023-07-10
dc.identifier.uri: https://dspace.ajou.ac.kr/dev/handle/2018.oak/33134
dc.description.abstract: Distributed deep learning is an effective way to reduce the training time for large datasets as well as complex models. However, the limited scalability caused by network overheads makes it difficult to synchronize the parameters of all workers, and gossip-based methods, which demonstrate stable scalability regardless of the number of workers, have been proposed. However, to use gossip-based methods in general cases, the validation accuracy for a large mini-batch needs to be verified. To this end, we first empirically study the characteristics of gossip methods in the large mini-batch problem and observe that gossip methods preserve higher validation accuracy than AllReduce-SGD (stochastic gradient descent) when the batch size is increased and the number of workers is fixed. However, the delayed parameter propagation of the gossip-based models decreases validation accuracy at large node scales. To cope with this problem, we propose Crossover-SGD, which alleviates the delayed propagation of weight parameters via segment-wise communication and a random network topology with fair peer selection. We also adapt hierarchical communication to limit the number of workers in gossip-based communication methods. To validate the effectiveness of our method, we conduct empirical experiments and observe that our Crossover-SGD shows higher node scalability than stochastic gradient push.
dc.description.sponsorship: Electronics and Telecommunications Research Institute, Grant/Award Number: 20ZT1100; Institute for Information and Communications Technology Promotion, Grant/Award Number: IITP-2020-2018-0-01431; Korea Institute of Science and Technology Information, Grant/Award Number: KSC-2019-CRE-0105; National Research Foundation of Korea, Grant/Award Number: 2021R1F1A1062779
dc.description.sponsorship: This research was supported by the National Supercomputing Center with supercomputing resources including technical support (KSC-2019-CRE-0105); the MSIT (Ministry of Science and ICT), Korea, under the ITRC (Information Technology Research Center) support program (IITP-2020-2018-0-01431) supervised by the IITP (Institute for Information & Communications Technology Promotion); the Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Education (2021R1F1A1062779); and an Electronics and Telecommunications Research Institute (ETRI) grant funded by the Korean government (20ZT1100, Development of ICT Convergence Technology based on Urban Area). We would like to thank Editage (https://www.editage.co.kr) for English language editing.
dc.language.iso: eng
dc.publisher: John Wiley and Sons Ltd
dc.subject.mesh: Complex model
dc.subject.mesh: Distributed deep learning
dc.subject.mesh: Gossip based
dc.subject.mesh: Hierarchical communications
dc.subject.mesh: Large datasets
dc.subject.mesh: Large mini-batch problem
dc.subject.mesh: Network overhead
dc.subject.mesh: Segment-wise
dc.subject.mesh: Training time
dc.subject.mesh: Workers'
dc.title: Crossover-SGD: A gossip-based communication in distributed deep learning for alleviating large mini-batch problem and enhancing scalability
dc.type: Conference Paper
dc.citation.title: Concurrency and Computation: Practice and Experience
dc.citation.volume: 35
dc.identifier.bibliographicCitation: Concurrency and Computation: Practice and Experience, Vol.35
dc.identifier.doi: 10.1002/cpe.7508
dc.identifier.scopusid: 2-s2.0-85144312956
dc.identifier.url: http://onlinelibrary.wiley.com/journal/10.1002/(ISSN)1532-0634
dc.subject.keyword: distributed deep learning
dc.subject.keyword: gossip based
dc.subject.keyword: hierarchical communication
dc.subject.keyword: large mini-batch problem
dc.subject.keyword: segment-wise
dc.description.isoa: true
dc.subject.subarea: Software
dc.subject.subarea: Theoretical Computer Science
dc.subject.subarea: Computer Science Applications
dc.subject.subarea: Computer Networks and Communications
dc.subject.subarea: Computational Theory and Mathematics
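The abstract's core mechanism, gossip averaging in which each worker exchanges only a segment of its parameters with a randomly and fairly chosen peer, can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation; all function names and the uniform peer-selection rule are assumptions for the sketch.

```python
import random

def fair_peer(rank, n_workers, rng):
    """Pick a uniformly random peer other than ourselves
    (uniform selection keeps peer choice fair across rounds)."""
    peer = rng.randrange(n_workers - 1)
    return peer if peer < rank else peer + 1

def crossover_round(params, rng, n_segments=2):
    """One gossip round: each worker averages one randomly chosen
    segment of its parameter vector with a randomly selected peer
    (segment-wise communication, instead of sending all weights)."""
    n = len(params)
    new_params = [list(p) for p in params]  # leave inputs untouched
    for rank in range(n):
        peer = fair_peer(rank, n, rng)
        seg = rng.randrange(n_segments)            # segment to exchange
        seg_len = len(params[rank]) // n_segments
        lo, hi = seg * seg_len, (seg + 1) * seg_len
        for i in range(lo, hi):                    # pairwise average of the segment
            new_params[rank][i] = 0.5 * (params[rank][i] + params[peer][i])
    return new_params
```

Because only one segment moves per round, each message is smaller than a full parameter exchange; repeated rounds with random peers then spread every worker's updates across the network.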

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.

Related Researcher

Oh, Sangyoon (오상윤)
Department of Software and Computer Engineering

File Download

  • There are no files associated with this item.