Embedding Dimensionality Estimation for Autoencoder with Lazy Node Dropout

DC Field: Value
dc.contributor.advisor: 신현정
dc.contributor.author: 이재원
dc.date.accessioned: 2025-01-25T01:36:05Z
dc.date.available: 2025-01-25T01:36:05Z
dc.date.issued: 2023-02
dc.identifier.other: 32611
dc.identifier.uri: https://dspace.ajou.ac.kr/handle/2018.oak/24584
dc.description: Thesis (Master's) -- Graduate School, Ajou University: Department of Artificial Intelligence, February 2023
dc.description.tableofcontents:
  Chapter 1. Introduction 1
    Section 1. Contributions 4
  Chapter 2. Fundamentals 6
    Section 1. Autoencoder 6
    Section 2. Weight update in neural networks 8
    Section 3. Dropout and DropConnect 8
  Chapter 3. Proposed Methods 9
    Section 1. Informative dropout: measurement of node activity 10
    Section 2. Online dropout: model training 15
  Chapter 4. Experiments 22
    Section 1. Dimensionality estimation 24
    Section 2. Performance of embedding vectors 28
    Section 3. Computation time comparison 34
  Chapter 5. Conclusion 36
  References 37
dc.language.iso: eng
dc.publisher: The Graduate School, Ajou University
dc.rights: Ajou University theses are protected by copyright.
dc.title: Embedding Dimensionality Estimation for Autoencoder with Lazy Node Dropout
dc.type: Thesis
dc.contributor.affiliation: Graduate School, Ajou University
dc.contributor.department: Department of Artificial Intelligence, Graduate School
dc.date.awarded: 2023-02
dc.description.degree: Master
dc.identifier.localId: T000000032611
dc.identifier.url: https://dcoll.ajou.ac.kr/dcollection/common/orgView/000000032611
dc.subject.keyword: Deep learning
dc.subject.keyword: Autoencoder
dc.subject.keyword: Dimensionality estimation
dc.subject.keyword: Dimensionality reduction
dc.description.alternativeAbstract: Autoencoders are widely used for nonlinear dimensionality reduction. However, determining the number of nodes in the autoencoder embedding space remains a challenging task. The number of nodes in the bottleneck layer, which holds the encoded representation, is usually fixed in advance as a hyperparameter. It is therefore desirable to select the number of bottleneck nodes automatically, so as to maintain embedding performance while reducing model complexity.

This study proposes a method for automatically estimating an appropriate number of bottleneck nodes during the autoencoder training process. The basic idea is to eliminate lazy nodes, which rarely affect model performance, based on the weight distribution of the bottleneck layer. Through lazy node dropout, the number of bottleneck nodes is reduced. This paper addresses two main tasks: first, verifying Informative dropout, which removes inactive nodes with poorly updated weights rather than randomly decreasing the number of bottleneck nodes; second, verifying Online dropout, which removes nodes within an online learning process rather than through repeated batch learning.

An autoencoder with the number of nodes determined by the proposed method showed better or similar classification accuracy compared to random dropout with an online process. Since the proposed method operates within the autoencoder's training process, it also has the advantage of accelerating training.
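As an illustration of the idea summarized in the abstract, the sketch below scores each bottleneck node by the norm of its outgoing decoder weights and flags low-activity ("lazy") nodes for removal. This is a minimal hypothetical example, not the thesis's actual activity measure; the function name `lazy_node_mask` and the relative threshold `rel_threshold` are assumptions for illustration.

```python
import numpy as np

def lazy_node_mask(decoder_weights, rel_threshold=0.1):
    """Return a boolean mask: True = keep the bottleneck node, False = drop it.

    decoder_weights : array of shape (n_bottleneck, n_output); row i holds
    the outgoing weights of bottleneck node i into the decoder. A node whose
    outgoing-weight norm falls far below the layer average is treated as
    "lazy", since it contributes little to the reconstruction.
    (Hypothetical criterion; the thesis defines its own activity measure.)
    """
    activity = np.linalg.norm(decoder_weights, axis=1)
    return activity >= rel_threshold * activity.mean()

# Three active nodes and one near-zero ("lazy") node:
W = np.vstack([np.ones((3, 5)), 1e-4 * np.ones((1, 5))])
mask = lazy_node_mask(W)  # only the last node is flagged for removal
```

In an online training loop, such a mask could be recomputed periodically and the flagged columns/rows of the encoder and decoder weight matrices pruned, shrinking the bottleneck as training proceeds.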
Appears in Collections:
Graduate School of Ajou University > Department of Artificial Intelligence > 3. Theses(Master)
Files in This Item:
There are no files associated with this item.

