Embedding Dimensionality Estimation for Autoencoder with Lazy Node Dropout
DC Field | Value | Language |
---|---|---|
dc.contributor.advisor | 신현정 | - |
dc.contributor.author | 이재원 | - |
dc.date.accessioned | 2025-01-25T01:36:05Z | - |
dc.date.available | 2025-01-25T01:36:05Z | - |
dc.date.issued | 2023-02 | - |
dc.identifier.other | 32611 | - |
dc.identifier.uri | https://dspace.ajou.ac.kr/handle/2018.oak/24584 | - |
dc.description | Master's thesis -- Ajou University Graduate School: Department of Artificial Intelligence, February 2023 | - |
dc.description.tableofcontents | Chapter 1 Introduction 1 <br>Section 1 Contributions 4 <br>Chapter 2 Fundamentals 6 <br>Section 1 Autoencoder 6 <br>Section 2 Weight update in neural networks 8 <br>Section 3 Dropout and Dropconnect 8 <br>Chapter 3 Proposed Methods 9 <br>Section 1 Informative dropout: measurement of node activity 10 <br>Section 2 Online dropout: model training 15 <br>Chapter 4 Experiments 22 <br>Section 1 Dimensionality estimation 24 <br>Section 2 Performance of embedding vectors 28 <br>Section 3 Computation time comparison 34 <br>Chapter 5 Conclusion 36 <br>References 37 | - |
dc.language.iso | eng | - |
dc.publisher | The Graduate School, Ajou University | - |
dc.rights | Theses of Ajou University are protected by copyright. | - |
dc.title | Embedding Dimensionality Estimation for Autoencoder with Lazy Node Dropout | - |
dc.type | Thesis | - |
dc.contributor.affiliation | Ajou University Graduate School | - |
dc.contributor.department | Department of Artificial Intelligence, Graduate School | - |
dc.date.awarded | 2023-02 | - |
dc.description.degree | Master | - |
dc.identifier.localId | T000000032611 | - |
dc.identifier.url | https://dcoll.ajou.ac.kr/dcollection/common/orgView/000000032611 | - |
dc.subject.keyword | Deep learning | - |
dc.subject.keyword | Autoencoder | - |
dc.subject.keyword | Dimensionality estimation | - |
dc.subject.keyword | Dimensionality reduction | - |
dc.description.alternativeAbstract | Autoencoders are widely used for nonlinear dimensionality reduction. However, determining the number of nodes in the autoencoder embedding space remains a challenging task. The number of nodes in the bottleneck layer, which holds the encoded representation, is usually fixed in advance as a hyperparameter. An automatic way to select the number of bottleneck nodes is therefore needed, one that maintains embedding performance while reducing model complexity. <br>This study proposes a method that automatically estimates an appropriate number of bottleneck nodes during autoencoder training. The basic idea is to eliminate lazy nodes, which rarely affect model performance, based on the weight distribution of the bottleneck layer. By dropping these lazy nodes, the number of bottleneck nodes is reduced. The paper addresses two main tasks. The first verifies Informative dropout, which removes inactive nodes with poorly updated weights rather than randomly decreasing the number of bottleneck nodes. The second verifies Online dropout, which removes nodes during the online learning process instead of repeating the batch learning process. <br>An autoencoder with the number of nodes determined by the proposed method showed better or comparable classification accuracy compared to random dropout with an online process. Because the proposed method operates within the autoencoder's training process, it also has the advantage of accelerating training. | - |
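
The abstract describes the method only at a high level. Below is a minimal sketch of how lazy-node dropout could look in practice, assuming a PyTorch autoencoder whose bottleneck nodes are masked out when the accumulated magnitude of their incoming weight updates falls in the lowest quantile. The class and function names, the activity measure, and the `drop_quantile` threshold are illustrative assumptions, not the thesis's exact formulation.

```python
# Sketch of lazy-node dropout for an autoencoder (illustrative assumptions,
# not the thesis's exact algorithm): bottleneck nodes whose incoming weights
# receive consistently small updates are treated as "lazy" and masked out
# during online training.
import torch
import torch.nn as nn


class Autoencoder(nn.Module):
    def __init__(self, in_dim=64, bottleneck_dim=32):
        super().__init__()
        self.encoder = nn.Linear(in_dim, bottleneck_dim)
        self.decoder = nn.Linear(bottleneck_dim, in_dim)
        # 1 = active bottleneck node, 0 = dropped ("lazy") node.
        self.register_buffer("mask", torch.ones(bottleneck_dim))

    def forward(self, x):
        z = torch.relu(self.encoder(x)) * self.mask  # lazy nodes contribute nothing
        return self.decoder(z)


def train_with_lazy_dropout(model, data, epochs=5, lr=1e-3, drop_quantile=0.1):
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    for epoch in range(epochs):
        # Accumulate per-node "activity": mean |gradient| of incoming encoder weights.
        activity = torch.zeros_like(model.mask)
        for x in data:  # online (per-sample) updates
            opt.zero_grad()
            loss = loss_fn(model(x), x)
            loss.backward()
            activity += model.encoder.weight.grad.abs().mean(dim=1)
            opt.step()
        # Drop still-active nodes whose accumulated activity falls in the lowest
        # quantile (assumed criterion; the thesis bases the decision on the
        # weight distribution of the bottleneck layer).
        active = model.mask.bool()
        if active.sum() > 1:
            threshold = torch.quantile(activity[active], drop_quantile)
            model.mask[active & (activity <= threshold)] = 0.0
        print(f"epoch {epoch}: active bottleneck nodes = {int(model.mask.sum())}")


if __name__ == "__main__":
    torch.manual_seed(0)
    model = Autoencoder()
    data = torch.randn(256, 64)
    train_with_lazy_dropout(model, data)
```

Masking rather than resizing the layer keeps the sketch simple: pruned nodes contribute nothing to the reconstruction, so the effective embedding dimensionality is the number of bottleneck nodes that remain active at the end of training.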