Ajou University repository

Embedding Dimensionality Estimation for Autoencoder with Lazy Node Dropout
Author
이재원
SCOPUS Citations
0


Advisor
신현정
Affiliation
Graduate School, Ajou University
Department
Department of Artificial Intelligence, Graduate School
Publication Year
2023-02
Publisher
The Graduate School, Ajou University
Keyword
Deep learning; Autoencoder; Dimension estimation; Dimension reduction
Description
Thesis (Master's) -- The Graduate School, Ajou University: Department of Artificial Intelligence, 2023. 2
Alternative Abstract
Autoencoders are widely used for nonlinear dimension reduction. However, determining the number of nodes in the autoencoder embedding space is still a challenging task. The number of nodes in the bottleneck layer, which forms the encoded representation, is typically set as a hyperparameter. Therefore, the number of bottleneck nodes needs to be selected automatically, so as to maintain embedding performance while reducing the complexity of the model.

This study proposes a method for automatically estimating an appropriate number of bottleneck nodes during the autoencoder training process. The basic idea of the proposed method is to eliminate lazy nodes that rarely affect model performance, based on the weight distribution of the bottleneck layer. With lazy node dropout, the number of bottleneck nodes is reduced. The paper addresses two main tasks. The first is verifying informative dropout, which removes inactive nodes with poorly updated weights instead of randomly reducing the bottleneck nodes. The second is verifying online dropout, which removes nodes during the online learning process rather than through repeated batch learning.

The autoencoder with the number of nodes determined by the proposed method showed better or comparable classification accuracy relative to random dropout with the online process. Since the proposed method takes place within the autoencoder's learning process, it also has the advantage of accelerating training.
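The lazy-node-dropout idea described above can be illustrated with a short sketch. The following PyTorch example is not the thesis' actual implementation: the weight-magnitude criterion in `prune_lazy_nodes`, the `threshold` value, and the every-100-steps pruning schedule are illustrative assumptions, since the abstract only states that lazy bottleneck nodes are removed based on the layer's weight distribution during (online) training.

```python
# Minimal sketch of lazy node dropout for an autoencoder bottleneck.
# The pruning rule and schedule below are assumptions for illustration only.
import torch
import torch.nn as nn


class BottleneckAutoencoder(nn.Module):
    def __init__(self, in_dim: int, bottleneck_dim: int):
        super().__init__()
        self.encoder = nn.Linear(in_dim, bottleneck_dim)
        self.decoder = nn.Linear(bottleneck_dim, in_dim)
        # 1 = active bottleneck node, 0 = dropped ("lazy") node.
        self.register_buffer("node_mask", torch.ones(bottleneck_dim))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        z = torch.relu(self.encoder(x)) * self.node_mask
        return self.decoder(z)

    @torch.no_grad()
    def prune_lazy_nodes(self, threshold: float = 0.5) -> int:
        """Mask bottleneck nodes whose incoming weights are small relative to
        the layer's overall weight distribution (an assumed stand-in for the
        thesis' lazy-node criterion). Returns the number of active nodes."""
        node_strength = self.encoder.weight.abs().mean(dim=1)  # per-node mean |w|
        cutoff = threshold * node_strength.mean()
        self.node_mask *= (node_strength > cutoff).float()
        return int(self.node_mask.sum().item())


# Usage: prune lazy nodes periodically while training on a data stream.
model = BottleneckAutoencoder(in_dim=784, bottleneck_dim=64)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for step in range(1000):
    x = torch.rand(32, 784)          # placeholder online data batch
    loss = loss_fn(model(x), x)
    opt.zero_grad()
    loss.backward()
    opt.step()
    if step % 100 == 99:             # assumed pruning schedule
        active = model.prune_lazy_nodes()
```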
Language
eng
URI
https://dspace.ajou.ac.kr/handle/2018.oak/24584
Fulltext

Type
Thesis



File Download

  • There are no files associated with this item.