Ajou University repository

Unsupervised feature learning for self-tuning neural networks
Citations (SCOPUS): 7

Publication Year
2021-01-01
Publisher
Elsevier Ltd
Citation
Neural Networks, Vol.133, pp.103-111
Keyword
Bagged clustering; Ranking violation for triplet sampling; Self-tuning neural network; Unsupervised feature learning; Unsupervised transfer learning
Mesh Keyword
Benchmark datasets; Classification accuracy; Clustering methods; Clustering quality; Euclidean distance; Self-tuning algorithms; Transfer learning methods; Unsupervised feature learning; Algorithms; Humans; Neural Networks, Computer; Unsupervised Machine Learning
All Science Classification Codes (ASJC)
Cognitive Neuroscience; Artificial Intelligence
Abstract
In recent years, transfer learning has attracted much attention due to its ability to adapt a well-trained model from one domain to another. Fine-tuning is one of the most widely used methods; it exploits a small set of labeled data in the target domain to adapt the network. Most transfer learning methods, including a few that use labeled data in the source domain, require labeled datasets, which restricts the use of transfer learning in new domains. In this paper, we propose a fully unsupervised self-tuning algorithm for learning visual features in different domains. The proposed method updates a pre-trained model by minimizing the triplet loss function using only unlabeled data in the target domain. First, we propose a relevance measure for unlabeled data computed by the bagged clustering method. Then triplets of anchor, positive, and negative data points are sampled based on ranking violations between the relevance scores and the Euclidean distances in the embedded feature space. This fully unsupervised self-tuning algorithm improves the performance of the network significantly. We extensively evaluate the proposed algorithm using various metrics, including classification accuracy, feature analysis, and clustering quality, on five benchmark datasets in different domains. In addition, we demonstrate that applying the self-tuning method to a fine-tuned network helps achieve better results.
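The abstract describes the algorithm only at a high level. As a rough illustration of the ranking-violation triplet idea, the sketch below assumes PyTorch, a generic pre-trained backbone, and a precomputed relevance matrix; the function names, margin value, and batch handling are illustrative choices, not taken from the paper. It samples anchor-positive-negative triplets whose Euclidean-distance ranking in the embedding space violates the ranking given by the relevance scores, and updates the network by minimizing the standard triplet margin loss on those triplets.

# Hypothetical sketch of triplet-loss self-tuning with ranking-violation sampling.
# Assumptions (not from the paper): PyTorch, a pre-trained backbone `model`,
# a relevance matrix `relevance[i, j]` from some clustering procedure,
# and an arbitrary margin value.
import torch
import torch.nn.functional as F

def sample_violating_triplets(embeddings, relevance):
    """Return (anchor, positive, negative) index triplets whose Euclidean-distance
    ranking violates the relevance ranking."""
    dist = torch.cdist(embeddings, embeddings)  # pairwise Euclidean distances
    n = embeddings.size(0)
    triplets = []
    for a in range(n):
        for p in range(n):
            for q in range(n):
                if len({a, p, q}) < 3:
                    continue
                # p is more relevant to a than q, yet lies farther away in feature space
                if relevance[a, p] > relevance[a, q] and dist[a, p] > dist[a, q]:
                    triplets.append((a, p, q))
    return triplets

def self_tuning_step(model, x, relevance, optimizer, margin=0.2):
    """One unsupervised update: embed a batch, sample violating triplets,
    and minimize the triplet margin loss on them."""
    z = F.normalize(model(x), dim=1)  # embedded features
    triplets = sample_violating_triplets(z.detach(), relevance)
    if not triplets:
        return 0.0
    a, p, q = zip(*triplets)
    loss = F.triplet_margin_loss(z[list(a)], z[list(p)], z[list(q)], margin=margin)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

The paper's actual relevance computation (bagged clustering) and sampling criterion are more involved; this sketch only mirrors the ranking-violation idea stated in the abstract.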
Language
eng
URI
https://dspace.ajou.ac.kr/dev/handle/2018.oak/31652
DOI
https://doi.org/10.1016/j.neunet.2020.10.011
Type
Article
Funding
This work was partly supported by the Institute of Information & communications Technology Planning & Evaluation (IITP), Korea grant funded by the Korea government (MSIT) (No. 2020-0-01373, Artificial Intelligence Graduate School Program (Hanyang University)), the Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Education, Korea (NRF-2017R1A6A3A11031193), and the NSF CAREER, United States of America, Grant #1149783.

Related Researcher

Ryu, Jongbin (유종빈)
Department of Software and Computer Engineering

File Download

  • There are no files associated with this item.