Ajou University repository

Two-stage architectural fine-tuning for neural architecture search in efficient transfer learning
  • Park, Soohyun ;
  • Son, Seok Bin ;
  • Lee, Youn Kyu ;
  • Jung, Soyi ;
  • Kim, Joongheon
Citations (SCOPUS): 1

DC Field / Value

dc.contributor.author: Park, Soohyun
dc.contributor.author: Son, Seok Bin
dc.contributor.author: Lee, Youn Kyu
dc.contributor.author: Jung, Soyi
dc.contributor.author: Kim, Joongheon
dc.date.issued: 2023-12-01
dc.identifier.uri: https://dspace.ajou.ac.kr/dev/handle/2018.oak/33853
dc.description.abstract: In many deep neural network (DNN) applications, the difficulty of gathering high-quality data in industrial fields hinders the practical use of DNNs. Thus, the concept of transfer learning (TL) has emerged, which leverages the pretrained knowledge of a DNN built on large-scale datasets. Toward this TL objective, this paper proposes two-stage architectural fine-tuning, inspired by neural architecture search (NAS), to reduce the cost and time of exploring the most efficient DNN model. The first stage is mutation, which reduces search costs by using a priori architectural information. The second stage is early-stopping, which reduces NAS costs by terminating the search process partway through the computation. Data-intensive experimental results verify that the proposed method outperforms the benchmarks.
dc.description.sponsorship: This work was supported in part by the National Research Foundation of Korea (NRF-Korea) under Grant and in part by the Institute of Information and Communications Technology Planning and Evaluation (IITP) Grant through the Korea Government [Ministry of Science and Information and Communications Technology (MSIT)], Intelligent 6G Wireless Access System, under Grant.
dc.language.iso: eng
dc.publisher: John Wiley and Sons Inc
dc.subject.mesh: Fine tuning
dc.subject.mesh: High quality data
dc.subject.mesh: Images processing
dc.subject.mesh: NET architecture
dc.subject.mesh: Neural architectures
dc.subject.mesh: Neural net architecture
dc.subject.mesh: Neural network application
dc.subject.mesh: Practical use
dc.subject.mesh: Search costs
dc.subject.mesh: Transfer learning
dc.title: Two-stage architectural fine-tuning for neural architecture search in efficient transfer learning
dc.type: Article
dc.citation.title: Electronics Letters
dc.citation.volume: 59
dc.identifier.bibliographicCitation: Electronics Letters, Vol.59
dc.identifier.doi: 10.1049/ell2.13066
dc.identifier.scopusid: 2-s2.0-85180133284
dc.identifier.url: https://ietresearch.onlinelibrary.wiley.com/loi/1350911x
dc.subject.keyword: image processing
dc.subject.keyword: neural net architecture
dc.subject.keyword: neural nets
dc.description.isoa: true
dc.subject.subarea: Electrical and Electronic Engineering
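The abstract describes a two-stage search: mutation restricted by a priori architectural information, then early-stopping of the NAS loop. The sketch below is a minimal illustration of that general idea only; the function names, the layer/width choices, and the patience-based stopping rule are hypothetical assumptions, not details taken from the paper.

```python
import random

def mutate(arch, prior_layers):
    """Stage 1 (sketch): mutate only layers the a priori information marks
    as worth tuning, which narrows the search space."""
    child = dict(arch)
    layer = random.choice(prior_layers)          # prior knowledge limits choices
    child[layer] = random.choice([32, 64, 128, 256])  # hypothetical widths
    return child

def search(base_arch, prior_layers, evaluate, budget=50, patience=5):
    """Stage 2 (sketch): early-stopping — terminate the search loop once
    `patience` consecutive candidates fail to improve the best score."""
    best, best_score, stall = base_arch, evaluate(base_arch), 0
    for _ in range(budget):
        cand = mutate(best, prior_layers)
        score = evaluate(cand)
        if score > best_score:
            best, best_score, stall = cand, score, 0
        else:
            stall += 1
        if stall >= patience:                    # stop mid-search to save cost
            break
    return best, best_score
```

With a toy scoring function (e.g. `lambda a: -abs(a["fc"] - 128)`), the loop converges toward the preferred width and halts without exhausting the budget, mirroring the cost reduction the paper attributes to its two stages.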

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.

Related Researcher

Jung, Soyi (정소이)
Department of Electrical and Computer Engineering

File Download

  • There are no files associated with this item.