Ajou University repository

MINIMUM WIDTH FOR UNIVERSAL APPROXIMATION USING RELU NETWORKS ON COMPACT DOMAIN
Citations (SCOPUS): 0

Publication Year
2024-01-01
Journal
12th International Conference on Learning Representations, ICLR 2024
Publisher
International Conference on Learning Representations, ICLR
Citation
12th International Conference on Learning Representations, ICLR 2024
Mesh Keyword
Activation functions; Input-output; Low bound; Min-max; p-Approximation; P-function; Uniform approximation; Universal approximation; Universal approximation properties; Universal approximators
All Science Classification Codes (ASJC)
Language and Linguistics; Computer Science Applications; Education; Linguistics and Language
Abstract
It has been shown that deep neural networks of sufficiently large width are universal approximators, but they are not if the width is too small. There have been several attempts to characterize the minimum width w_min enabling the universal approximation property; however, only a few of them found the exact values. In this work, we show that the minimum width for L^p approximation of L^p functions from [0, 1]^{d_x} to ℝ^{d_y} is exactly max{d_x, d_y, 2} if the activation function is ReLU-Like (e.g., ReLU, GELU, Softplus). Compared to the known result for ReLU networks, w_min = max{d_x + 1, d_y} when the domain is ℝ^{d_x}, our result first shows that approximation on a compact domain requires a smaller width than on ℝ^{d_x}. We next prove a lower bound on w_min for uniform approximation using general activation functions including ReLU: w_min ≥ d_y + 1 if d_x < d_y ≤ 2d_x. Together with our first result, this shows a dichotomy between L^p and uniform approximations for general activation functions and input/output dimensions.
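The three width expressions stated in the abstract can be compared directly. A minimal sketch (the function names are illustrative, not from the paper) that evaluates the exact compact-domain width, the known ReLU width on ℝ^{d_x}, and the uniform-approximation lower bound:

```python
def wmin_compact_lp(dx: int, dy: int) -> int:
    """Exact minimum width for L^p approximation on [0, 1]^dx
    with ReLU-like activations (the paper's main result):
    max{dx, dy, 2}."""
    return max(dx, dy, 2)

def wmin_relu_full_domain(dx: int, dy: int) -> int:
    """Previously known exact minimum width for ReLU networks
    when the domain is all of R^dx: max{dx + 1, dy}."""
    return max(dx + 1, dy)

def uniform_lower_bound(dx: int, dy: int):
    """Lower bound on w_min for uniform approximation with general
    activations: w_min >= dy + 1 whenever dx < dy <= 2*dx.
    Returns None when the stated condition does not hold."""
    if dx < dy <= 2 * dx:
        return dy + 1
    return None

# A compact domain can require strictly smaller width than R^dx,
# e.g. for dx = dy = 3:
print(wmin_compact_lp(3, 3))       # 3
print(wmin_relu_full_domain(3, 3)) # 4
# Dichotomy with uniform approximation when dx < dy <= 2*dx,
# e.g. dx = 2, dy = 3: L^p needs width 3, uniform needs >= 4.
print(uniform_lower_bound(2, 3))   # 4
```

The printed comparison for d_x = d_y = 3 (width 3 on the compact domain versus 4 on ℝ^3) illustrates the abstract's claim that compact-domain approximation requires smaller width.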
Language
eng
URI
https://aurora.ajou.ac.kr/handle/2018.oak/37103
https://www.scopus.com/inward/record.uri?partnerID=HzOxMe3b&scp=85195527806&origin=inward
Type
Conference
Funding
NK and SP were supported by Institute of Information & communications Technology Planning & Evaluation (IITP) grant funded by the Korea government (MSIT) (No. 2019-0-00079, Artificial Intelligence Graduate School Program, Korea University) and Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Education (2022R1F1A1076180).

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.

Related Researcher

Min, Chan Ho (민찬호)
Department of Financial Engineering

File Download

  • There are no files associated with this item.