ON COMPUTATION COMPLEXITY OF HIGH-DIMENSIONAL APPROXIMATION BY DEEP RELU NEURAL NETWORKS



Authors

  • D. Dũng Vietnam National University
  • V.K. Nguyen University of Transport and Communications
  • M.X. Thao Hong Duc University

Keywords:

Deep ReLU neural network, computation complexity, high-dimensional approximation, Hölder-Nikol'skii space of mixed smoothness

Abstract

We investigate the computation complexity of deep ReLU neural networks for approximating functions in Hölder-Nikol'skii spaces of mixed smoothness $H^\alpha_\infty(\mathbb{I}^d)$ on the unit cube $\mathbb{I}^d := [0,1]^d$. For any function $f \in H^\alpha_\infty(\mathbb{I}^d)$, we explicitly construct nonadaptive and adaptive deep ReLU neural networks whose output approximates $f$ with a prescribed accuracy $\varepsilon$, and we prove dimension-dependent bounds for the computation complexity of this approximation, characterized by the size and depth of the deep ReLU neural network, explicitly in $d$ and $\varepsilon$. Our results show the advantage of the adaptive method of approximation by deep ReLU neural networks over the nonadaptive one.
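For orientation, here is a minimal sketch of the notions the abstract refers to; the precise definitions used in the paper may differ in detail. Writing the space as $H^\alpha_\infty(\mathbb{I}^d)$ with $0 < \alpha \le 1$, and reading "size" as the number of nonzero weights and "depth" as the number of layers, are assumptions made here for illustration. With these conventions, the Hölder-Nikol'skii norm of mixed smoothness is commonly defined through first-order mixed differences:

$$
\Delta^e_h f(x) := \Bigl(\prod_{i\in e}\Delta_{h_i,i}\Bigr) f(x), \qquad \Delta_{h_i,i} f(x) := f(x+h_i e_i)-f(x),
$$

$$
\|f\|_{H^\alpha_\infty(\mathbb{I}^d)} := \max_{e\subseteq\{1,\dots,d\}}\; \sup_{x,\,h}\; \frac{\bigl|\Delta^e_h f(x)\bigr|}{\prod_{i\in e}|h_i|^{\alpha}},
$$

where the supremum is taken over $x$ and $h$ for which all differences stay inside $\mathbb{I}^d$, and the empty set $e=\emptyset$ gives the sup norm $\|f\|_\infty$. A deep ReLU network computes $x \mapsto W_L\,\sigma\bigl(\cdots\,\sigma(W_1 x + b_1)\,\cdots\bigr) + b_L$ with $\sigma(t)=\max(t,0)$ applied componentwise; its depth is the number of layers $L$ and its size the number of nonzero weights, and the complexity bounds announced in the abstract are expressed through these two quantities as functions of $d$ and $\varepsilon$ (again, these conventions are assumed here rather than quoted from the paper).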


Published

2020-12-30

How to Cite

Dũng, D., Nguyen, V., & Thao, M. (2020). ON COMPUTATION COMPLEXITY OF HIGH-DIMENSIONAL APPROXIMATION BY DEEP RELU NEURAL NETWORKS. Bulletin of L.N. Gumilyov Eurasian National University. Mathematics, Computer Science, Mechanics Series, 133(4), 8–18. Retrieved from https://bulmathmc.enu.kz/index.php/main/article/view/181

Issue

Vol. 133 No. 4 (2020)

Section

Articles