ON COMPUTATION COMPLEXITY OF HIGH-DIMENSIONAL APPROXIMATION BY DEEP RELU NEURAL NETWORKS
Keywords:
Deep ReLU neural network, computation complexity, high-dimensional approximation, Hölder-Nikol'skii space of mixed smoothness

Abstract
We investigate the computation complexity of deep ReLU neural networks for approximating functions in Hölder-Nikol'skii spaces of mixed smoothness $H^\alpha_\infty(\mathbb{I}^d)$ on the unit cube $\mathbb{I}^d:=[0,1]^d$. For any function $f\in H^\alpha_\infty(\mathbb{I}^d)$, we explicitly construct nonadaptive and adaptive deep ReLU neural networks whose output approximates $f$ with a prescribed accuracy $\varepsilon$, and we prove dimension-dependent bounds for the computation complexity of this approximation, characterized by the size and the depth of the network, explicitly in $d$ and $\varepsilon$. Our results show the advantage of the adaptive method of approximation by deep ReLU neural networks over the nonadaptive one.
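As a minimal sketch of the setting (the symbols $\Phi_\varepsilon$, $W$, and $L$ below are illustrative notation introduced here, not fixed by this abstract), "approximating $f$ with accuracy $\varepsilon$" and "computation complexity characterized by size and depth" can be read as follows: if $\Phi_\varepsilon$ denotes the function computed by the constructed ReLU network, then
\[
\|f - \Phi_\varepsilon\|_{L_\infty(\mathbb{I}^d)} \;\le\; \varepsilon,
\qquad
W(\Phi_\varepsilon) \;\le\; W(d,\varepsilon),
\qquad
L(\Phi_\varepsilon) \;\le\; L(d,\varepsilon),
\]
where $W(\Phi_\varepsilon)$ is the size (number of weights and biases) and $L(\Phi_\varepsilon)$ the depth (number of layers) of the network, and $W(d,\varepsilon)$, $L(d,\varepsilon)$ stand for the dimension-dependent bounds, explicit in $d$ and $\varepsilon$, established in the paper for the nonadaptive and adaptive constructions.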