

Poster

Approximation with CNNs in Sobolev Space: with Applications to Classification

Guohao Shen · Yuling Jiao · Yuanyuan Lin · Jian Huang

Hall J (level 1) #922

Keywords: [ Neural Networks ] [ Classification ] [ error bound ] [ smooth functions ] [ Approximation ]


Abstract:

We derive a novel approximation error bound with an explicit prefactor for Sobolev-regular functions using deep convolutional neural networks (CNNs). The bound is non-asymptotic in the network depth and filter lengths, which can be chosen in a rather flexible way. For Sobolev-regular functions that can be embedded into the Hölder space, the prefactor of our error bound depends on the ambient dimension polynomially rather than exponentially, as in most existing results, which is of independent interest. We also establish a new approximation result when the target function is supported on an approximate lower-dimensional manifold. We apply our results to establish non-asymptotic excess risk bounds for classification using CNNs with convex surrogate losses, including the cross-entropy loss, the hinge loss (SVM), the logistic loss, the exponential loss, and the least squares loss. We show that classification methods based on CNNs can circumvent the curse of dimensionality if the input data are supported on a neighborhood of a low-dimensional manifold.
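As an informal illustration of the setting described above, and not the authors' construction, the following PyTorch sketch builds a deep 1D CNN whose depth and filter (kernel) length are explicit hyperparameters, and trains it for classification with two of the convex surrogate losses mentioned in the abstract: the cross-entropy loss and the multi-class hinge loss. All names, dimensions, hyperparameters, and the synthetic data are illustrative assumptions.

```python
# Illustrative sketch only: a deep 1D CNN classifier trained with convex
# surrogate losses (cross-entropy or hinge). This is NOT the paper's
# construction; architecture, data, and hyperparameters are assumptions.
import torch
import torch.nn as nn


def make_cnn(depth: int, filter_len: int, in_len: int, channels: int = 16,
             num_classes: int = 2) -> nn.Module:
    """Stack `depth` convolutional blocks sharing a common filter length."""
    layers = [nn.Conv1d(1, channels, kernel_size=filter_len, padding="same"),
              nn.ReLU()]
    for _ in range(depth - 1):
        layers += [nn.Conv1d(channels, channels, kernel_size=filter_len,
                             padding="same"),
                   nn.ReLU()]
    layers += [nn.Flatten(), nn.Linear(channels * in_len, num_classes)]
    return nn.Sequential(*layers)


# Synthetic data: inputs of ambient dimension d with toy binary labels.
d, n = 32, 256
x = torch.randn(n, 1, d)                    # shape (batch, channels, length)
y = (x.squeeze(1).mean(dim=1) > 0).long()   # toy labels in {0, 1}

model = make_cnn(depth=4, filter_len=5, in_len=d)
loss_fns = {
    "cross-entropy": nn.CrossEntropyLoss(),  # logistic-type surrogate
    "hinge (SVM)":   nn.MultiMarginLoss(),   # multi-class hinge surrogate
}

# One gradient step per surrogate loss, just to show how each is plugged in.
for name, loss_fn in loss_fns.items():
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    logits = model(x)
    loss = loss_fn(logits, y)
    opt.zero_grad()
    loss.backward()
    opt.step()
    print(f"{name}: surrogate loss = {loss.item():.4f}")
```

In this sketch, `depth` and `filter_len` play the role of the network depth and filter lengths that appear in the non-asymptotic bound, while the choice of `loss_fn` corresponds to the convex surrogate used in the excess risk analysis.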
