CMX Lunch Seminar

Wednesday, February 17, 2021, 12:00 PM

Approximation Theory and Metric Entropy of Neural Networks

Speaker: Jonathan Siegel, Department of Mathematics, Penn State
Location: Online Event

We consider the problem of approximating high-dimensional functions using shallow neural networks. We begin by introducing natural spaces of functions which can be efficiently approximated by such networks. We then derive the metric entropy of the unit balls in these spaces. Drawing upon recent work connecting stable approximation rates to metric entropy, we obtain the optimal approximation rates for the given spaces. Next, we show that higher approximation rates can be obtained by further restricting the function class. In particular, for a restrictive but natural space of functions, shallow networks with the ReLU$^k$ activation function achieve an approximation rate of $O(n^{-(k+1)})$ in every dimension. Finally, we discuss the connections between this surprising result and the finite element method.
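For concreteness, a shallow network with $n$ neurons and ReLU$^k$ activation takes the standard form (the notation below is a sketch of the usual setup and is not taken from the talk itself)
\[
f_n(x) = \sum_{i=1}^{n} a_i\, \sigma(\omega_i \cdot x + b_i), \qquad \sigma(t) = \max(0,t)^k,
\]
where $a_i, b_i \in \mathbb{R}$ and $\omega_i \in \mathbb{R}^d$. In this notation, the rate announced above reads $\inf_{f_n} \|f - f_n\| \le C\, n^{-(k+1)}$ for $f$ in the restricted class, with the same exponent for every input dimension $d$; the choice of norm and the behavior of the constant $C$ are details addressed in the talk.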

Series: CMX Lunch Series

Contact: Jolene Brink at (626) 395-2813, jbrink@caltech.edu
For more information visit: http://cmx.caltech.edu/