Approximation of Lipschitz Functions Using Deep Spline Neural Networks
S. Neumayer, A. Goujon, P. Bohra, M. Unser
SIAM Journal on Mathematics of Data Science, vol. 5, no. 2, pp. 306–322, 2023.
Although Lipschitz-constrained neural networks have many applications in machine learning, the design and training of expressive Lipschitz-constrained networks is very challenging. Since the popular rectified linear unit (ReLU) networks have provable disadvantages in this setting, we propose using learnable spline activation functions with at least three linear regions instead. We prove that our choice is universal among all componentwise 1-Lipschitz activation functions, in the sense that no other weight-constrained architecture can approximate a larger class of functions. Additionally, our choice is at least as expressive as the recently introduced non-componentwise GroupSort activation function for spectral-norm-constrained weights. The theoretical findings of this paper are consistent with previously published numerical results.
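To make the constrained architecture concrete, the following is a minimal NumPy sketch, not the authors' code: the function names, the slope-clamping parameterization of the spline, and the spectral-norm rescaling of the weights are illustrative assumptions. It builds one layer that composes a weight matrix with spectral norm at most 1 and a learnable componentwise linear spline with three regions, so the layer is 1-Lipschitz by construction.

    import numpy as np

    def spline3(x, knots, slopes):
        # Learnable linear spline with three regions, separated by two
        # knots t1 <= t2. Clamping every slope to [-1, 1] guarantees that
        # the activation is 1-Lipschitz.
        t1, t2 = knots
        s1, s2, s3 = np.clip(slopes, -1.0, 1.0)
        return (s1 * np.minimum(x, t1)            # left region,   slope s1
                + s2 * (np.clip(x, t1, t2) - t1)  # middle region, slope s2
                + s3 * np.maximum(x - t2, 0.0))   # right region,  slope s3

    def lipschitz_layer(x, W, b, knots, slopes):
        # Rescale W so that its spectral norm (largest singular value,
        # ord=2 for a matrix) is at most 1, then apply the componentwise
        # spline; the composition of 1-Lipschitz maps is 1-Lipschitz.
        W = W / max(1.0, np.linalg.norm(W, 2))
        return spline3(W @ x + b, knots, slopes)

    # Toy usage: a single 3 -> 4 layer with hand-picked knots and slopes.
    rng = np.random.default_rng(0)
    W, b = rng.standard_normal((4, 3)), np.zeros(4)
    y = lipschitz_layer(rng.standard_normal(3), W, b,
                        knots=(-1.0, 1.0), slopes=np.array([0.3, 1.0, 0.3]))

In a trained network the knots and slopes would be learned parameters, with the slope clamp (or another projection) maintaining the 1-Lipschitz property; the point of the paper is that three or more learnable regions, unlike the two fixed regions of ReLU, suffice for the universality result stated above.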
@ARTICLE{http://bigwww.epfl.ch/publications/neumayer2301.html,
  AUTHOR  = "Neumayer, S. and Goujon, A. and Bohra, P. and Unser, M.",
  TITLE   = "Approximation of {L}ipschitz Functions Using Deep Spline Neural Networks",
  JOURNAL = "{SIAM} Journal on Mathematics of Data Science",
  YEAR    = "2023",
  VOLUME  = "5",
  NUMBER  = "2",
  PAGES   = "306--322"}