Stable Parameterization of Continuous and Piecewise-Linear Functions
A. Goujon, J. Campos, M. Unser
Applied and Computational Harmonic Analysis, vol. 67, paper no. 101581, 27 p., January 2023.
Rectified-linear-unit (ReLU) neural networks, which play a prominent role in deep learning, generate continuous and piecewise-linear (CPWL) functions. While they provide a powerful parametric representation, the mapping between the parameter and function spaces lacks stability. In this paper, we investigate an alternative representation of CPWL functions that relies on local hat basis functions and that is applicable to low-dimensional regression problems. It is predicated on the fact that any CPWL function can be specified by a triangulation and its values at the grid points. We give the necessary and sufficient condition on the triangulation (in any number of dimensions and with any number of vertices) for the hat functions to form a Riesz basis, which ensures that the link between the parameters and the corresponding CPWL function is stable and unique. In addition, we provide an estimate of the ℓ₂ → L₂ condition number of this local representation. As a special case of our framework, we focus on a systematic parameterization of ℝᵈ with control points placed on a uniform grid. In particular, we choose hat basis functions that are shifted replicas of a single linear box spline. In this setting, we prove that our general estimate of the condition number is exact. We also relate the local representation to a nonlocal one based on shifts of a causal ReLU-like function. Finally, we indicate how to efficiently estimate the Lipschitz constant of the CPWL mapping.
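To make the local hat-function representation concrete, the following minimal Python sketch (ours, not taken from the paper; the function names, grid spacing, and matrix size are illustrative assumptions) evaluates a 1D CPWL function as a sum of shifted linear B-splines (hat functions) on a uniform grid, and numerically checks the Riesz property through the eigenvalues of the Gram matrix of the basis.

# Minimal 1D sketch (illustrative, not the paper's code): a CPWL function as a
# sum of shifted hat functions (linear B-splines) on a uniform grid, plus a
# numerical check of the Riesz bounds via the Gram matrix of the basis.
import numpy as np

def hat(x):
    """Triangular hat function: equals 1 at 0 and decays linearly to 0 at +/-1."""
    return np.maximum(1.0 - np.abs(x), 0.0)

def cpwl_eval(x, coeffs, h=1.0):
    """Evaluate f(x) = sum_k coeffs[k] * hat(x/h - k) on a uniform grid of step h."""
    k = np.arange(len(coeffs))
    # Each row collects the hat functions evaluated at one query point.
    B = hat(x[:, None] / h - k[None, :])
    return B @ coeffs

# The CPWL function interpolates the coefficients at the grid points x = k*h.
coeffs = np.array([0.0, 1.0, -0.5, 2.0, 0.0])
x = np.linspace(0.0, 4.0, 9)
print(cpwl_eval(x, coeffs))  # matches coeffs at the integer grid points

# Riesz check (1D, unit grid): the Gram matrix G[k, l] = <hat(. - k), hat(. - l)>
# has diagonal 2/3 and first off-diagonals 1/6; its eigenvalues lie strictly
# between the Riesz bounds 1/3 and 1 for any finite number of vertices.
n = 50
G = (np.diag(np.full(n, 2/3))
     + np.diag(np.full(n - 1, 1/6), 1)
     + np.diag(np.full(n - 1, 1/6), -1))
eig = np.linalg.eigvalsh(G)
print(eig.min(), eig.max())  # approaches (1/3, 1) as n grows

In this 1D, unit-grid illustration the Riesz bounds are 1/3 and 1, so the corresponding ℓ₂ → L₂ condition number is √3; the eigenvalues of the finite Gram matrix stay within these bounds and approach them as the number of vertices grows.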
@ARTICLE(http://bigwww.epfl.ch/publications/goujon2302.html,
AUTHOR="Goujon, A. and Campos, J. and Unser, M.",
TITLE="Stable Parameterization of Continuous and Piecewise-Linear Functions",
JOURNAL="Applied and Computational Harmonic Analysis",
YEAR="2023",
volume="67",
number="",
pages="",
month="January",
note="paper no.\ 101581")