Delaunay-Triangulation-Based Learning with Hessian Total-Variation Regularization
M. Pourya, A. Goujon, M. Unser
IEEE Open Journal of Signal Processing, vol. 4, pp. 167–178, February 28, 2023.
Regression is one of the core problems tackled in supervised learning. Neural networks with rectified linear units generate continuous and piecewise-linear (CPWL) mappings and are the state-of-the-art approach for solving regression problems. In this article, we propose an alternative method that leverages the expressivity of CPWL functions. In contrast to deep neural networks, our CPWL parameterization guarantees stability and is interpretable. Our approach relies on the partitioning of the domain of the CPWL function by a Delaunay triangulation. The function values at the vertices of the triangulation are our learnable parameters; they uniquely identify the CPWL function. Formulating the learning scheme as a variational problem, we use the Hessian total variation (HTV) as a regularizer to favor CPWL functions with few affine pieces. In this way, we control the complexity of our model through a single hyperparameter. By developing a computational framework to compute the HTV of any CPWL function parameterized by a triangulation, we discretize the learning problem as a generalized least-absolute-shrinkage-and-selection-operator (LASSO) problem. Our experiments validate the use of our method in low-dimensional scenarios.
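The parameterization described above — a Delaunay partition of the domain with one learnable value per vertex — can be sketched with SciPy's built-in Delaunay-based linear interpolation. This is an illustrative reconstruction, not the authors' code: the vertex layout and the sampled affine function are assumptions chosen so that the CPWL model reproduces the target exactly (an affine function has zero HTV).

```python
import numpy as np
from scipy.spatial import Delaunay
from scipy.interpolate import LinearNDInterpolator

# Hypothetical vertex set over the unit square; in the learning scheme,
# the values attached to these vertices are the trainable parameters.
rng = np.random.default_rng(0)
vertices = rng.random((30, 2))
# Add the corners so the convex hull covers all of [0, 1]^2.
vertices = np.vstack([vertices, [[0, 0], [0, 1], [1, 0], [1, 1]]])

# Example "learned" values: samples of the affine map f(x, y) = 2x - y + 1.
# A CPWL model interpolates any affine function exactly, and its HTV is zero.
values = 2 * vertices[:, 0] - vertices[:, 1] + 1

tri = Delaunay(vertices)               # triangulation that partitions the domain
f = LinearNDInterpolator(tri, values)  # CPWL function fixed by the vertex values

print(float(f(0.3, 0.7)))              # 2*0.3 - 0.7 + 1 = 0.9
```

Within each simplex of the triangulation, `LinearNDInterpolator` evaluates the unique affine function matching the three vertex values (barycentric interpolation), which is exactly the sense in which the vertex values identify the CPWL mapping.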
@ARTICLE(http://bigwww.epfl.ch/publications/pourya2301.html, AUTHOR="Pourya, M. and Goujon, A. and Unser, M.", TITLE="{D}elaunay-Triangulation-Based Learning with {H}essian Total-Variation Regularization", JOURNAL="{IEEE} Open Journal of Signal Processing", YEAR="2023", volume="4", number="", pages="167--178", month="April 28,", note="")