Learning of Continuous and Piecewise-Linear Functions with Hessian Total-Variation Regularization
J. Campos, S. Aziznejad, M. Unser
IEEE Open Journal of Signal Processing, vol. 3, pp. 36–48, 2022 (published online December 17, 2021).
We develop a novel 2D functional learning framework that employs a sparsity-promoting regularization based on second-order derivatives. Motivated by the nature of the regularizer, we restrict the search space to the span of piecewise-linear box splines shifted on a 2D lattice. Our formulation of the infinite-dimensional problem on this search space allows us to recast it exactly as a finite-dimensional one that can be solved using standard methods in convex optimization. Since our search space is composed of continuous and piecewise-linear functions, our work presents itself as an alternative to training networks that deploy rectified linear units, which also construct models in this family. The advantages of our method are fourfold: the ability to enforce sparsity, favoring models with fewer piecewise-linear regions; the use of a rotation-, scale-, and translation-invariant regularization; a single hyperparameter that controls the complexity of the model; and clear model interpretability, with a straightforward relation between the parameters and the overall learned function. We validate our framework in various experimental setups and compare it with neural networks.
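To illustrate the finite-dimensional recasting mentioned in the abstract, the sketch below sets up a generalized-LASSO-type problem of the form min_c ||Hc − y||² + λ‖Lc‖₁, where H evaluates a piecewise-linear basis expansion at the data points and L is a stand-in second-order difference operator playing the role of the Hessian-TV regularization matrix. The 1D hat-function basis, the difference operator, and the solver choice (CVXPY) are illustrative assumptions, not the authors' box-spline construction or their exact Hessian-TV discretization.

```python
import numpy as np
import cvxpy as cp

# Hypothetical 1D stand-in for the 2D box-spline setup: a regular grid of
# piecewise-linear "hat" basis functions and a second-difference operator L
# that mimics the role of the Hessian-TV regularization matrix.
rng = np.random.default_rng(0)

grid = np.linspace(0.0, 1.0, 21)            # knot positions of the basis
x = rng.uniform(0.0, 1.0, size=200)          # sample locations
y = np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal(x.size)  # noisy data

def hat_matrix(x, grid):
    """Evaluate piecewise-linear hat functions centered at `grid` at points `x`."""
    h = grid[1] - grid[0]
    return np.maximum(0.0, 1.0 - np.abs(x[:, None] - grid[None, :]) / h)

H = hat_matrix(x, grid)                      # forward (evaluation) matrix
L = np.diff(np.eye(grid.size), n=2, axis=0)  # second-difference operator

lam = 0.05                                   # single sparsity hyperparameter
c = cp.Variable(grid.size)
objective = cp.Minimize(cp.sum_squares(H @ c - y) + lam * cp.norm1(L @ c))
cp.Problem(objective).solve()

# Sparsity of L @ c corresponds to a small number of linear pieces in the fit.
n_active = int(np.sum(np.abs(L @ c.value) > 1e-6))
print("knots with nonzero second difference:", n_active)
```

Increasing the single hyperparameter λ drives more entries of Lc to zero, which in this toy model corresponds to a fit with fewer linear pieces, mirroring the sparsity-versus-complexity trade-off described in the abstract.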
@ARTICLE(http://bigwww.epfl.ch/publications/campos2201.html, AUTHOR="Campos, J. and Aziznejad, S. and Unser, M.", TITLE="Learning of Continuous and Piecewise-Linear Functions with {H}essian Total-Variation Regularization", JOURNAL="{IEEE} Open Journal of Signal Processing", YEAR="2022", volume="3", number="", pages="36--48", month="", note="")