Multiple-Kernel Regression with Sparsity Constraints
S. Aziznejad, M. Unser
Proceedings of the Workshop on Signal Processing with Adaptive Sparse Structured Representations (SPARS'19), Toulouse, French Republic, July 1-4, 2019, paper no. 103.
We consider the problem of learning a function from a sequence of its noisy samples, where the learning is performed over a continuous-domain hybrid search space. We adopt the generalized total-variation norm as a sparsity-promoting regularization term to make the problem well-posed. We prove that the solution of this problem admits a sparse kernel expansion with adaptive positions. We also show that the sparsity of the solution is upper-bounded by the number of data points, which allows for an enlargement of the search space while preserving the well-posedness of the problem.
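To make the setting more concrete, the following is a schematic sketch of the kind of variational problem described in the abstract; the notation (kernels k_1, ..., k_N with associated regularization operators L_n, data pairs (x_m, y_m), weights lambda_n) is illustrative and is not taken verbatim from the paper.

% Illustrative sketch only: a generic multiple-kernel regression problem with a
% sparsity-promoting, total-variation-type regularization; notation is not the paper's.
\begin{equation*}
  \min_{f = f_1 + \cdots + f_N}
  \sum_{m=1}^{M} \bigl( y_m - f(x_m) \bigr)^2
  + \sum_{n=1}^{N} \lambda_n \, \| \mathrm{L}_n f_n \|_{\mathcal{M}},
\end{equation*}
% where each component f_n lives in the native space associated with the operator L_n
% (equivalently, with the kernel k_n), and ||.||_M denotes the total-variation norm of a measure.
% A representer-theorem-type result then states that there exists a solution of the form
\begin{equation*}
  f(x) = \sum_{n=1}^{N} \sum_{k=1}^{K_n} a_{n,k}\, k_n(x - x_{n,k}),
  \qquad \sum_{n=1}^{N} K_n \le M,
\end{equation*}
% with adaptive positions x_{n,k} and coefficients a_{n,k}
% (possibly plus a term in the finite-dimensional null space of the regularization).

In this sketch, the bound \sum_n K_n \le M expresses that the total number of active kernel atoms never exceeds the number M of data points; this is what permits enlarging the search space (adding candidate kernels) without compromising well-posedness.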
@INPROCEEDINGS{aziznejad1903,
AUTHOR="Aziznejad, S. and Unser, M.",
TITLE="Multiple-Kernel Regression with Sparsity Constraints",
BOOKTITLE="Proceedings of the Workshop on Signal Processing with Adaptive Sparse Structured Representations ({SPARS'19})",
YEAR="2019",
address="Toulouse, French Republic",
month="July 1-4,",
note="paper no.\ 103",
url="http://bigwww.epfl.ch/publications/aziznejad1903.html"}
© 2019 INP. Personal use of this material is permitted. However, permission to
reprint/republish this material for advertising or promotional purposes or for creating
new collective works for resale or redistribution to servers or lists, or to reuse any
copyrighted component of this work in other works must be obtained from INP.
This material is presented to ensure timely dissemination of scholarly and technical work.
Copyright and all rights therein are retained by authors or by other copyright holders.
All persons copying this information are expected to adhere to the terms and constraints
invoked by each author's copyright. In most cases, these works may not be reposted without
the explicit permission of the copyright holder.