Convex Optimization in Sums of Banach Spaces
M. Unser, S. Aziznejad
Applied and Computational Harmonic Analysis, vol. 56, pp. 1–25, January 2022.
We characterize the solution of a broad class of convex optimization problems that address the reconstruction of a function from a finite number of linear measurements. The underlying hypothesis is that the solution is decomposable as a finite sum of components, where each component belongs to its own prescribed Banach space; moreover, the problem is regularized by penalizing some composite norm of the solution. We establish general conditions for existence and derive the generic parametric representation of the solution components. These representations fall into three categories depending on the underlying regularization norm: (i) a linear expansion in terms of predefined "kernels" when the component space is a reproducing kernel Hilbert space (RKHS), (ii) a non-linear (duality) mapping of a linear combination of measurement functionals when the component Banach space is strictly convex, and (iii) an adaptive expansion in terms of a small number of atoms within a larger dictionary when the component Banach space is not strictly convex. Our approach generalizes and unifies a number of multi-kernel (RKHS) and sparse-dictionary learning techniques for compressed sensing available in the literature. It also yields the natural extension of the classical spline-fitting techniques in (semi-)RKHS to the abstract Banach-space setting.
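As an illustrative sketch (written in generic placeholder notation, not taken verbatim from the paper), the class of problems described in the abstract can be pictured as

\[
  \min_{f_1 \in \mathcal{X}_1,\, \ldots,\, f_N \in \mathcal{X}_N} \;
  E\bigl(\mathbf{y}, \boldsymbol{\nu}(f_1 + \cdots + f_N)\bigr)
  \;+\; \sum_{n=1}^{N} \lambda_n \, \|f_n\|_{\mathcal{X}_n},
\]

where the sought function \(f = f_1 + \cdots + f_N\) is probed by \(M\) linear measurement functionals \(\boldsymbol{\nu} = (\nu_1, \ldots, \nu_M)\), \(E\) is a convex data-fidelity term, and each weight \(\lambda_n > 0\) penalizes the norm of the component in its Banach space \(\mathcal{X}_n\). For category (i), when \(\mathcal{X}_n\) is an RKHS, the corresponding solution component admits a finite linear expansion of representer type,

\[
  f_n = \sum_{m=1}^{M} a_{n,m}\, k_{n,m}, \qquad a_{n,m} \in \mathbb{R},
\]

where \(k_{n,m}\) denotes the predefined kernel associated with the \(m\)-th measurement functional; the symbols \(E\), \(\lambda_n\), \(k_{n,m}\), and \(a_{n,m}\) are placeholders for the precise objects defined in the paper.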
@ARTICLE(http://bigwww.epfl.ch/publications/unser2201.html, AUTHOR="Unser, M. and Aziznejad, S.", TITLE="Convex Optimization in Sums of Banach Spaces", JOURNAL="Applied and Computational Harmonic Analysis", YEAR="2022", volume="56", number="", pages="1--25", month="January", note="")