Not All ℓp-Norms Are Compatible with Sparse Stochastic Processes
A. Amini, M. Unser
Signal Processing with Adaptive Sparse Structured Representations (SPARS'13), Lausanne VD, Swiss Confederation, July 8-11, 2013.
We adopt the framework of sparse stochastic processes of [1] and investigate the sparse/compressible priors obtained by linearly measuring such processes. We show that these priors are necessarily infinitely divisible. This property is satisfied by many priors used in statistical learning, such as the Gaussian and Laplace laws, as well as a wide range of fat-tailed distributions such as Student's t and α-stable laws. However, it excludes some popular priors used in compressed sensing, including all distributions that decay like exp(−O(|x|^p)) for 1 < p < 2. This can be regarded as evidence against the use of ℓp-norms with 1 < p < 2 in regularization techniques that involve sparse priors.
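To make the link between ℓp regularization and priors of the form exp(−O(|x|^p)) explicit, the following minimal LaTeX sketch spells out the standard MAP reading of an ℓp penalty; the notation (y, H, λ) is illustrative and not taken from the abstract.

% Minimal compilable sketch (assumed MAP interpretation; notation is illustrative).
\documentclass{article}
\usepackage{amsmath}
\begin{document}
\[
  \hat{x} = \arg\min_{x}\; \tfrac{1}{2}\,\|y - Hx\|_{2}^{2} + \lambda\,\|x\|_{p}^{p}
  \quad\Longleftrightarrow\quad
  \hat{x} = \arg\max_{x}\; p(y \mid x)\,\prod_{k}\exp\!\bigl(-\lambda\,|x_{k}|^{p}\bigr),
\]
% i.e., an $\ell_{p}$ penalty corresponds to an i.i.d. prior proportional to
% $\exp(-\lambda|x|^{p})$. For $p=2$ (Gaussian) and $p=1$ (Laplace) the implicit
% prior is infinitely divisible; for $1<p<2$ it decays like $\exp(-O(|x|^{p}))$
% and is therefore excluded by the result stated in the abstract.
\end{document}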
Reference
[1] M. Unser, P. Tafti, Q. Sun, "A Unified Formulation of Gaussian vs. Sparse Stochastic Processes—Part I: Continuous-Domain Theory," arXiv:1108.6150v1 [cs.IT].
@INPROCEEDINGS(http://bigwww.epfl.ch/publications/amini1304.html,
  AUTHOR="Amini, A. and Unser, M.",
  TITLE="Not All $\ell_{p}$-Norms Are Compatible with Sparse Stochastic Processes",
  BOOKTITLE="Signal Processing with Adaptive Sparse Structured Representations ({SPARS'13})",
  YEAR="2013",
  address="Lausanne VD, Swiss Confederation",
  month="July 8-11,")