Sparsity-Driven Statistical Inference for Inverse Problems
U.S. Kamilov
École polytechnique fédérale de Lausanne, EPFL Thesis no. 6545 (2015), 198 p., March 27, 2015.
This thesis addresses statistical inference for the resolution of inverse problems. Our work is motivated by the recent trend whereby classical linear methods are being replaced by nonlinear alternatives that rely on the sparsity of naturally occurring signals. We adopt a statistical perspective and model the signal as a realization of a stochastic process that exhibits sparsity as its central property. Our general strategy for solving inverse problems then lies in the development of novel iterative solutions for performing the statistical estimation.
The thesis is organized in five main parts. In the first part, we provide a general overview of statistical inference in the context of inverse problems. We discuss wavelet-based and gradient-based algorithms for linear and nonlinear forward models. In the second part, we present an in-depth discussion of cycle spinning, which is a technique used to improve the quality of signals recovered with wavelet-based methods. Our main contribution here is a proof of its convergence; we also introduce a novel consistent cycle-spinning algorithm for denoising statistical signals. In the third part, we introduce a stochastic signal model based on Lévy processes and investigate popular gradient-based algorithms such as those that deploy total-variation regularization. We develop a novel algorithm based on belief propagation for computing the minimum mean-square error estimator and use it to benchmark several popular methods that recover signals with sparse derivatives. In the fourth part, we propose and analyze a novel adaptive generalized approximate message passing (adaptive GAMP) algorithm that reconstructs signals with independent wavelet coefficients from generalized linear measurements. Our algorithm is an extension of the standard GAMP algorithm and allows for the joint learning of unknown statistical parameters. We prove that, when the measurement matrix is independent and identically distributed Gaussian, our algorithm is asymptotically consistent. This means that it performs as well as the oracle algorithm, which knows the parameters exactly. In the fifth and final part, we apply our methodology to an inverse problem in optical tomographic microscopy. In particular, we propose a novel nonlinear forward model and a corresponding algorithm for the quantitative estimation of the refractive index distribution of an object.
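The cycle-spinning idea discussed in the second part can be sketched in a few lines: denoise every cyclic shift of the signal with a wavelet shrinkage step, undo each shift, and average the results. The sketch below is a minimal illustration under stated assumptions (a one-level Haar transform with soft thresholding, even-length signals, and the function names `haar_denoise`/`cycle_spin_denoise` are ours); it is not the thesis implementation or its consistent variant.

```python
import numpy as np

def haar_denoise(x, thresh):
    """One-level Haar wavelet soft-thresholding of an even-length signal."""
    x = np.asarray(x, dtype=float)
    a = (x[0::2] + x[1::2]) / np.sqrt(2)   # approximation (low-pass) coefficients
    d = (x[0::2] - x[1::2]) / np.sqrt(2)   # detail (high-pass) coefficients
    d = np.sign(d) * np.maximum(np.abs(d) - thresh, 0.0)  # soft-threshold details only
    y = np.empty_like(x)
    y[0::2] = (a + d) / np.sqrt(2)         # inverse Haar transform
    y[1::2] = (a - d) / np.sqrt(2)
    return y

def cycle_spin_denoise(x, thresh):
    """Cycle spinning: average shift -> denoise -> unshift over all cyclic shifts."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    acc = np.zeros(n)
    for s in range(n):
        acc += np.roll(haar_denoise(np.roll(x, s), thresh), -s)
    return acc / n
```

Because the Haar transform is not shift-invariant, a single-shift estimate depends on where block boundaries fall; averaging over all shifts removes this dependence and typically reduces the blocking artifacts of plain wavelet thresholding.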
@PHDTHESIS(http://bigwww.epfl.ch/publications/kamilov1501.html,
  AUTHOR="Kamilov, U.S.",
  TITLE="Sparsity-Driven Statistical Inference for Inverse Problems",
  SCHOOL="{\'{E}}cole polytechnique f{\'{e}}d{\'{e}}rale de {L}ausanne ({EPFL})",
  YEAR="2015",
  type="{EPFL} Thesis no.\ 6545 (2015), 198 p.",
  address="",
  month="March 27,",
  note="")