Biomedical Imaging Group, EPFL



Sparsity-Driven Statistical Inference for Inverse Problems

U.S. Kamilov

Swiss Federal Institute of Technology Lausanne, EPFL Thesis no. 6545 (2015), 198 p., March 27, 2015.




This thesis addresses statistical inference for the resolution of inverse problems. Our work is motivated by the recent trend whereby classical linear methods are being replaced by nonlinear alternatives that rely on the sparsity of naturally occurring signals. We adopt a statistical perspective and model the signal as a realization of a stochastic process that exhibits sparsity as its central property. Our general strategy for solving inverse problems then lies in developing novel iterative methods for performing the statistical estimation.

The thesis is organized in five main parts. In the first part, we provide a general overview of statistical inference in the context of inverse problems. We discuss wavelet-based and gradient-based algorithms for linear and nonlinear forward models.

In the second part, we present an in-depth discussion of cycle spinning, which is a technique used to improve the quality of signals recovered with wavelet-based methods. Our main contribution here is a proof of its convergence; we also introduce a novel consistent cycle-spinning algorithm for denoising statistical signals.

In the third part, we introduce a stochastic signal model based on Lévy processes and investigate popular gradient-based algorithms such as those that deploy total-variation regularization. We develop a novel algorithm based on belief propagation for computing the minimum mean-square error estimator and use it to benchmark several popular methods that recover signals with sparse derivatives.

In the fourth part, we propose and analyze a novel adaptive generalized approximate message passing (adaptive GAMP) algorithm that reconstructs signals with independent wavelet coefficients from generalized linear measurements. Our algorithm is an extension of the standard GAMP algorithm and allows for the joint learning of unknown statistical parameters. We prove that, when the entries of the measurement matrix are independent and identically distributed Gaussian, our algorithm is asymptotically consistent. This means that it performs as well as the oracle algorithm, which knows the parameters exactly.

In the fifth and final part, we apply our methodology to an inverse problem in optical tomographic microscopy. In particular, we propose a novel nonlinear forward model and a corresponding algorithm for the quantitative estimation of the refractive index distribution of an object.
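The cycle-spinning idea discussed in the second part can be illustrated with a minimal sketch. The sketch below is not the thesis algorithm: it assumes a single-level Haar transform with soft thresholding as the base denoiser, and averages that denoiser over all circular shifts of the input, which is the essence of cycle spinning.

```python
import numpy as np

def haar_denoise(x, thresh):
    """Single-level Haar soft-threshold denoising (len(x) must be even)."""
    a = (x[0::2] + x[1::2]) / np.sqrt(2)   # approximation coefficients
    d = (x[0::2] - x[1::2]) / np.sqrt(2)   # detail coefficients
    d = np.sign(d) * np.maximum(np.abs(d) - thresh, 0.0)  # soft threshold
    y = np.empty_like(x)
    y[0::2] = (a + d) / np.sqrt(2)          # inverse Haar transform
    y[1::2] = (a - d) / np.sqrt(2)
    return y

def cycle_spin_denoise(x, thresh, shifts=None):
    """Average the Haar denoiser over circular shifts (cycle spinning)."""
    shifts = list(range(len(x))) if shifts is None else list(shifts)
    acc = np.zeros_like(x)
    for s in shifts:
        # shift, denoise, unshift, and accumulate
        acc += np.roll(haar_denoise(np.roll(x, s), thresh), -s)
    return acc / len(shifts)

# Usage: denoise a noisy piecewise-constant signal
rng = np.random.default_rng(0)
clean = np.concatenate([np.zeros(32), np.ones(32)])
noisy = clean + 0.3 * rng.standard_normal(64)
denoised = cycle_spin_denoise(noisy, thresh=0.4)
```

The averaging step compensates for the fact that an orthogonal wavelet transform is not shift-invariant: each shift places the discontinuities differently relative to the wavelet grid, so averaging the shifted estimates suppresses the blocking artifacts that any single shift would produce.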


@PHDTHESIS(http://bigwww.epfl.ch/publications/kamilov1501.html,
AUTHOR="Kamilov, U.S.",
TITLE="Sparsity-Driven Statistical Inference for Inverse Problems",
SCHOOL="{S}wiss {F}ederal {I}nstitute of {T}echnology {L}ausanne
        ({EPFL})",
YEAR="2015",
TYPE="{EPFL} Thesis no.\ 6545 (2015), 198 p.",
MONTH="March 27,")

© 2015 U.S. Kamilov. Personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution to servers or lists, or to reuse any copyrighted component of this work in other works must be obtained from U.S. Kamilov.
This material is presented to ensure timely dissemination of scholarly and technical work. Copyright and all rights therein are retained by authors or by other copyright holders. All persons copying this information are expected to adhere to the terms and constraints invoked by each author's copyright. In most cases, these works may not be reposted without the explicit permission of the copyright holder.