BIOMEDICAL IMAGING GROUP (BIG)
Laboratoire d'imagerie biomédicale (LIB)

Gaussian versus Sparse Stochastic Processes: Construction, Regularity, Compressibility

J. Fageot

EPFL doctorate award, École polytechnique fédérale de Lausanne, EPFL Thesis no. 7657 (2017), 231 p., April 24, 2017.


Although this thesis contributes to the theory of random processes, it is motivated by signal processing applications, mainly the stochastic modeling of sparse signals. Specifically, we provide an in-depth investigation of the innovation model, under which a signal is described as a random process s that can be linearly and deterministically transformed into a white noise. The noise represents the unpredictable part of the signal—called its innovation—and is a member of the family of Lévy white noises, which includes both Gaussian and Poisson noises. In mathematical terms, s satisfies the equation

L s = w,  (1)

where L is a differential operator and w a Lévy noise. The problem is therefore to study the solutions of stochastic differential equations driven by Lévy noises. Gaussian models usually fail to reproduce the empirical sparsity observed in real-world signals. By contrast, Lévy models offer a wide range of random processes, ranging from typically non-sparse (Gaussian) to very sparse (Poisson), with many sparse signals lying between these two extremes.
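A discrete analogue of the innovation model (1) with L taken as the derivative operator can be sketched by cumulatively summing white-noise increments; the choice of Lévy noise then controls the sparsity of the resulting path. The following illustration is a simplification and all parameter values (sample count, jump rate) are assumptions, not taken from the thesis:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000  # number of samples

# Gaussian innovation: i.i.d. N(0, 1) increments
w_gauss = rng.normal(0.0, 1.0, n)

# Compound-Poisson innovation: most increments are exactly zero,
# with occasional jumps (probability lam per sample, N(0, 1) amplitudes)
lam = 0.05
jumps = rng.random(n) < lam
w_poisson = np.where(jumps, rng.normal(0.0, 1.0, n), 0.0)

# Solve L s = w with L = d/dx, discretized as a running sum of the noise
s_gauss = np.cumsum(w_gauss)      # Brownian-motion-like path
s_poisson = np.cumsum(w_poisson)  # piecewise-constant, sparse-jump path

# The Poisson-driven increments are exactly sparse; the Gaussian ones never are
print(np.mean(w_poisson == 0))  # fraction of zero increments (close to 1 - lam)
print(np.mean(w_gauss == 0))
```

The two paths realize the two extremes of the Lévy family described above: the Gaussian-driven solution fluctuates everywhere, while the compound-Poisson-driven one is constant except at a few jump locations.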

Our contributions can be divided into four parts. First, the cornerstone of our work is the theory of generalized random processes. Within this framework, all the considered random processes are seen as random tempered generalized functions and can be observed through smooth and rapidly decaying windows. This allows us to define the solutions of (1), called generalized Lévy processes, in the most general setting. Then, we identify two limit phenomena: the approximation of generalized Lévy processes by their Poisson counterparts, and the asymptotic behavior of generalized Lévy processes at coarse and fine scales. In the third part, we study the localization of Lévy noise in classical function spaces (Hölder, Sobolev, Besov). As an application, we characterize the local smoothness and the asymptotic growth rate of the Lévy noise. Finally, we quantify the local compressibility of the generalized Lévy processes, understood as a measure of the decay rate of their approximation error in an appropriate basis. From this last result, we provide a theoretical justification of the ability of the innovation model (1) to represent sparse signals.
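The compressibility notion above can be illustrated numerically: keep only the N largest-magnitude coefficients of a sampled path and observe how fast the approximation error decays. The sketch below works in the trivial increment basis rather than the wavelet-domain analysis of the thesis, so it is only a heuristic analogue; sample size and jump rate are assumed values:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1000

# Increments of a Gaussian process and of a sparse (compound-Poisson) one
w_gauss = rng.normal(size=n)
jumps = rng.random(n) < 0.05
w_poisson = np.where(jumps, rng.normal(size=n), 0.0)

def nterm_error(w, n_keep):
    """Relative l2 error after keeping the n_keep largest-magnitude coefficients."""
    idx = np.argsort(np.abs(w))[::-1]   # indices sorted by decreasing magnitude
    approx = np.zeros_like(w)
    approx[idx[:n_keep]] = w[idx[:n_keep]]
    return np.linalg.norm(w - approx) / np.linalg.norm(w)

# Keeping 100 of 1000 coefficients captures the sparse signal almost exactly,
# while the Gaussian signal retains a large residual
err_poisson = nterm_error(w_poisson, 100)
err_gauss = nterm_error(w_gauss, 100)
print(err_poisson, err_gauss)
```

The gap between the two errors is the numerical face of the theoretical result: Poisson-type processes admit fast N-term approximation-error decay, whereas Gaussian processes do not.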

The guiding principle of our research is the duality between the local and asymptotic properties of generalized Lévy processes. In particular, we highlight the relevant quantities, called the local and asymptotic indices, that allow quantifying the local regularity, the asymptotic growth rate, the limit behavior at coarse and fine scales, and the level of compressibility of generalized Lévy processes.

@PHDTHESIS(http://bigwww.epfl.ch/publications/fageot1703.html,
AUTHOR="Fageot, J.",
TITLE="Gaussian {\textit{versus}} Sparse Stochastic Processes:
	{C}onstruction, Regularity, Compressibility",
SCHOOL="{\'{E}}cole polytechnique f{\'{e}}d{\'{e}}rale de {L}ausanne
	({EPFL})",
YEAR="2017",
type="{EPFL} Thesis no.\ 7657 (2017), 231 p.",
address="",
month="April 24,",
note="{EPFL} doctorate award")
© 2017 Fageot. Personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution to servers or lists, or to reuse any copyrighted component of this work in other works must be obtained from Fageot. This material is presented to ensure timely dissemination of scholarly and technical work. Copyright and all rights therein are retained by authors or by other copyright holders. All persons copying this information are expected to adhere to the terms and constraints invoked by each author's copyright. In most cases, these works may not be reposted without the explicit permission of the copyright holder.