Biomedical Imaging Group
Tutorials, Reviews and Recent Talks

Computational Bioimaging - How to further reduce exposure and/or increase image quality

M. Unser


Plenary talk, Int. Conf. of the IEEE EMBS (EMBC'17), July 11-15, 2017, Jeju Island, Korea.

We start our account of inverse problems in imaging with a brief review of first-generation reconstruction algorithms, which are linear and typically non-iterative (e.g., backprojection). We then highlight the emergence of the concept of sparsity, which opened the door to the resolution of more difficult image reconstruction problems, including compressed sensing. In particular, we demonstrate the global optimality of splines for solving problems with total-variation (TV) regularization constraints. Next, we introduce an alternative statistical formulation where signals are modeled as sparse stochastic processes. This allows us to establish a formal equivalence between non-Gaussian MAP estimation and sparsity-promoting techniques that are based on the minimization of a non-quadratic cost functional. We also show how to compute the solution efficiently via an alternating sequence of linear steps and pointwise nonlinearities (ADMM algorithm). This concludes our discussion of the second-generation methods that constitute the state-of-the-art in a variety of modalities.
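The alternating sequence of linear steps and pointwise nonlinearities can be sketched in a few lines. The following NumPy toy example is our own illustrative sketch, not code from the talk; the sensing matrix, the regularization weight lam, and the penalty parameter rho are arbitrary choices. It applies ADMM to an ℓ1-regularized least-squares (compressed-sensing) problem:

```python
import numpy as np

def soft(v, t):
    # pointwise soft-thresholding: the nonlinear (proximal) step
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def admm_l1(A, y, lam=0.01, rho=1.0, n_iter=300):
    """ADMM for min_x 0.5*||Ax - y||^2 + lam*||x||_1:
    each iteration is one linear step plus one pointwise nonlinearity."""
    n = A.shape[1]
    x = np.zeros(n); z = np.zeros(n); u = np.zeros(n)
    Q = np.linalg.inv(A.T @ A + rho * np.eye(n))  # linear step, factored once
    Aty = A.T @ y
    for _ in range(n_iter):
        x = Q @ (Aty + rho * (z - u))   # linear step
        z = soft(x + u, lam / rho)      # pointwise nonlinearity
        u = u + x - z                   # dual update
    return z

# toy demo: 30 random measurements of a 3-sparse signal in R^50
rng = np.random.default_rng(0)
n, m = 50, 30
x_true = np.zeros(n)
x_true[[4, 23, 41]] = [2.0, -1.5, 1.0]
A = rng.standard_normal((m, n)) / np.sqrt(m)
y = A @ x_true
x_hat = admm_l1(A, y)
print(np.linalg.norm(x_hat - x_true))
```

Even with fewer measurements than unknowns (m = 30 < n = 50), the sparse signal is recovered accurately, which is the compressed-sensing behavior alluded to above.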

In the final part of the presentation, we shall argue that learning techniques will play a central role in the future development of the field with the emergence of third-generation methods. A natural solution for improving image quality is to retain the linear part of the ADMM algorithm while optimizing its non-linear step (proximal operator) so as to minimize the reconstruction error. Another more extreme scenario is to replace the iterative part of the reconstruction by a deep convolutional network. The various approaches will be illustrated with the reconstruction of images in a variety of modalities including MRI, X-ray and cryo-electron tomography, and deconvolution microscopy.

Biomedical Image Reconstruction

M. Unser


12th European Molecular Imaging Meeting, 5-7 April 2017, Cologne, Germany.

A fundamental component of the imaging pipeline is the reconstruction algorithm. In this educational session, we review the physical and mathematical principles that underlie the design of such algorithms. We argue that the concepts are fairly universal and applicable to a majority of (bio)medical imaging modalities, including magnetic resonance imaging and fMRI, X-ray computed tomography, and positron-emission tomography (PET). Interestingly, the paradigm remains valid for modern cellular/molecular imaging with confocal/super-resolution fluorescence microscopy, which is highly relevant to molecular imaging as well. In fact, we believe that the huge potential for cross-fertilization and mutual reinforcement between imaging modalities has not been fully exploited yet.

The prerequisite to image reconstruction is an accurate physical description of the image-formation process: the so-called forward model, which is assumed to be linear. Numerically, this translates into the specification of a system matrix, while the reconstruction of images conceptually boils down to a stable inversion of this matrix. The difficulty is essentially twofold: (i) the system matrix is usually much too large to be stored/inverted directly, and (ii) the problem is inherently ill-posed due to the presence of noise and/or bad conditioning of the system.
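This twofold difficulty can be made concrete on a small 1-D toy problem. The Gaussian-blur system matrix below is an illustrative stand-in for a real forward model (size, blur width, and noise level are arbitrary): naive inversion amplifies the measurement noise, while a Tikhonov-regularized inverse remains stable.

```python
import numpy as np

rng = np.random.default_rng(1)

# toy forward model: a Gaussian-blur system matrix -> badly conditioned
n = 64
i = np.arange(n)
H = np.exp(-0.5 * ((i[:, None] - i[None, :]) / 2.0) ** 2)
H /= H.sum(axis=1, keepdims=True)        # each row averages its neighbours

x_true = np.zeros(n); x_true[20:40] = 1.0           # piecewise-constant object
y = H @ x_true + 0.01 * rng.standard_normal(n)      # noisy measurements

# naive inversion of the system matrix amplifies the noise
x_naive = np.linalg.solve(H, y)

# Tikhonov-regularized inversion: (H^T H + lam*I)^{-1} H^T y
lam = 1e-2
x_tik = np.linalg.solve(H.T @ H + lam * np.eye(n), H.T @ y)

err_naive = np.linalg.norm(x_naive - x_true)
err_tik = np.linalg.norm(x_tik - x_true)
print(err_naive, err_tik)
```

The naive reconstruction error is orders of magnitude larger than the regularized one, even though the same data and the same system matrix are used in both cases.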

Our starting point is an overview of the modalities in relation to their forward model. We then discuss the classical linear reconstruction methods that typically involve some form of backprojection (CT or PET) and/or the fast Fourier transform (in the case of MRI). We present stabilized variants of these methods that rely on (Tikhonov) regularization or the injection of prior statistical knowledge under the Gaussian hypothesis. Next, we review modern iterative schemes that can handle challenging acquisition setups such as parallel MRI, non-Cartesian sampling grids, and/or missing views. In particular, we discuss sparsity-promoting methods that are supported by the theory of compressed sensing. We show how to implement such schemes efficiently using simple combinations of linear solvers and thresholding operations. The main advantage of these recent algorithms is that they improve the quality of the image reconstruction. Alternatively, they allow a substantial reduction of the radiation dose and/or acquisition time without noticeable degradation in quality. This behavior is illustrated practically.
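The "linear operations plus thresholding" recipe is epitomized by the iterative shrinkage-thresholding algorithm (ISTA). The sketch below is our own illustration under arbitrary toy choices (random sensing matrix, noiseless data, hand-picked lam), not the reconstruction code of any particular modality; having fewer measurements than unknowns mimics a reduced dose or acquisition time.

```python
import numpy as np

def soft(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(A, y, lam, n_iter=1000):
    """ISTA for min_x 0.5*||Ax - y||^2 + lam*||x||_1:
    a gradient (Landweber) step followed by pointwise soft-thresholding."""
    L = np.linalg.norm(A, 2) ** 2        # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = soft(x - (A.T @ (A @ x - y)) / L, lam / L)
    return x

# toy setup: 25 measurements of a 3-sparse signal in R^40
rng = np.random.default_rng(6)
n, m = 40, 25
x_true = np.zeros(n)
x_true[[5, 17, 30]] = [1.5, -2.0, 1.0]
A = rng.standard_normal((m, n)) / np.sqrt(m)
y = A @ x_true
x_hat = ista(A, y, lam=0.005)
print(np.linalg.norm(x_hat - x_true))
```

Each iteration costs only two matrix-vector products and one thresholding pass, which is what makes such schemes practical for large system matrices that are applied on the fly rather than stored.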

In the final part of the tutorial, we discuss the current challenges and directions of research in the field; in particular, the necessity of dealing with large data sets in multiple dimensions: 2D or 3D space combined with time (in the case of dynamic imaging) and/or multispectral/multimodal information.

Challenges and Opportunities in Biological Imaging

M. Unser, Professor, Ecole Polytechnique Fédérale de Lausanne, Biomedical Imaging Group


Plenary. IEEE International Conference on Image Processing (ICIP), 27-30 September, 2015, Québec City, Canada.

While the major achievements in medical imaging can be traced back to the end of the 20th century, there are strong indicators that we have recently entered the golden age of cellular/biological imaging. The enabling modality is fluorescence microscopy, which results from the combination of highly specific fluorescent probes (Nobel Prize 2008) and sophisticated optical instrumentation (Nobel Prize 2014). This has led to the emergence of modern microscopy centers that are providing biologists with unprecedented amounts of data in 3D + time.

To address the computational aspects, two nascent fields have emerged in which image processing is expected to play a significant role. The first is "digital optics" where the idea is to combine optics with advanced signal processing in order to increase spatial resolution while reducing acquisition time. The second area is "bioimage informatics" which is concerned with the development of image analysis software to make microscopy more quantitative. The key issue here is reliable image segmentation as well as the ability to track structures of interest over time. We shall discuss specific examples and describe state-of-the-art solutions for bioimage reconstruction and analysis. This will help us build a list of challenges and opportunities to guide further research in bioimaging.

Sparse stochastic processes: A statistical framework for compressed sensing and biomedical image reconstruction

M. Unser


4-hour tutorial, Inverse Problems and Imaging Conference, Institut Henri Poincaré, Paris, April 7-11, 2014.

We introduce an extended family of continuous-domain sparse processes that are specified by a generic (non-Gaussian) innovation model or, equivalently, as solutions of linear stochastic differential equations driven by white Lévy noise. We present the functional tools for their characterization. We show that their transform-domain probability distributions are infinitely divisible, which induces two distinct types of behavior, Gaussian versus sparse, to the exclusion of any other. This is the key to proving that the non-Gaussian members of the family admit a sparse representation in a matched wavelet basis.
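The innovation-model picture can be made concrete with a small discrete simulation (our illustrative sketch; the compound-Poisson rate and the sample size are arbitrary). Integrating a white innovation gives a discrete Lévy-type process; the Gaussian and sparse members of the family share the same second-order statistics but differ sharply in their higher-order ones:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 10000

# two unit-variance white innovations: a Gaussian one and a
# compound-Poisson one (a classic sparse, infinitely divisible member)
w_gauss = rng.standard_normal(n)
p = 0.05                                   # impulse rate (illustrative)
mask = rng.random(n) < p
w_cp = np.where(mask, rng.standard_normal(n) / np.sqrt(p), 0.0)

# simplest innovation model: integration of the white noise
x_gauss = np.cumsum(w_gauss)
x_cp = np.cumsum(w_cp)

def excess_kurtosis(v):
    v = v - v.mean()
    return np.mean(v**4) / np.mean(v**2) ** 2 - 3.0

# finite differences whiten the processes back into their innovations;
# excess kurtosis separates the Gaussian member (~0) from the sparse one
kg = excess_kurtosis(np.diff(x_gauss))
kcp = excess_kurtosis(np.diff(x_cp))
print(kg, kcp)
```

The finite-difference operator here plays the role of the matched whitening operator; in the continuous-domain theory this role is taken by the operator that defines the stochastic differential equation.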

Next, we apply our continuous-domain characterization of the signal to the discretization of ill-conditioned linear inverse problems where both the statistical and physical measurement models are projected onto a linear reconstruction space. This leads to the derivation of a general class of maximum a posteriori (MAP) signal estimators. While the formulation is compatible with the standard methods of Tikhonov and l1-type regularizations, which both appear as particular cases, it opens the door to a much broader class of sparsity-promoting regularization schemes that are typically nonconvex. We illustrate the concept with the derivation of algorithms for the reconstruction of biomedical images (deconvolution microscopy, MRI, X-ray tomography) from noisy and/or incomplete data. The proposed framework also suggests alternative Bayesian recovery procedures that minimize the estimation error.


The Colored Revolution of Bioimaging

C. Vonesch, F. Aguet, J.-L. Vonesch, M. Unser


IEEE Signal Processing Magazine, vol. 23, no. 3, pp. 20-31, May 2006.

With the recent development of fluorescent probes and new high-resolution microscopes, biological imaging has entered a new era and is presently having a profound impact on the way research is being conducted in the life sciences. Biologists have come to depend more and more on imaging. They can now visualize subcellular components and processes in vivo, both structurally and functionally. Observations can be made in two or three dimensions, at different wavelengths (spectroscopy), possibly with time-lapse imaging to investigate cellular dynamics.

The observation of many biological processes relies on the ability to identify and locate specific proteins within their cellular environment. Cells are mostly transparent in their natural state and the immense number of molecules that constitute them are optically indistinguishable from one another. This makes the identification of a particular protein a very complex task—akin to finding a needle in a haystack.


Wavelets, sparsity and biomedical image reconstruction

M. Unser


Imaging Seminar, University of Bern, Inselspital, November 13, 2012.

Our purpose in this talk is to advocate the use of wavelets for advanced biomedical imaging. We start with a short tutorial on wavelet bases, emphasizing the fact that they provide a sparse representation of images. We then discuss a simple, but remarkably effective, image-denoising procedure that essentially amounts to discarding small wavelet coefficients (soft-thresholding). The crucial observation is that this type of “sparsity-promoting” algorithm is the solution of an ℓ1-norm minimization problem. The underlying principle of wavelet regularization is a powerful concept that has been used advantageously for compressed sensing and for reconstructing images from limited and/or noisy measurements. We illustrate the point by presenting wavelet-based algorithms for 3D deconvolution microscopy, and MRI reconstruction (with multiple coils and/or non-Cartesian k-space sampling). These methods were developed at the EPFL in collaboration with imaging scientists and are, for the most part, providing state-of-the-art performance.
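The denoising procedure can be sketched end-to-end with the simplest orthonormal wavelet, the Haar transform, chosen here only to keep the example self-contained; the talk's actual algorithms use richer wavelet bases and measured data, and the signal, noise level, and threshold below are illustrative choices.

```python
import numpy as np

def haar_fwd(x):
    """One level of the orthonormal Haar transform (length must be even)."""
    a = (x[0::2] + x[1::2]) / np.sqrt(2)   # approximation coefficients
    d = (x[0::2] - x[1::2]) / np.sqrt(2)   # detail coefficients
    return a, d

def haar_inv(a, d):
    x = np.empty(2 * a.size)
    x[0::2] = (a + d) / np.sqrt(2)
    x[1::2] = (a - d) / np.sqrt(2)
    return x

def soft(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def denoise(y, t, levels=4):
    # forward transform, soft-threshold the details, inverse transform
    a, details = y, []
    for _ in range(levels):
        a, d = haar_fwd(a)
        details.append(soft(d, t))
    for d in reversed(details):
        a = haar_inv(a, d)
    return a

rng = np.random.default_rng(4)
n = 256
x = np.repeat([0.0, 1.0, -0.5, 2.0], n // 4)   # piecewise constant: sparse in Haar
y = x + 0.1 * rng.standard_normal(n)
x_hat = denoise(y, t=0.3)
print(np.linalg.norm(y - x), np.linalg.norm(x_hat - x))
```

Because the clean signal has only a handful of nonzero Haar detail coefficients, discarding the small coefficients removes mostly noise, and the reconstruction error drops well below that of the raw measurements.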

Recent Advances in Biomedical Imaging and Signal Analysis

M. Unser


Proceedings of the Eighteenth European Signal Processing Conference (EUSIPCO'10), Ålborg, Denmark, August 23-27, 2010, EURASIP Fellow inaugural lecture.

Wavelets have the remarkable property of providing sparse representations of a wide variety of "natural" images. They have been applied successfully to biomedical image analysis and processing since the early 1990s.

In the first part of this talk, we explain how one can exploit the sparsifying property of wavelets to design more effective algorithms for image denoising and reconstruction, both in terms of quality and computational performance. This is achieved within a variational framework by imposing some ℓ1-type regularization in the wavelet domain, which favors sparse solutions. We discuss some corresponding iterative shrinkage-thresholding algorithms (ISTA) for sparse signal recovery and introduce a multi-level variant for greater computational efficiency. We illustrate the method with two concrete imaging examples: the deconvolution of 3-D fluorescence micrographs, and the reconstruction of magnetic resonance images from arbitrary (non-uniform) k-space trajectories.

In the second part, we show how to design new wavelet bases that are better matched to the directional characteristics of images. We introduce a general operator-based framework for the construction of steerable wavelets in any number of dimensions. This approach gives access to a broad class of steerable wavelets that are self-reversible and linearly parameterized by a matrix of shaping coefficients; it extends upon Simoncelli's steerable pyramid by providing much greater wavelet diversity. The basic version of the transform (higher-order Riesz wavelets) extracts the partial derivatives of order N of the signal (e.g., gradient or Hessian). We also introduce a signal-adapted design, which yields a PCA-like tight wavelet frame. We illustrate the capabilities of these new steerable wavelets for image analysis and processing (denoising).
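The building block of these constructions, the first-order Riesz transform, admits a short frequency-domain implementation. The sketch below is our own illustration (the odd grid size is chosen deliberately so that no Nyquist bin breaks the conjugate symmetry of the odd, purely imaginary frequency response):

```python
import numpy as np

def riesz_pair(f):
    """First-order Riesz transform of a 2-D signal, computed in the
    Fourier domain with frequency responses -i*wx/|w| and -i*wy/|w|."""
    ny, nx = f.shape
    wy = 2 * np.pi * np.fft.fftfreq(ny)[:, None]
    wx = 2 * np.pi * np.fft.fftfreq(nx)[None, :]
    w = np.sqrt(wx**2 + wy**2)
    w[0, 0] = 1.0                      # avoid 0/0 at DC (response is 0 there)
    F = np.fft.fft2(f)
    rx = np.real(np.fft.ifft2(-1j * wx / w * F))
    ry = np.real(np.fft.ifft2(-1j * wy / w * F))
    return rx, ry

# steerability: the component at orientation theta is the linear
# combination cos(theta)*rx + sin(theta)*ry of the two outputs
rng = np.random.default_rng(2)
f = rng.standard_normal((33, 33))
f -= f.mean()                          # zero-mean: DC carries no information here
rx, ry = riesz_pair(f)

# self-reversibility check: applying the transform twice gives
# R_x^2 + R_y^2 = -Identity, so rxx + ryy must equal -f
rxx, _ = riesz_pair(rx)
_, ryy = riesz_pair(ry)
print(np.max(np.abs(rxx + ryy + f)))
```

Higher-order steerable wavelets are obtained by iterating this operator and combining the resulting channels, which is what gives access to gradient- and Hessian-like analyses.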

Sampling and Interpolation for Biomedical Imaging

M. Unser


2006 IEEE International Symposium on Biomedical Imaging, April 6-9, 2006, Arlington, Virginia, USA.

This tutorial will explain the modern, Hilbert-space approach for the discretization (sampling) and reconstruction (interpolation) of images (in two or higher dimensions). The emphasis will be on quality and optimality, which are important considerations for biomedical applications.
The main point in the modern formulation is that the signal model need not be bandlimited. In fact, it makes much better sense computationally to consider spline or wavelet-like representations that involve much shorter (e.g. compactly supported) basis functions that are shifted replicates of a single prototype (e.g., B-spline). We will show how Shannon's standard sampling paradigm can be adapted for dealing with such representations. In essence, this boils down to modifying the classical "anti-aliasing" prefilter so that it is optimally matched to the representation space (in practice, this can be accomplished by suitable digital post-filtering). We will also discuss efficient digital-filter-based solutions for high-quality image interpolation. Another important issue will be the assessment of interpolation quality and the identification of basis functions (and interpolators) that offer the best performance for a given computational budget. These concepts will be illustrated with various applications in biomedical imaging: tomographic reconstruction, 3D slicing and re-formatting, estimation of image differentials for feature extraction, and image registration (both rigid-body and elastic).
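The matched-prefilter idea can be illustrated in 1-D with cubic B-splines. Since the centered cubic B-spline satisfies β³(0) = 2/3 and β³(±1) = 1/6, finding the coefficients whose spline passes through given samples amounts to solving a tridiagonal system. The sketch below uses zero boundary conditions and a dense solver for clarity; it is not the optimized recursive-filter implementation alluded to in the tutorial.

```python
import numpy as np

def b3(x):
    """Centered cubic B-spline, supported on [-2, 2]."""
    x = np.abs(np.asarray(x, dtype=float))
    out = np.zeros_like(x)
    m1 = x < 1
    m2 = (x >= 1) & (x < 2)
    out[m1] = 2/3 - x[m1]**2 + x[m1]**3 / 2
    out[m2] = (2 - x[m2])**3 / 6
    return out

# interpolation coefficients c solve  sum_k c[k] * b3(n - k) = f[n];
# with b3(0) = 2/3 and b3(+-1) = 1/6 this is a tridiagonal "prefilter" system
rng = np.random.default_rng(5)
n = 32
f = rng.standard_normal(n)
idx = np.arange(n)
T = np.zeros((n, n))
T[idx, idx] = 2/3
T[idx[:-1], idx[:-1] + 1] = 1/6
T[idx[1:], idx[1:] - 1] = 1/6
c = np.linalg.solve(T, f)

def spline_eval(c, x):
    # at most 4 shifted B-splines overlap any point x
    k = np.arange(c.size)
    return np.array([np.sum(c * b3(xi - k)) for xi in np.atleast_1d(x)])

# the prefiltered spline reproduces the samples exactly at the integers
err = np.max(np.abs(spline_eval(c, idx.astype(float)) - f))
print(err)
```

Between the samples, the same coefficients yield a smooth (C²) interpolant, and the shifted B-splines sum to one everywhere (partition of unity), which is what guarantees the reproduction of constants.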

Image Processing with ImageJ

M. Abramoff, P. Magalhães, S. Ram


Biophotonics International, vol. 11, no. 7, pp. 36-42, July 2004.

As the popularity of the ImageJ open-source, Java-based imaging program grows, its capabilities increase, too. It is now being used for imaging applications ranging from skin analysis to neuroscience.

A Review of Wavelets in Biomedical Applications

M. Unser, A. Aldroubi

Proceedings of the IEEE, vol. 84, no. 4, pp. 626-638, April 1996.

In this paper, we present an overview of the various uses of the wavelet transform (WT) in medicine and biology. We start by describing the wavelet properties that are the most important for biomedical applications. In particular, we provide an interpretation of the continuous WT as a prewhitening multi-scale matched filter. We also briefly indicate the analogy between the WT and some of the biological processing that occurs in the early components of the auditory and visual system. We then review the uses of the WT for the analysis of one-dimensional physiological signals obtained by phonocardiography, electrocardiography (ECG), and electroencephalography (EEG), including evoked response potentials. Next, we provide a survey of recent wavelet developments in medical imaging. These include biomedical image processing algorithms (e.g., noise reduction, image enhancement, and detection of microcalcifications in mammograms); image reconstruction and acquisition schemes (tomography, and magnetic resonance imaging (MRI)); and multiresolution methods for the registration and statistical analysis of functional images of the brain (positron emission tomography (PET), and functional MRI). In each case, we provide the reader with some general background information and a brief explanation of how the methods work. The paper also includes an extensive bibliography.

Wavelets in Medicine and Biology

A. Aldroubi, M.A. Unser, Eds.


ISBN 0-8493-9483-X, CRC Press, Boca Raton FL, USA, 1996, 616 p.

For the first time, the field's leading international experts have come together to produce a complete guide to wavelet transform applications in medicine and biology. This book provides guidelines for all those interested in learning about wavelets and their applications to biomedical problems.

The introductory material is written for non-experts and includes basic discussions of the theoretical and practical foundations of wavelet methods. This is followed by contributions from the most prominent researchers in the field, giving the reader a complete survey of the use of wavelets in biomedical engineering.

The book consists of four main sections:

  • Wavelet Transform: Theory and Implementation
  • Wavelets in Medical Imaging and Tomography
  • Wavelets and Biomedical Signal Processing
  • Wavelets and Mathematical Models in Biology

Review by A. Bultheel, Journal of Approximation Theory, vol. 90, no. 3, pp. 458-459, September 1997: "Wavelets have built a strong reputation in the context of signal and image processing. The editors of this book have invited several specialists to contribute a chapter illustrating this in the (bio)medical and biological sciences."

© 2017 EPFL • webmaster.big@epfl.ch • 20.07.2017