Biomedical Imaging Group

Seminars

Adversarially-Sandwiched VAEs for Inverse Problems (02 Oct 2018)

One of the main challenges in inverse problems is modelling (or learning) the data prior. Recently, neural-network-based generative modelling has shown an impressive ability to estimate this data distribution. These methods use a latent-variable parametrisation of the estimated distribution, which is well suited to real-world signals. In this talk, we first briefly discuss the two pillars of generative modelling: Generative Adversarial Networks (GANs) and Variational Autoencoders (VAEs).

In GANs, a generator produces samples from latent variables, while a discriminator is trained to distinguish these generated (fake) samples from real ones. The generator, in turn, is trained to produce realistic-looking samples that fool the discriminator. This procedure is equivalent to minimising the Jensen-Shannon divergence (JSD) between the actual and the estimated distributions. However, GANs suffer from several problems: they are hard to train, they lack an encoding architecture that produces a latent representation of the data, and, more importantly, they do not explicitly provide the estimated likelihood of the data.

VAEs are encoder-decoder networks that are much easier to train and that explicitly estimate a lower bound on the likelihood. They are trained by maximising this lower bound on the estimated log-likelihood of the data, which is equivalent to minimising the Kullback-Leibler divergence (KLD) between the actual and the estimated distributions. However, the KLD, unlike the JSD, is asymmetric, which may lead to inferior results.

Finally, I will propose a new scheme for training VAEs, in which the log-likelihood is sandwiched between an upper and a lower bound. For a given sample from the decoder, a discriminator (or adversary) decides whether the estimated likelihood of the sample is higher or lower than the actual likelihood. In the former case, an upper bound on the likelihood is minimised; in the latter, a lower bound is maximised. We show that this scheme, like GANs, is equivalent to minimising an upper bound on the JSD between the actual and the estimated distributions, and that it reaches its global minimum if and only if the two distributions are equal.
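The contrast drawn above between the asymmetric KLD and the symmetric JSD can be checked directly on discrete distributions. The following is a small illustrative sketch (not part of the talk's method); the function names are chosen here for clarity:

```python
import math

def kld(p, q):
    # Kullback-Leibler divergence D_KL(p || q) for discrete distributions.
    # Terms with p_i = 0 contribute 0 by the usual convention.
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def jsd(p, q):
    # Jensen-Shannon divergence: the symmetrised, bounded smoothing of the KLD,
    # computed against the mixture m = (p + q) / 2.
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    return 0.5 * kld(p, m) + 0.5 * kld(q, m)

p = [0.7, 0.2, 0.1]
q = [0.1, 0.3, 0.6]

print(kld(p, q), kld(q, p))  # the two directions differ: KLD is asymmetric
print(jsd(p, q), jsd(q, p))  # identical: JSD is symmetric (and bounded by ln 2)
```

The asymmetry means that fitting a model by minimising KLD(data, model) penalises errors differently than KLD(model, data), whereas the JSD treats both distributions on an equal footing.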

PSF-Extractor: from fluorescent bead measurements to a continuous PSF model (11 Sep 2018)

Analysis of Planar Shapes through Shape Dictionary Learning with an Extension to Splines (28 Aug 2018)

Complex-order scale-invariant operators and self-similar processes (21 Aug 2018)

Variational Framework for Continuous Angular Refinement and Reconstruction in Cryo-EM (14 Aug 2018)

Looking beyond Pixels: Theory, Algorithms and Applications of Continuous Sparse Recovery (07 Aug 2018)

An L1 representer theorem for vector-valued learning (17 Jul 2018)

Computational Super-Sectioning for Single-Slice Structured-Illumination Microscopy (19 Jun 2018)

Theoretical and Numerical Analysis of Super-Resolution without Grid (19 Jun 2018)

Fast rotational dictionary learning using steerability (08 May 2018)

Hybrid spline dictionaries for continuous-domain inverse problems (24 Apr 2018)

Fast Multiresolution Reconstruction for Cryo-EM (17 Apr 2018)

Direct Reconstruction of Clipped Peaks in Bandlimited OFDM Signals (13 Mar 2018)

Sparsity-based techniques for diffraction tomography (27 Feb 2018)

Structured Illumination and the Analysis of Single Molecules in Cells (09 Feb 2018)

Periodic Splines and Gaussian Processes for the Resolution of Linear Inverse Problems (30 Jan 2018)

Fast Piecewise-Affine Motion Estimation Without Segmentation (19 Dec 2017)

Continuous Representations in Bioimage Analysis: a Bridge from Pixels to the Real World (12 Dec 2017)

Steer&Detect on Images (14 Nov 2017)

Fundamental computational barriers in inverse problems and the mathematics of information (27 Oct 2017)

Variational use of B-splines and Kernel-Based Functions (27 Oct 2017)

Deep-learning-based data manifold projection - a new regularization for inverse problems (17 Oct 2017)

GlobalBioIm Lib v2: new tools, more flexibility, and improved composition rules (03 Oct 2017)

Fractional Integral Transforms and Time-Frequency Representations (02 Jun 2017)

First steps toward fast PET reconstruction (30 May 2017)

Lipid membranes and surface reconstruction - a biologically inspired method for 3D segmentation (16 May 2017)

Optical Diffraction Tomography: Principles and Algorithms (09 May 2017)

Compressed Sensing for Dose Reduction in STEM Tomography (11 Apr 2017)

Chasing Mycobacteria (10 Apr 2017)