In this talk, I will present CNN-based projection onto the data manifold as a new regularization scheme. Classical and iterative algorithms regularize inverse problems by imposing priors that do not hold for real-world data, such as smoothness via the $\ell_2$-norm or sharp edges via the TV-norm. As the ill-posedness of the problem increases, the resulting solutions drift farther from the true ones. CNNs trained as high-dimensional (image-to-image) regressors have recently been used to solve inverse problems in imaging efficiently. However, these approaches not only lack any regularization but also cannot enforce data fidelity; they are therefore unreliable both in the prior they impose and in the measurement consistency of their solutions. I will show that our scheme is built on the framework of projected gradient descent (PGD), where the projector is replaced by a trained CNN. The gradient step enforces data fidelity, while the CNN repeatedly projects the iterate closer to the space of desired reconstruction images. Since the projector is replaced with a CNN, the standard convergence guarantees of PGD no longer apply; I will therefore present a relaxed PGD, which always converges. I will also discuss a simple scheme to train a CNN to act like a projector. Finally, I will present experiments on sparse-view computed tomography (CT) reconstruction, for both noiseless and noisy measurements, which show an improvement over the total-variation (TV) method and a recent CNN-based technique.
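To make the iteration concrete, here is a minimal sketch of a relaxed PGD loop. All names (`relaxed_pgd`, `project`, the step size `alpha`, and the shrink factor `c`) are my own placeholders, and the relaxation rule shown (damping the update weight whenever the projector moves the iterate more than in the previous step) is a simplified stand-in for the convergence-guaranteeing scheme discussed in the talk; in the real method the projector would be the trained CNN rather than a simple clipping operator.

```python
import numpy as np

def relaxed_pgd(H, y, project, x0, alpha=0.2, c=0.99, n_iter=100):
    """Relaxed projected gradient descent (illustrative sketch).

    H       : measurement matrix
    y       : measurements
    project : projector onto the reconstruction space
              (a trained CNN in the talk; any callable here)
    """
    x = x0.copy()
    lam = 1.0               # relaxation weight on the projected update
    prev_dist = np.inf
    for _ in range(n_iter):
        grad = H.T @ (H @ x - y)        # gradient of 0.5 * ||Hx - y||^2
        z = project(x - alpha * grad)   # gradient step, then projection
        dist = np.linalg.norm(z - x)
        if dist > c * prev_dist:        # projector moved too much:
            lam *= c                    # damp the update to force convergence
        prev_dist = dist
        x = (1.0 - lam) * x + lam * z   # relaxed update
    return x
```

With a nonexpansive projector the relaxation never triggers and the loop reduces to plain PGD; the damping only kicks in when the learned projector misbehaves, which is what allows a convergence guarantee even though a CNN is not a true projection.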