Learning Lipschitz-Controlled Activation Functions in Neural Networks for Plug-and-Play Image Reconstruction Methods
P. Bohra, D. Perdios, A. Goujon, S. Emery, M. Unser
Proceedings of the Third Workshop on Deep Learning and Inverse Problems (NeurIPS'21), Virtual, December 13, 2021, pp. 1–9.
Ill-posed linear inverse problems are frequently encountered in image reconstruction tasks. Image reconstruction methods that combine the Plug-and-Play (PnP) priors framework with convolutional neural network (CNN)-based denoisers have shown impressive performance. However, it is non-trivial to guarantee the convergence of such algorithms, which is necessary for sensitive applications such as medical imaging. It has been shown that PnP algorithms converge when deployed with a certain class of averaged denoising operators. While such averaged operators can be built from 1-Lipschitz CNNs, imposing such a constraint on CNNs usually leads to a severe drop in performance. To mitigate this effect, we propose the use of deep spline neural networks, which benefit from learnable piecewise-linear spline activation functions. We introduce "slope normalization" to control the Lipschitz constant of these activation functions. We show that averaged denoising operators built from 1-Lipschitz deep spline networks consistently outperform those built from 1-Lipschitz ReLU networks.
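The core idea of slope normalization can be illustrated with a minimal sketch. A piecewise-linear activation is 1-Lipschitz exactly when every segment slope has absolute value at most 1, so rescaling the slope vector by its largest magnitude enforces the constraint. The function names and the specific rescaling below are illustrative assumptions, not the paper's implementation, which applies the idea inside the training of deep spline networks.

```python
import numpy as np

def normalize_slopes(slopes):
    # Illustrative "slope normalization": rescale the per-segment slopes so
    # that the largest absolute slope is at most 1. Slopes already within
    # [-1, 1] are left untouched (we divide by max(1, max|slope|)).
    max_slope = np.max(np.abs(slopes))
    return slopes / max(1.0, max_slope)

def pwl_activation(x, knots, slopes, bias=0.0):
    # Continuous piecewise-linear function written as a sum of ReLUs:
    #   f(x) = bias + s_0 * x + sum_i (s_{i+1} - s_i) * relu(x - t_i),
    # where slopes has one more entry than knots (one slope per segment).
    x = np.asarray(x, dtype=float)
    y = bias + slopes[0] * x
    for i, t in enumerate(knots):
        y += (slopes[i + 1] - slopes[i]) * np.maximum(x - t, 0.0)
    return y

# Example: a three-segment activation with knots at -1 and 1. The raw slopes
# violate the 1-Lipschitz constraint; after normalization they satisfy it.
raw_slopes = np.array([0.5, 3.0, -2.0])
safe_slopes = normalize_slopes(raw_slopes)  # largest |slope| is now exactly 1
```

The resulting activation is 1-Lipschitz by construction, which is the property needed (together with spectrally constrained linear layers) to build the averaged denoising operators discussed in the abstract.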
@INPROCEEDINGS{bohra2101, AUTHOR="Bohra, P. and Perdios, D. and Goujon, A. and Emery, S. and Unser, M.", TITLE="Learning {L}ipschitz-Controlled Activation Functions in Neural Networks for Plug-and-Play Image Reconstruction Methods", BOOKTITLE="Proceedings of the Third Workshop on Deep Learning and Inverse Problems ({NeurIPS'21})", YEAR="2021", pages="1--9", address="Virtual", month="December", url="http://bigwww.epfl.ch/publications/bohra2101.html"}