Parseval Convolution Neural Networks with Application to Imaging
M. Unser
Summer School on Applied Harmonic Analysis and Machine Learning 2024 (AHAML'24), Genova, Italian Republic, September 2-6, 2024.
One of the reasons why neural networks can occasionally hallucinate is their lack of stability. Our proposed remedy is to constrain each layer to be non-expansive; i.e., to have a Lipschitz constant no greater than 1. We can further constrain the linear layers to be energy-preserving. This motivates us to thoroughly characterize the class of Parseval convolution operators. We then present a constructive approach for the design/specification of such filterbanks via the chaining of elementary Parseval modules, each parameterized by an orthogonal matrix or a 1-tight frame. We demonstrate the use of these tools with the design of a CNN-based algorithm for the iterative reconstruction of biomedical images. Our algorithm falls within the plug-and-play framework for the resolution of inverse problems. It yields better-quality results than the sparsity-based methods used in compressed sensing, while offering essentially the same convergence and robustness guarantees.
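As a rough illustration of the energy-preservation property that such Parseval layers enforce, the following minimal NumPy sketch chains two elementary channel-mixing modules, each given by an orthogonal matrix. The random orthogonal matrices and the function names (random_orthogonal, mix_channels) are placeholders for illustration only; they are not the parameterization or the filterbank construction used in the paper.

import numpy as np

rng = np.random.default_rng(0)

def random_orthogonal(n):
    # Placeholder orthogonal matrix obtained from the QR factorization
    # of a Gaussian random matrix (illustrative, not the paper's scheme).
    q, _ = np.linalg.qr(rng.standard_normal((n, n)))
    return q

def mix_channels(x, Q):
    # Energy-preserving 1x1 convolution: mix the channels of x (C, H, W) by Q.
    return np.einsum('ij,jhw->ihw', Q, x)

# Chaining elementary energy-preserving modules keeps the composition
# energy-preserving (Lipschitz constant exactly 1).
C, H, W = 3, 8, 8
x = rng.standard_normal((C, H, W))
Q1, Q2 = random_orthogonal(C), random_orthogonal(C)
y = mix_channels(mix_channels(x, Q1), Q2)
print(np.allclose(np.linalg.norm(x), np.linalg.norm(y)))  # True: ||y|| = ||x||

Because each module is orthogonal, the chained operator preserves the norm of its input, which is the mechanism the abstract invokes to obtain non-expansive, stable layers.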
@INPROCEEDINGS(http://bigwww.epfl.ch/publications/unser2401.html,
AUTHOR="Unser, M.",
TITLE="Parseval Convolution Neural Networks with Application to
        Imaging",
BOOKTITLE="Summer School on Applied Harmonic Analysis and Machine
        Learning 2024 ({AHAML'24})",
YEAR="2024",
ADDRESS="Genova, Italian Republic",
MONTH="September 2-6")