Improving Lipschitz-Constrained Neural Networks by Learning Activation Functions
S. Ducotterd, A. Goujon, P. Bohra, D. Perdios, S. Neumayer, M. Unser
Journal of Machine Learning Research, vol. 25, no. 65, pp. 1–30, 2024.
Lipschitz-constrained neural networks have several advantages over unconstrained ones and can be applied to a variety of problems, making them a topic of growing interest in the deep learning community. Unfortunately, it has been shown both theoretically and empirically that they perform poorly when equipped with ReLU activation functions. By contrast, neural networks with learnable 1-Lipschitz linear splines are known to be more expressive. In this paper, we show that such networks correspond to global optima of a constrained functional optimization problem that consists of the training of a neural network composed of 1-Lipschitz linear layers and 1-Lipschitz freeform activation functions with second-order total-variation regularization. Further, we propose an efficient method to train these neural networks. Our numerical experiments show that our trained networks compare favorably with existing 1-Lipschitz neural architectures.
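To make the abstract's recipe concrete, the following is a minimal PyTorch sketch, not the authors' released code; the names LinearSpline, LipschitzNet, and tv2 are illustrative. It combines linear layers made 1-Lipschitz via spectral normalization (one standard option; the abstract does not specify the paper's constraint scheme), a learnable linear-spline activation kept 1-Lipschitz by clamping its slopes to [-1, 1], and a second-order total-variation penalty, which for a linear spline reduces to the sum of absolute differences of consecutive slopes. For brevity, one spline is shared per layer, whereas the paper learns per-neuron activations.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F
    from torch.nn.utils.parametrizations import spectral_norm


    class LinearSpline(nn.Module):
        # Learnable linear spline on a uniform grid over [-grid, grid],
        # parameterized by its slope on each knot interval. Clamping the
        # slopes to [-1, 1] keeps the activation 1-Lipschitz; outside the
        # grid, the spline is extended linearly with the boundary slopes.
        def __init__(self, num_knots=21, grid=2.0):
            super().__init__()
            self.num_knots = num_knots
            self.left = -grid
            self.step = 2 * grid / (num_knots - 1)
            self.slopes = nn.Parameter(torch.ones(num_knots - 1))  # identity init

        def forward(self, x):
            s = self.slopes.clamp(-1.0, 1.0)  # enforce |slope| <= 1
            # Spline values at the knots, anchored so that f(left) = left.
            knot_y = torch.cat([x.new_full((1,), self.left),
                                self.left + torch.cumsum(s * self.step, dim=0)])
            u = (x - self.left) / self.step          # position in grid units
            idx = u.clamp(0, self.num_knots - 2).floor().long()
            return knot_y[idx] + s[idx] * (u - idx.to(u.dtype)) * self.step

        def tv2(self):
            # TV(2) of a linear spline = sum of absolute slope differences.
            s = self.slopes.clamp(-1.0, 1.0)
            return (s[1:] - s[:-1]).abs().sum()


    class LipschitzNet(nn.Module):
        def __init__(self, widths=(784, 128, 128, 10)):
            super().__init__()
            # spectral_norm rescales each weight matrix to (approximately)
            # unit spectral norm, making each linear layer 1-Lipschitz.
            self.linears = nn.ModuleList(
                spectral_norm(nn.Linear(a, b))
                for a, b in zip(widths[:-1], widths[1:]))
            self.acts = nn.ModuleList(LinearSpline() for _ in widths[1:-1])

        def forward(self, x):
            for lin, act in zip(self.linears[:-1], self.acts):
                x = act(lin(x))
            return self.linears[-1](x)

        def tv2_penalty(self):
            return sum(act.tv2() for act in self.acts)


    # Training step: task loss plus the TV(2) penalty on the splines.
    model = LipschitzNet()
    x, y = torch.randn(8, 784), torch.randint(0, 10, (8,))
    loss = F.cross_entropy(model(x), y) + 1e-4 * model.tv2_penalty()
    loss.backward()

Clamping the slopes is the simplest way to keep the spline 1-Lipschitz throughout training, and the TV(2) penalty pushes neighboring slopes to agree, which favors activation functions with few effective knots.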
@ARTICLE(http://bigwww.epfl.ch/publications/ducotterd2401.html,
  AUTHOR  = "Ducotterd, S. and Goujon, A. and Bohra, P. and Perdios, D. and Neumayer, S. and Unser, M.",
  TITLE   = "Improving {L}ipschitz-Constrained Neural Networks by Learning Activation Functions",
  JOURNAL = "Journal of Machine Learning Research",
  YEAR    = "2024",
  VOLUME  = "25",
  NUMBER  = "65",
  PAGES   = "1--30")