BIOMEDICAL IMAGING GROUP (BIG)
Laboratoire d'imagerie biomédicale (LIB)

Analysis of 1-Lipschitz Neural Networks

S. Neumayer, P. Bohra, S. Ducotterd, A. Goujon, D. Perdios, M. Unser

Proceedings of the 2022 Oberwolfach Workshop on Mathematical Imaging and Surface Processing (OWMISP'22), Oberwolfach, Federal Republic of Germany, August 21-27, 2022, vol. 2022/38, pp. 2257–2259.


The topics covered in this talk relate to the recent preprint [1]. Lipschitz-constrained neural networks have several advantages over unconstrained ones and can be applied to a variety of problems; consequently, they have recently attracted considerable attention in the deep-learning community. Since designing and training expressive Lipschitz-constrained networks is very challenging, there is a need for improved methods and a better theoretical understanding. As the general case is very demanding, we restrict our attention to feed-forward neural networks with 1-Lipschitz component-wise activation functions and weight matrices whose p-norm is at most one. Such networks are indeed 1-Lipschitz, which naturally raises the question of their expressiveness. Unfortunately, it turns out that networks with ReLU activation functions have provable disadvantages in this setting. First, they cannot represent even simple piecewise-linear functions such as the hat function. Second, there exists a whole class of relatively simple functions that they cannot approximate in the uniform norm on bounded boxes. To show this, we make use of the second-order total variation and the fact that such ReLU networks can only produce functions with bounded second-order total variation.
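As a rough illustration of the construction described above (a sketch only, not the authors' code; all names are hypothetical), the following NumPy snippet builds a small feed-forward network in which each weight matrix is rescaled to have spectral (2-)norm at most one and each activation is a component-wise ReLU, which is 1-Lipschitz. Since the Lipschitz constant of a composition is bounded by the product of the layer constants, the resulting network is 1-Lipschitz, a fact the snippet checks empirically on random point pairs.

```python
import numpy as np

rng = np.random.default_rng(0)

def project_spectral(W):
    """Rescale W so that its spectral norm (largest singular value) is <= 1."""
    s = np.linalg.norm(W, 2)  # ord=2 on a matrix gives the largest singular value
    return W / max(s, 1.0)

# Toy architecture: 2 -> 16 -> 16 -> 1, spectral-norm-constrained weights.
dims = [2, 16, 16, 1]
Ws = [project_spectral(rng.standard_normal((dims[i + 1], dims[i])))
      for i in range(len(dims) - 1)]
bs = [rng.standard_normal(d) for d in dims[1:]]

def net(x):
    # Hidden layers: affine map followed by ReLU (1-Lipschitz); biases do not
    # affect the Lipschitz constant.
    for W, b in zip(Ws[:-1], bs[:-1]):
        x = np.maximum(W @ x + b, 0.0)
    return Ws[-1] @ x + bs[-1]

# Empirical check: |f(x) - f(y)| <= ||x - y||_2 for random pairs.
xs = rng.standard_normal((1000, 2))
ys = rng.standard_normal((1000, 2))
ratios = [np.linalg.norm(net(x) - net(y)) / np.linalg.norm(x - y)
          for x, y in zip(xs, ys)]
print(max(ratios))  # bounded by 1 up to numerical precision
```

Note that this empirical bound says nothing about expressiveness: the talk's point is precisely that, within this constrained family, ReLU activations cannot reproduce targets as simple as the hat function.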

References

  1. S. Neumayer, A. Goujon, P. Bohra, M. Unser, "Approximation of Lipschitz Functions using Deep Spline Neural Networks," arXiv:2204.06233 [cs.LG]

@INPROCEEDINGS(http://bigwww.epfl.ch/publications/neumayer2202.html,
AUTHOR="Neumayer, S. and Bohra, P. and Ducotterd, S. and Goujon, A. and
	Perdios, D. and Unser, M.",
TITLE="Analysis of 1-{L}ipschitz Neural Networks",
BOOKTITLE="Proceedings of the 2022 {O}berwolfach Workshop on
	Mathematical Imaging and Surface Processing ({OWMISP'22})",
YEAR="2022",
VOLUME="2022/38",
PAGES="2257--2259",
ADDRESS="Oberwolfach, Federal Republic of Germany",
MONTH="August 21-27,")
© 2022 MFO. Personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution to servers or lists, or to reuse any copyrighted component of this work in other works must be obtained from MFO. This material is presented to ensure timely dissemination of scholarly and technical work. Copyright and all rights therein are retained by authors or by other copyright holders. All persons copying this information are expected to adhere to the terms and constraints invoked by each author's copyright. In most cases, these works may not be reposted without the explicit permission of the copyright holder.