BIOMEDICAL IMAGING GROUP (BIG)
Laboratoire d'imagerie biomédicale (LIB)

Neural Networks and Minimum-Norm Ridge Splines

M. Unser

Keynote address, Proceedings of the HCM Workshop: Synergies Between Data Sciences and PDE Analysis (HCM'22), Bonn, Federal Republic of Germany, June 13-17, 2022, pp. 1.


A powerful framework for supervised learning is the minimization of a cost that consists of a data-fidelity term plus a regularization functional. In this talk, I investigate a Radon-domain regularization functional that depends on a generic operator L. The proposed formulation yields a solution that takes the form of a two-layer neural network whose activation function is determined by the regularization operator. In particular, one retrieves the popular ReLU networks by taking L to be the Laplacian. The proposed setting offers guarantees of universal approximation for a broad family of regularization operators or, equivalently, for a wide variety of shallow neural networks, including cases (such as ReLU) where the activation function grows polynomially. It also explains the favorable role of bias and skip connections in neural architectures.
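
As a hedged sketch of the setting (the notation below follows the related ridge-spline literature and is an assumption on our part, not a formula quoted from the talk), the learning problem reads

    \min_{f} \; \sum_{m=1}^{M} E\bigl(y_m, f(\mathbf{x}_m)\bigr) + \lambda \, \bigl\| \mathrm{L}\, \mathscr{R} f \bigr\|_{\mathcal{M}},

where E is the data-fidelity term, \mathscr{R} the Radon transform, L the generic regularization operator, and \|\cdot\|_{\mathcal{M}} a total-variation norm in the sense of measures. The representer-theorem statement is that this problem admits minimizers of the two-layer form

    f(\mathbf{x}) = \sum_{k=1}^{K} a_k \, \sigma\bigl(\mathbf{w}_k^{\mathsf{T}} \mathbf{x} - b_k\bigr) + \mathbf{u}^{\mathsf{T}} \mathbf{x} + c, \qquad K \le M,

with an activation \sigma fixed by L (the ReLU when L is the Laplacian, as stated above) and an affine part \mathbf{u}^{\mathsf{T}} \mathbf{x} + c lying in the null space of the regularizer; that affine part is exactly the bias/skip-connection component whose role the abstract highlights.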

@INPROCEEDINGS(http://bigwww.epfl.ch/publications/unser2202.html,
AUTHOR="Unser, M.",
TITLE="Neural Networks and Minimum-Norm Ridge Splines",
BOOKTITLE="Proceedings of the {HCM} Workshop: {S}ynergies Between Data
	Sciences and {PDE} Analysis ({HCM'22})",
YEAR="2022",
pages="1",
address="Bonn, Federal Republic of Germany",
month="June 13-17",
note="Keynote address")
© 2022 HCM. Personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution to servers or lists, or to reuse any copyrighted component of this work in other works must be obtained from HCM. This material is presented to ensure timely dissemination of scholarly and technical work. Copyright and all rights therein are retained by authors or by other copyright holders. All persons copying this information are expected to adhere to the terms and constraints invoked by each author's copyright. In most cases, these works may not be reposted without the explicit permission of the copyright holder.