Biomedical Imaging Group

Functional Optimization Methods for Machine Learning

M. Unser

Summer School on Mathematics and Machine Learning for Image Analysis 2024 (MMLIA'24), Bologna, Italy, June 4-12, 2024.


In this mini-course, we show how various forms of supervised learning can be recast as optimization problems over suitable function spaces, subject to regularity constraints. Our family of regularization functionals has two components:

  1. a regularization operator, which can be composed with an optional projection mechanism (Radon transform), and
  2. a (semi-)norm, which may be Hilbertian (RKHS) or sparsity-promoting (total variation).
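The two components above combine into a generic variational learning problem. As a sketch (symbols illustrative, in the spirit of the representer-theorem papers cited below): $E$ is a convex loss, $\mathrm{L}$ the regularization operator (optionally composed with a Radon transform), and $\psi$ an increasing convex function of the (semi-)norm, which is either Hilbertian (RKHS) or sparsity-promoting (total variation):

```latex
% Generic supervised-learning problem over a function space \mathcal{X} (sketch)
\min_{f \in \mathcal{X}} \ \sum_{n=1}^{N} E\bigl(y_n, f(\boldsymbol{x}_n)\bigr)
  \;+\; \psi\bigl( \lVert \mathrm{L}\, f \rVert \bigr)
```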

By invoking an abstract representer theorem, we obtain an explicit parametrization of the extremal points of the solution set. The latter translates into a concrete neural architecture and training procedure. We demonstrate the use of this variational formalism on a variety of examples, including several variants of spline-based regression. We also draw connections with classical kernel-based techniques and modern ReLU neural networks. Finally, we show how our framework is applicable to the learning of non-linearities in deep and not-so-deep networks.
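For the Hilbertian (RKHS) branch, the representer theorem reduces the infinite-dimensional problem to a finite linear system: the minimizer is a kernel expansion around the data points. A minimal kernel-ridge sketch of this mechanism (the Gaussian kernel and all names here are illustrative assumptions, not code from the course):

```python
import numpy as np

def gaussian_kernel(x, y, sigma=1.0):
    """Reproducing kernel of a Gaussian RKHS (illustrative choice)."""
    return np.exp(-((x[:, None] - y[None, :]) ** 2) / (2.0 * sigma**2))

def fit_rkhs(x_train, y_train, lam=1e-4, sigma=0.2):
    """Representer theorem, Hilbertian case: the minimizer of
    sum_n (y_n - f(x_n))^2 + lam * ||f||_H^2 has the finite form
    f(x) = sum_n a_n k(x, x_n), with (K + lam*I) a = y."""
    K = gaussian_kernel(x_train, x_train, sigma)
    a = np.linalg.solve(K + lam * np.eye(len(x_train)), y_train)
    return lambda x: gaussian_kernel(x, x_train, sigma) @ a

# Usage: regress noisy samples of a smooth function
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 20)
y = np.sin(2.0 * np.pi * x) + 0.05 * rng.standard_normal(20)
f = fit_rkhs(x, y)
```

Replacing the Hilbertian norm with a sparsity-promoting one changes the form of the extremal points from a kernel expansion to a sparse sum of atoms, which is what connects this formalism to spline fits and ReLU networks.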

Bibliography

  1. M. Unser, "A Unifying Representer Theorem for Inverse Problems and Machine Learning," Foundations of Computational Mathematics, vol. 21, no. 4, pp. 941–960, August 2021.

  2. M. Unser, "From Kernel Methods to Neural Networks: A Unifying Variational Formulation," Foundations of Computational Mathematics, in press.

@INPROCEEDINGS{http://bigwww.epfl.ch/publications/unser2402.html,
AUTHOR="Unser, M.",
TITLE="Functional Optimization Methods for Machine Learning",
BOOKTITLE="Summer School on Mathematics and Machine Learning for Image
	Analysis 2024 ({MMLIA'24})",
YEAR="2024",
address="Bologna, Italy",
month="June 4-12"}
© 2024 . Personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution to servers or lists, or to reuse any copyrighted component of this work in other works must be obtained from . This material is presented to ensure timely dissemination of scholarly and technical work. Copyright and all rights therein are retained by authors or by other copyright holders. All persons copying this information are expected to adhere to the terms and constraints invoked by each author's copyright. In most cases, these works may not be reposted without the explicit permission of the copyright holder.