BIOMEDICAL IMAGING GROUP (BIG)
Laboratoire d'imagerie biomédicale (LIB)
Seminars


Seminar 00273

Fundamental computational barriers in inverse problems and the mathematics of information
Alexander Bastounis, Cambridge University

Seminar • 27 October 2017

Abstract
Two of the most influential recent developments in applied mathematics are neural networks and compressed sensing. Compressed sensing (e.g., via basis pursuit or the lasso) has seen considerable success at solving inverse problems, and neural networks are rapidly becoming commonplace in everyday life, with use cases ranging from self-driving cars to automated music production. The observed success of these approaches would suggest that solving the underlying mathematical models on a computer is both well understood and computationally efficient. We will demonstrate that this is not the case. Instead, we show the following paradox: it is impossible to design algorithms that solve these problems to one significant figure when given inaccurate input data, even when the inaccuracies can be made arbitrarily small. This occurs even when the input data is, in many senses, well conditioned, and it shows that every existing algorithm will fail on some simple inputs. Further analysis of the situation for neural networks leads to the following additional ‘paradoxes of deep learning’: (1) one cannot guarantee the existence of algorithms that accurately train a neural network, and (2) one can have a 100% success rate on arbitrarily many test cases, yet uncountably many misclassifications on elements that are arbitrarily close to the training set. Explaining the apparent contradiction between the observed success of compressed sensing, the lasso, and neural networks on real-world examples and the aforementioned non-existence result will require the development of new mathematical ideas and tools. We shall explain some of these ideas and give further information on all of the above paradoxes during the talk.
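
For readers unfamiliar with the lasso setting referred to above, the following is a minimal sketch, not taken from the talk, of sparse recovery from underdetermined linear measurements y = Ax using scikit-learn's Lasso solver; the problem sizes, sparsity level, regularisation weight, and perturbation magnitude are illustrative assumptions.

# A minimal, self-contained sketch (illustrative assumptions, not material from the talk):
# recover a k-sparse vector x from underdetermined measurements y = A x by solving
# the lasso problem  min_w ||y - A w||^2 / (2 n) + alpha * ||w||_1 .
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, p, k = 50, 200, 5                              # measurements, signal length, sparsity (assumed)
A = rng.standard_normal((n, p)) / np.sqrt(n)      # random Gaussian sensing matrix
x_true = np.zeros(p)
x_true[rng.choice(p, size=k, replace=False)] = rng.standard_normal(k)
y = A @ x_true                                    # noiseless measurements

lasso = Lasso(alpha=1e-3, fit_intercept=False, max_iter=50_000)
lasso.fit(A, y)
print("recovery error:", np.linalg.norm(lasso.coef_ - x_true))

# The talk concerns inaccurate inputs: perturb y by an arbitrarily small amount
# and re-solve. On typical random instances the answer barely moves, yet the
# speaker's result says no algorithm can be guaranteed, over all well-conditioned
# inputs, to return a solution correct to one significant figure.
y_eps = y + 1e-8 * rng.standard_normal(n)
lasso.fit(A, y_eps)
print("error after tiny perturbation:", np.linalg.norm(lasso.coef_ - x_true))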