BIOMEDICAL IMAGING GROUP (BIG)
Laboratoire d'imagerie biomédicale (LIB)

Student Projects

Deep neural networks: learning with splines

Spring 2017
Master Semester Project
Master Diploma
Project: 00326

A recent paper (Poggio et al., 2015) points out that deep neural networks with ReLU activation functions can be interpreted as hierarchical splines. The purpose of this project is to exploit this connection in order to gain further understanding and to improve the performance of such networks. Following a formal statement of the problem, the project will consist of an extensive (but informed) experimental comparison of different network configurations in order to determine the most promising one. The idea is to keep the number of parameters fixed (the total number of ReLUs and linear weights) and to investigate the effect of the architecture, in particular the number of layers, on the prediction error. This project is an excellent opportunity to gain a deep understanding of fundamental aspects of deep neural networks.
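
To make the spline connection concrete, the following minimal NumPy sketch (illustrative only, not part of the official project material; the layer widths and random weights are arbitrary assumptions) evaluates a small ReLU network on a fine 1-D grid and uses second finite differences to verify that the input-output map is continuous and piecewise linear, i.e., a linear spline whose knots are induced by the ReLUs.

import numpy as np

rng = np.random.default_rng(0)

def relu_net(x, layers):
    """Forward pass of a fully connected ReLU network on scalar inputs."""
    h = x.reshape(-1, 1)
    for W, b in layers[:-1]:
        h = np.maximum(h @ W + b, 0.0)  # ReLU activation
    W, b = layers[-1]
    return (h @ W + b).ravel()

# Hypothetical architecture: 1 -> 4 -> 4 -> 1, with random weights.
layers = [
    (rng.standard_normal((1, 4)), rng.standard_normal(4)),
    (rng.standard_normal((4, 4)), rng.standard_normal(4)),
    (rng.standard_normal((4, 1)), rng.standard_normal(1)),
]

# On each linear piece the second finite difference of the output is zero;
# it is nonzero only at grid points next to a slope change (a spline knot).
x = np.linspace(-3.0, 3.0, 2001)
y = relu_net(x, layers)
d2 = np.abs(np.diff(y, 2))
knot_points = x[1:-1][d2 > 1e-6]
print(f"slope changes near {knot_points.size} grid points: "
      "the ReLU network is a piecewise-linear (linear spline) map")

Counting how the number of such knots grows with depth, while the total parameter budget is held fixed, is one concrete way to set up the architecture comparison described above.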
  • Supervisors
  • Anaïs Badoual, anais.badoual@epfl.ch, 31136, BM 4.142
  • Michael Unser, michael.unser@epfl.ch, 021 693 51 75, BM 4.136
  • Shayan Aziznejad