BIOMEDICAL IMAGING GROUP (BIG)
Laboratoire d'imagerie biomédicale (LIB)

Student Projects


Learning of Continuous and Piecewise-Linear Functions by Multi-Resolution Algorithms

Available
Master Semester Project
Master Diploma
Project: 00438

ReLU neural networks generate continuous and piecewise-linear (CPWL) mappings. Alternative methods learn CPWL functions directly through local parameterizations in low-dimensional settings; there, a sparsity-promoting regularization is used to favor simpler functions. Such methods are interpretable, and their performance is comparable to that of ReLU networks, but their computational bottleneck is the optimization problem that must be solved during learning. In this project, we focus on learning CPWL functions in two dimensions. We parameterize the functions with box splines (piecewise-linear polynomials) on a grid, and we work with multiple grids of different sizes. We first solve the optimization problem on a coarse grid, then sample the solution on a finer grid and continue learning there to improve the result. Our goal is to reach a solution with the same loss faster than by solving directly on a single fine grid. The student should be familiar with the basics of optimization and with notions such as sparsity. In addition, Python programming is required.
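The coarse-to-fine strategy can be sketched in one dimension (a simplification: the project itself targets two-dimensional box splines and a sparsity-regularized objective; here, hat functions on a line and a plain least-squares data term stand in, and all function and parameter names are illustrative):

```python
import numpy as np

def hat_basis(x, knots):
    """Evaluate piecewise-linear (hat) basis functions at the points x.

    Returns a matrix H with H[i, j] equal to the triangle function
    centered at knots[j], so H @ c is a CPWL function that takes the
    values c at the knots.
    """
    h = knots[1] - knots[0]  # uniform grid spacing
    return np.maximum(0.0, 1.0 - np.abs(x[:, None] - knots[None, :]) / h)

def fit_cpwl(x, y, knots):
    """Least-squares fit of the knot values c so that H @ c ~ y."""
    H = hat_basis(x, knots)
    c, *_ = np.linalg.lstsq(H, y, rcond=None)
    return c

def coarse_to_fine(x, y, n_coarse=5, n_levels=3):
    """Multi-resolution fit: solve on a coarse grid, then sample the
    solution on a twice-finer grid and re-solve there, repeatedly.

    With a direct least-squares solver the coarse solve is redundant;
    in the actual project, an iterative sparsity-promoting solver
    would be warm-started from the interpolated coarse solution.
    """
    lo, hi = x.min(), x.max()
    n = n_coarse
    knots = np.linspace(lo, hi, n)
    c = fit_cpwl(x, y, knots)
    for _ in range(n_levels - 1):
        n = 2 * n - 1                        # dyadic refinement of the grid
        fine_knots = np.linspace(lo, hi, n)
        c = np.interp(fine_knots, knots, c)  # sample the coarse solution
        knots = fine_knots
        c = fit_cpwl(x, y, knots)            # refine on the finer grid
    return knots, c

# Toy usage: recover a piecewise-linear signal from noisy samples.
rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0.0, 1.0, 400))
y_true = np.abs(x - 0.5)                     # a simple CPWL target
y = y_true + 0.01 * rng.standard_normal(x.size)
knots, c = coarse_to_fine(x, y, n_coarse=5, n_levels=4)
pred = hat_basis(x, knots) @ c
rmse = np.sqrt(np.mean((pred - y_true) ** 2))
print(rmse)
```

The dyadic refinement `n -> 2n - 1` keeps every coarse knot on the finer grid, so the interpolated coarse solution reproduces the coarse fit exactly and only needs to be refined, which is what makes the warm start useful for an iterative solver.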

This paper is a helpful reference: DOI 10.1109/OJSP.2021.3136488

The photo is from the paper: https://onlinelibrary.wiley.com/doi/10.1111/j.1467-8659.2011.01853.x.

  • Supervisors
  • Mehrsa Pourya, mehrsa.pourya@epfl.ch
  • Michael Unser, michael.unser@epfl.ch
© 2023 EPFL, all rights reserved