Mesh simplification based on Hessian total variation
2022
Master Semester Project
Project: 00422
ReLU neural networks are powerful learning models with applications in many fields. It is known that they generate continuous and piecewise-linear (CPWL) mappings. However, their compositional structure prevents us from controlling the mapping they produce. In this project, we instead parametrize CPWL functions with an intuitive local representation that relies on a fixed triangulation of the input space. This allows us to enforce a sparsity constraint during the learning stage: among well-performing models, we favor those with the fewest affine pieces. However, current sparse solutions are still defined over the initial triangulation and are therefore computationally equivalent to the non-sparse ones. In this project, the student will investigate, in low dimensions (starting in 2D), different methods to simplify the supporting triangulation of a sparse CPWL function. The student should have good Python skills and some background in optimization.
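To make the local representation concrete, here is a minimal sketch (not the project's actual codebase) of a CPWL function defined by one value per vertex of a fixed 2D triangulation, evaluated by barycentric interpolation with SciPy. The vertex layout and values are illustrative assumptions; counting distinct per-triangle affine maps shows the redundancy that mesh simplification would remove.

```python
import numpy as np
from scipy.spatial import Delaunay

# Illustrative example: a small 2D triangulation and one value per vertex.
# The vertex values here happen to sample f(x, y) = x + y, so the CPWL
# function is globally affine and the triangulation is maximally redundant.
pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [0.5, 0.5]])
vals = np.array([0.0, 1.0, 1.0, 2.0, 1.0])

tri = Delaunay(pts)

def cpwl(x):
    """Evaluate the CPWL function at x via barycentric interpolation."""
    x = np.asarray(x, dtype=float)
    s = int(tri.find_simplex(x))
    if s < 0:
        raise ValueError("point outside the triangulation")
    # scipy stores, per simplex, the affine map to barycentric coordinates.
    T = tri.transform[s]
    b = T[:2] @ (x - T[2])
    bary = np.append(b, 1.0 - b.sum())
    return float(bary @ vals[tri.simplices[s]])

def affine_pieces():
    """Distinct affine maps (a, b, c) with f = a*x + b*y + c per triangle.
    Triangles sharing the same map belong to one affine piece, so the
    supporting mesh could be simplified by merging them."""
    pieces = set()
    for simplex in tri.simplices:
        A = np.column_stack([pts[simplex], np.ones(3)])
        coef = np.linalg.solve(A, vals[simplex])  # plane through 3 vertices
        pieces.add(tuple(np.round(coef, 10)))
    return pieces

print(len(tri.simplices), "triangles,", len(affine_pieces()), "affine piece(s)")
```

With these values every triangle carries the same affine map, so the four triangles form a single affine piece: a toy instance of the redundancy that the simplification methods studied in this project aim to eliminate.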
- Supervisors
- Alexis Goujon, alexis.goujon@epfl.ch, BM 4.139
- Mehrsa Pourya, mehrsa.pourya@epfl.ch, BM 4.139
- Michael Unser, michael.unser@epfl.ch, 021 693 51 75, BM 4.136