High-Dimensional Gaussian Sampling: Cholesky versus Markov
Available
Master Semester Project
Master Diploma
Project: 00468

Monte Carlo methods are indispensable for modern machine learning, and high-dimensional Gaussian sampling is one of their most frequent and computationally demanding sub-tasks. This project investigates the efficiency and trade-offs of three primary paradigms for generating these samples: exact numerical linear algebra based on Cholesky factorization, iterative Markov chain Monte Carlo (MCMC) methods, and Perturb-and-MAP approaches. While Cholesky-based solvers provide exact samples, they become computationally prohibitive as the dimension grows, forcing a shift toward MCMC algorithms that trade exactness for scalability. We also explore Perturb-and-MAP techniques, which recast the sampling problem as a faster optimization task by injecting noise into the model's potentials. By implementing and comparing these competing strategies, we aim to determine the best balance between computational speed and statistical accuracy for large-scale inverse problems.
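
To make the trade-off concrete, here is a minimal sketch (not project code) contrasting the first two paradigms on a toy Gaussian N(mu, Q^{-1}) parameterized by a precision matrix Q: an exact Cholesky-based draw with O(d^3) factorization cost, and a componentwise Gibbs sampler whose sweeps are cheap but only approach the target distribution as the chain mixes. The matrix construction, dimension, and function names are illustrative assumptions, not part of the project description.

```python
# Illustrative sketch: exact Cholesky sampling vs. a componentwise Gibbs sweep
# for the same Gaussian N(mu, Q^{-1}). All sizes and names are assumptions.
import numpy as np

rng = np.random.default_rng(0)

d = 500                                   # dimension (illustrative)
A = rng.standard_normal((d, d)) / np.sqrt(d)
Q = A.T @ A + np.eye(d)                   # symmetric positive-definite precision
mu = np.zeros(d)

# --- Exact sampling via Cholesky: O(d^3) factorization, exact draws. ---
L = np.linalg.cholesky(Q)                 # Q = L L^T
z = rng.standard_normal(d)
# Solving L^T x = z gives x ~ N(0, Q^{-1}); shift by the mean afterwards.
x_exact = mu + np.linalg.solve(L.T, z)    # a triangular solve in practice

# --- Iterative sampling via componentwise Gibbs: cheap per sweep, ---
# --- but exact only in the limit of many sweeps.                  ---
def gibbs_sweeps(Q, mu, n_sweeps, x0=None, rng=rng):
    d = len(mu)
    x = np.zeros(d) if x0 is None else x0.copy()
    for _ in range(n_sweeps):
        for i in range(d):
            # Full conditional of x_i given the rest: variance 1/Q_ii and
            # mean mu_i - (sum_{j != i} Q_ij (x_j - mu_j)) / Q_ii.
            r = Q[i] @ (x - mu) - Q[i, i] * (x[i] - mu[i])
            cond_mean = mu[i] - r / Q[i, i]
            cond_std = 1.0 / np.sqrt(Q[i, i])
            x[i] = cond_mean + cond_std * rng.standard_normal()
    return x

x_mcmc = gibbs_sweeps(Q, mu, n_sweeps=50)
```

The Cholesky draw is exact but its cost and memory scale cubically and quadratically in d, whereas each Gibbs sweep costs O(d^2) (less for sparse Q) at the price of a bias that decays only as the chain converges; quantifying this trade-off is exactly what the project studies.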
- Supervisors
  - Martin Zach, martin.zach@epfl.ch, BM 4 140