Holger Rauhut is W3 Professor of Mathematics at Ludwig-Maximilians-Universität München (LMU Munich), where he holds the Chair of Mathematics of Information Processing in the Department of Mathematics within the Faculty of Mathematics, Informatics and Statistics; he also serves as Second Deputy Director of the department. His research focuses on the mathematical foundations of artificial intelligence and machine learning, including convergence theory and implicit regularization for training algorithms, deep learning methods for inverse problems, and uncertainty quantification for deep learning. A further focus is compressive sensing: reconstructing signals from few measurements of an underdetermined linear system via ℓ1 minimization, using provably optimal random measurement matrices grounded in high-dimensional probability theory. With additional expertise in applied harmonic analysis, signal and image processing, and data science, Rauhut has garnered over 13,000 citations, establishing him as a leading figure in these interdisciplinary fields.
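The compressive sensing setup mentioned above can be illustrated with a minimal sketch: recovering a sparse signal from far fewer random measurements than its ambient dimension. This is a generic textbook-style example, not code from Rauhut's work; it uses iterative soft-thresholding (ISTA) as a simple stand-in for ℓ1 minimization, and all dimensions and parameters below are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions: n-dimensional signal, m < n measurements, s nonzeros
n, m, s = 100, 40, 3

# Ground-truth s-sparse signal
x_true = np.zeros(n)
support = rng.choice(n, size=s, replace=False)
x_true[support] = rng.standard_normal(s)

# Gaussian random measurement matrix (a standard provably good choice)
A = rng.standard_normal((m, n)) / np.sqrt(m)
y = A @ x_true  # the m measurements of the underdetermined system

# ISTA: iterative soft-thresholding, a basic solver for the
# l1-regularized least-squares (LASSO) surrogate of l1 minimization
lam = 0.01                               # regularization weight (illustrative)
step = 1.0 / np.linalg.norm(A, 2) ** 2   # step size from the spectral norm
x = np.zeros(n)
for _ in range(5000):
    grad = A.T @ (A @ x - y)             # gradient of the data-fit term
    x = x - step * grad                  # gradient step
    # soft-thresholding promotes sparsity
    x = np.sign(x) * np.maximum(np.abs(x) - step * lam, 0.0)

print("reconstruction error:", np.linalg.norm(x - x_true))
```

With these dimensions the recovery error is small despite the system `A x = y` having infinitely many solutions, which is the phenomenon the theory of random measurement matrices makes rigorous.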
Rauhut completed his Diploma in Mathematics at the Technical University of Munich in 2001 and earned a doctorate there in 2004 under Prof. Dr. Rupert Lasser with the thesis 'Time-Frequency and Wavelet Analysis of Functions with Symmetry Properties.' He completed his habilitation in 2008 at the University of Vienna with the thesis 'Sparse Recovery.' His academic career includes postdoctoral positions at the University of Wroclaw in 2005 and at the University of Vienna from 2005 to 2008, a W2 professorship as Bonn Junior Fellow at the Hausdorff Center for Mathematics, University of Bonn, from 2008 to 2013, and a W3 professorship at RWTH Aachen University from 2013 to 2023, where he headed the Chair of Mathematics of Information Processing and served as spokesperson for the DFG Collaborative Research Center SFB 1481 'Sparsity and Singular Structures' from 2022 to 2023. Since August 2023, he has been at LMU Munich. Key honors include a European Research Council Starting Grant, 'Sparse and Low Rank Recovery,' awarded in 2010. Representative publications include 'Uncertainty Quantification for Sparse Fourier Recovery' (2023), 'Convergence of Gradient Descent for Learning Linear Neural Networks' (2024), 'Non-Asymptotic Uncertainty Quantification in High-Dimensional Learning' (NeurIPS 2024), and works on compressive covariance sampling and structured compressive sensing via neural networks.
