Fred Roosta is a Professor in the School of Mathematics and Physics at the University of Queensland. He earned his PhD from the University of British Columbia (2010–2015) under the supervision of Professor Uri Ascher, then held a postdoctoral fellowship at the University of California, Berkeley (2015–2017) under Professor Michael Mahoney. He joined the University of Queensland as a Lecturer in 2017 and was promoted through Senior Lecturer and Associate Professor to his current position as Professor by 2024.

His research specializations include machine learning, numerical optimization, randomized algorithms, computational statistics, numerical analysis, and scientific computing. His broader fields of research span applied mathematics, information and computing sciences, mathematical sciences, numerical and computational mathematics, operations research, optimisation, statistical theory, statistics, and theory of computation.
Roosta was awarded an Australian Research Council Discovery Early Career Researcher Award for "Efficient Second-Order Optimisation Algorithms for Learning from Big Data" (2018–2024). Additional grants include "Next Generation Newton-type Methods with Minimum Residual Solver" (ARC Discovery Projects, 2025–2028), the ARC Training Centre for Information Resilience (2021–2026), and "CropVision: A next-generation system for predicting crop production" (ARC Linkage Projects, 2021–2025).

His key publications include "Robust and interpretable prediction of gene markers and cell types from spatial transcriptomics data" (Nature Communications, 2026), "Obtaining Pseudoinverse Solutions with MINRES" (SIAM Journal on Matrix Analysis and Applications, 2025), "Complexity guarantees for nonconvex Newton-MR under inexact Hessian information" (IMA Journal of Numerical Analysis, 2025), "DINGO: Distributed Newton-type method for gradient-norm optimization" (Advances in Neural Information Processing Systems, 2019), "Newton-type methods for non-convex optimization under inexact Hessian information" (Mathematical Programming, 2020), and "Implicit Langevin algorithms for sampling from log-concave densities" (Journal of Machine Learning Research, 2021). With over 2,500 citations on Google Scholar, his work has had a significant influence on optimization and machine learning.

Roosta supervises numerous PhD students, as principal or associate advisor, on projects in stochastic optimization, non-convex optimization, interpretable AI, and scientific machine learning. Completed theses under his supervision include work on Newton-MR methods and on second-order optimization for machine learning.