
Manfred Warmuth is a Professor Emeritus in the Department of Computer Science and Engineering at the University of California, Santa Cruz, part of the Baskin School of Engineering. His research interests include online learning, machine learning, large language models, statistical decision theory, game theory, and the analysis of algorithms, with a primary focus on the development of machine learning algorithms. He earned his Ph.D. in 1981 from the University of Colorado at Boulder with a dissertation titled "Scheduling on Profiles of Constant Breadth," supervised by Hal Gabow.
Warmuth is a founding member of the computational learning theory (COLT) community and a pioneer in the field. He developed key online learning algorithms such as the Weighted Majority algorithm and the Exponentiated Gradient algorithm. His work includes deriving online algorithms from Bregman divergences, making them robust to changing data, contrasting additive and multiplicative updates, and extending multiplicative updates to density matrices via a Bayesian probability calculus for matrices. His highly influential publications include "The weighted majority algorithm" (1994), "Learnability and the Vapnik-Chervonenkis dimension" (1989), "Occam's razor" (1987), "Exponentiated gradient versus gradient descent for linear predictors" (1997), "How to use expert advice" (1997), and "Tracking the best expert" (1998); these papers have garnered thousands of citations and shaped modern machine learning.

In 2021, Warmuth was elected to the German National Academy of Sciences Leopoldina. Now a Senior Researcher at Google Brain, he continues to influence theoretical machine learning, including by posing open problems such as the sample compression conjecture.
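To make the flavor of this line of work concrete, the following is a minimal sketch of the Weighted Majority idea in its standard textbook form: a learner predicts by weighted vote over experts and multiplicatively shrinks the weight of every expert that errs. The function name, the penalty parameter `beta`, and the example data are illustrative choices, not taken from the original 1994 paper.

```python
import numpy as np

def weighted_majority(expert_predictions, outcomes, beta=0.5):
    """Sketch of the Weighted Majority scheme (after Littlestone & Warmuth).

    expert_predictions: (T, n) array of 0/1 predictions from n experts.
    outcomes: length-T array of true 0/1 labels.
    beta: multiplicative penalty in (0, 1) for experts that err.
    Returns the learner's mistake count and the final expert weights.
    """
    T, n = expert_predictions.shape
    weights = np.ones(n)
    mistakes = 0
    for t in range(T):
        preds = expert_predictions[t]
        # Predict by weighted vote: side with the larger total weight.
        vote_for_1 = weights[preds == 1].sum()
        vote_for_0 = weights[preds == 0].sum()
        prediction = 1 if vote_for_1 >= vote_for_0 else 0
        if prediction != outcomes[t]:
            mistakes += 1
        # Multiplicatively penalize every expert that was wrong,
        # regardless of whether the learner itself erred.
        weights[preds != outcomes[t]] *= beta
    return mistakes, weights
```

The multiplicative update is what yields the hallmark guarantee: the learner's mistakes are bounded in terms of the best single expert's mistakes plus a term logarithmic in the number of experts, rather than linear in it.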
