Dr. Xingyu Zhao is an Associate Professor in Safety-Critical AI Systems at Warwick Manufacturing Group (WMG), University of Warwick, and also serves as an honorary lecturer in the Department of Computer Science. He earned his PhD in Computer Science from City, University of London in 2016, with a thesis on software reliability. Before joining WMG in June 2023, he was a Lecturer in the Department of Computer Science at the University of Liverpool from January 2021 to June 2023, and prior to that a Research Associate in the School of Engineering and Physical Sciences at Heriot-Watt University from May 2018 to January 2021. His career reflects a sustained commitment to advancing reliability and safety in software-intensive systems.
Zhao's research specializes in safety assurance for learning-enabled systems, particularly autonomous robots and vehicles. His interests span probabilistic model checking, Bayesian inference for verification, robustness evaluation of deep learning models, and runtime monitoring for AI safety. Key publications include 'Bayesian learning for the robust verification of autonomous robots' (2024, cited 16 times), 'Hierarchical Distribution-Aware Testing of Deep Learning' (2023, cited 19 times), 'Adversarial Training for Probabilistic Robustness' (2025), 'Hyper-CycleGAN: A New Adversarial Neural Network' (2025), and 'A case study of STPA using ChatGPT' (2025, cited 42 times). In January 2025, he received the prestigious EPSRC New Investigator Award for the project 'Harnessing Synthetic Data Fidelity for Assured Perception of Autonomous Vehicles'. With over 2,800 citations on Google Scholar and expertise in formal verification, reliability engineering, and software testing, Zhao is a significant voice in the field of safe AI, contributing to dependable autonomous systems through innovative approaches to testing and verification.
