Model-based assurance of AI for safety-critical applications
About the Project
The automotive industry is undergoing a radical transformation driven by the pursuit of highly automated driving (HAD), promising enhanced safety and accessibility. AI plays a crucial role in realising this vision, enabling vehicles to perceive and react to complex scenarios. Current techniques predominantly rely on deep learning, with a trend towards end-to-end learning that combines multi-modal perception, decision-making and control. While these techniques have led to remarkable progress, significant limitations remain. Current models are data-hungry, and their performance often plateaus, suggesting that more data alone is insufficient. Crucially, the black-box nature of deep learning introduces safety concerns, as behaviour cannot be verified across all operating conditions. This lack of transparency, together with the inability to formally verify AI systems, casts doubt on whether current methodologies can achieve the robustness required for Level 4 autonomy.
Recent work has developed the concept of Structural Causal World Models (SCWMs), which provide a unified representation of both system behaviour and operational context. These models combine a symbolic specification of domain properties, causal semantics, and probabilistic uncertainty. They formally capture aspects of the system relevant to safe operation and form the basis for safety requirements, test case generation, and runtime monitors.
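To illustrate the general idea (not the project's actual formalism), a structural causal model can be sketched as a set of structural equations evaluated in causal order, with exogenous randomness supplying the probabilistic uncertainty and a symbolic predicate expressing a safety-relevant domain property. All variable names, distributions and thresholds below are hypothetical, purely for illustration:

```python
import random

# Minimal structural causal model sketch: each variable is defined by a
# structural equation over its causal parents plus exogenous noise.
# The variables (rain, visibility, braking_distance) are illustrative.

def sample_world(rng):
    """Sample one world state by evaluating structural equations in causal order."""
    rain = rng.random() < 0.2                        # exogenous weather condition
    visibility = (rng.uniform(50, 100) if not rain   # visibility depends causally
                  else rng.uniform(10, 60))          # on rain (metres)
    # braking distance grows when the road is wet (causal effect of rain)
    braking_distance = rng.uniform(20, 30) * (1.5 if rain else 1.0)
    return {"rain": rain, "visibility": visibility,
            "braking_distance": braking_distance}

def is_hazardous(world):
    """Symbolic domain property: hazard if stopping needs more room than can be seen."""
    return world["braking_distance"] > world["visibility"]

rng = random.Random(0)
worlds = [sample_world(rng) for _ in range(10_000)]
hazard_rate = sum(map(is_hazardous, worlds)) / len(worlds)
print(f"estimated hazard probability: {hazard_rate:.3f}")
```

Because the model is causal rather than purely statistical, interventions (e.g. fixing `rain = True`) can be expressed by overriding a structural equation, which is what makes such models useful for reasoning about safety requirements.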
The primary aim of this PhD project is to develop model-based approaches that support the verification and runtime monitoring of AI-based autonomous systems in complex environments. The work is inspired by the principles of System 1 and System 2 thinking. The opaque, data-based machine learning models represent the fast, intuitive System 1. To compensate for residual errors, a reasoning-based approach based on an auditable base of knowledge is required to act as a System 2, validating System 1 actions and reacting to situations outside the scope of learned behaviour.
This thesis will explore the hypothesis that structural causal world models can generate runtime monitors that implement System 2 thinking in a formally verifiable manner. Central to the approach is the development of SCWMs based on automated driving case studies to fulfil two main assurance objectives.
First, test cases can be generated by examining the relationship between the SCWM and hazardous scenarios, focusing on the conditions most likely to reveal dangerous behaviour. This includes analytical approaches to identifying rare but critical corner cases. Using probabilistic specifications, test results can then be used to extrapolate the residual probability of hazardous failures and to provide insights into how the AI-based HAD system can be improved.
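A hedged sketch of this idea under simplified assumptions: sample scenario parameters from a scenario model, keep the samples closest to the hazard boundary as candidate corner cases, and use the pass/fail ratio as a crude residual-risk estimate. The parameter names, the constant-deceleration model and all thresholds are illustrative assumptions, not the project's method:

```python
import random

def sample_scenario(rng):
    """Draw one scenario's parameters (illustrative ranges)."""
    return {
        "ego_speed": rng.uniform(5.0, 35.0),       # m/s
        "pedestrian_gap": rng.uniform(5.0, 80.0),  # metres to a crossing pedestrian
    }

def hazard_margin(s):
    """Signed margin: stopping distance minus available gap (>0 means hazard)."""
    stopping = s["ego_speed"] ** 2 / (2 * 4.0)     # assumed 4 m/s^2 deceleration
    return stopping - s["pedestrian_gap"]

rng = random.Random(1)
pool = [sample_scenario(rng) for _ in range(5_000)]

# Corner cases: scenarios closest to the hazard boundary (margin near zero),
# i.e. the conditions most likely to reveal dangerous behaviour under test.
corner_cases = sorted(pool, key=lambda s: abs(hazard_margin(s)))[:20]

# Crude residual-risk extrapolation from the sampled outcomes.
failures = sum(hazard_margin(s) > 0 for s in pool)
residual = failures / len(pool)
print(f"{len(corner_cases)} corner cases, residual hazard estimate {residual:.3f}")
```

In practice the SCWM would replace the toy `hazard_margin` oracle, and the probabilistic specification would weight scenarios by their real-world likelihood rather than treating all samples equally.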
Second, SCWMs will be used for runtime monitoring and model adaptation. Monitors synthesised from the SCWMs will detect critical situations, or detect when the system exits its assured operating conditions. Identifying inconsistencies between the environmental conditions and system behaviour predicted at design time and those observed in operation indicates the presence of unknown unknowns. These monitors will also identify gaps in the SCWMs themselves, allowing the models to be improved based on operational data and forming the basis for verifiable improvements to the AI system.
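As a sketch of what such a synthesised monitor could look like, the assured operating conditions can be encoded as signal bounds checked at runtime, alongside a consistency check between the model's design-time prediction and the observed behaviour. The signal names, bounds and tolerance below are illustrative assumptions, not an actual operational design domain:

```python
# Illustrative operating-domain specification (design-time assumptions).
ASSURED_DOMAIN = {
    "visibility_m": (40.0, float("inf")),
    "speed_mps": (0.0, 30.0),
}

def monitor(state, predicted_speed, tolerance=2.0):
    """Return a list of violations for one observed state."""
    violations = []
    # Check that each monitored signal stays inside its assured range.
    for signal, (lo, hi) in ASSURED_DOMAIN.items():
        if not (lo <= state[signal] <= hi):
            violations.append(f"{signal} outside assured range [{lo}, {hi}]")
    # Flag inconsistency between design-time prediction and observed behaviour,
    # a possible indicator of an unknown unknown.
    if abs(state["speed_mps"] - predicted_speed) > tolerance:
        violations.append("behaviour deviates from model prediction")
    return violations

ok = monitor({"visibility_m": 80.0, "speed_mps": 12.0}, predicted_speed=12.5)
bad = monitor({"visibility_m": 25.0, "speed_mps": 29.0}, predicted_speed=20.0)
print(ok, bad)
```

Logged violations of this kind are exactly the operational data that could feed back into refining the SCWM.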
Zou, J., et al. (2026). Structural causal world models for safety assurance of AI-based autonomy. Proceedings of the 41st ACM/SIGAPP Symposium on Applied Computing.
Training and support
You'll receive training and guidance in research, writing and presenting skills to support your development during your PhD. You'll also cover topics such as employability skills, research management and leadership, and graduate teaching assistant training.
In addition, York Graduate Research School works alongside the Department to offer high-quality training, peer-to-peer support, professional development advice, and opportunities to engage others with your research.
Location
Become part of our vibrant community and contribute to inspirational and life-changing research.
You will be based in the Department of Computer Science at the University of York - an exciting and welcoming hub for innovation and collaboration with a modern and inclusive working environment. In our lakeside home on Campus East, you'll benefit from world-class laboratories and collaboration spaces.
The University of York is located a short distance from York city centre. Our historic city is consistently voted as one of the friendliest, safest and best places to live in the UK.
How to apply
Please submit your application online.
Please quote the project title Model-based assurance of AI for safety-critical applications in your application.
Applications close on 30 June 2026 and early application is recommended. If we are impressed by your application, we will invite you to an interview and a decision will be made shortly afterwards.
Entry requirements
- This funded PhD opportunity is open to individuals eligible to pay tuition fees at the UK (Home) rate.
- You should hold, or expect to achieve, at least the equivalent of a UK upper second class degree in a relevant discipline.
- We are willing to consider your application if you do not fit this profile, provided you can demonstrate sufficient computer science knowledge and experience to succeed on the programme.
- We're sorry, but on this occasion this opportunity is not available to non-UK students, students from outside the EU, or individuals who wish to study via distance learning.
Contact us
If you have any questions about this opportunity, please email simon.burton@york.ac.uk
Funding Notes
The project is funded by the Bosch Research Foundation, based in Stuttgart, Germany. The successful PhD candidate will have the opportunity to travel to Stuttgart to present their work and network with other PhD students at an annual symposium. Funding for this PhD is dependent upon final approval by the funder once a candidate has been selected.
Additional Funding Notes:
- You will be paid an annual living allowance at the standard RCUK stipend rate plus a generous enhancement.
- The living allowance will be paid to you in regular instalments, and usually increases each year in line with inflation.
- The studentship will cover postgraduate research fees for the duration of the PhD programme.
- A generous Research Training Support Grant will be provided to support your research-related activities.