Embodied Multisensory Navigation for Small Aerial Robots
About the Project
Small aerial robots can reach places that are difficult or unsafe for humans, but reliable navigation remains challenging when GPS is unavailable and vision is degraded by low light, textureless surfaces, dust, smoke, or clutter. This PhD project will develop a micro aerial robot that navigates robustly using bio-inspired, multimodal sensing (e.g., optical flow) to perceive nearby obstacles, motion, and free space across diverse environments.
The research will investigate how complementary sensing modalities can be integrated into a unified perception-and-control stack under stringent size, weight, power, and compute constraints. Key objectives include: (i) designing lightweight sensing hardware and integration strategies for micro platforms; (ii) developing real-time algorithms for motion estimation and obstacle detection; (iii) creating multimodal fusion methods that adapt sensor weighting based on context and uncertainty (e.g., vision-dominant in well-lit, textured scenes); and (iv) demonstrating closed-loop navigation behaviours such as wall-following, corridor traversal, obstacle or gap detection, and safe landing/perching.
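To illustrate objective (iii), the sketch below shows one simple, hypothetical form of uncertainty-aware fusion: inverse-variance weighting, in which each modality reports an estimate together with its variance, and noisier sensors automatically contribute less to the fused result. This is a minimal illustrative example, not the method the project will necessarily adopt; the sensor readings and variances are invented for the example.

```python
def fuse(estimates, variances):
    """Inverse-variance weighted average of independent estimates.

    Each modality contributes in proportion to 1/variance, so a
    degraded (high-variance) sensor is automatically down-weighted.
    """
    weights = [1.0 / v for v in variances]
    total = sum(weights)
    fused = sum(w * x for w, x in zip(weights, estimates)) / total
    fused_variance = 1.0 / total  # fused estimate is more certain than either input
    return fused, fused_variance

# Hypothetical example: vision estimates 2.0 m range with low noise,
# while a second, degraded modality reads 2.6 m with high noise.
# The fused estimate stays close to the more reliable vision reading.
value, var = fuse([2.0, 2.6], [0.01, 0.25])
```

In a full system this scalar rule generalises to a Kalman-style update over the robot's state, with the per-sensor variances supplied by context-dependent noise models.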
The approach will combine modelling and learning: physics-informed signal processing for flow features, probabilistic state estimation for uncertainty-aware fusion, and learning-based policies where appropriate to handle complex interactions. Experimental validation will include systematic benchmarking in indoor and semi-structured settings (e.g., corridors, cluttered rooms, vegetation-like obstacles) and tests under degraded visual conditions.
Funding Notes
3.5-year University Scholarship: minimum tax-free stipend at the current UKRI rate (for 2025/26, the standard stipend is £20,780), RTSG of £7,000, and full Home tuition fee covered.