
Autonomous Close-Proximity Spacecraft Operations: Hardware-in-the-Loop Validation of Neuromorphic Event Cameras

Applications Close:

University of Southampton

University Rd, Southampton SO17 1BJ, UK


About the Project

Supervisory Team: Dr Alexander Wittig, Dr Sergio Araujo-Estrada and Dr Jorn Cheney

This project aims to overcome the challenges of testing spacecraft control algorithms on real hardware in Earth-based environments. Using our new Aerospace Robotics Control & Simulation (ARCS) facility, we will investigate how bio-inspired neuromorphic event cameras enable resilient, low-compute, autonomous spacecraft proximity operations, and we will experimentally validate the system under realistic space-like conditions using our KUKA KR10 robot arm.

Autonomous close-proximity operations (CPO) between spacecraft are critical for space situational awareness (SSA), inspection of uncooperative targets, and in-orbit servicing. However, spacecraft must operate in one of the most extreme environments encountered by autonomous systems:

  • severe lighting transitions
  • high dynamic range glare
  • radiation-constrained computation
  • limited control authority
  • communication-denied conditions

While robust relative navigation and control algorithms exist in the literature, most assume idealised sensing and high-performance computing. A major scientific and technological gap remains between theoretical guidance, navigation and control (GNC) frameworks and experimentally validated, radiation-compatible hardware implementations. In particular, conventional frame-based cameras require significant computational resources for continuous feature extraction and pose estimation.

This research focuses on:

  • Resilient 6-DOF relative navigation and control: You'll develop a nonlinear relative motion framework for an autonomous chaser spacecraft approaching a non-cooperative target. High-order Extended/Unscented Kalman Filters will fuse inertial sensing with event-based visual pose estimation for an end-to-end on-board state estimation pipeline. Adaptive trajectory planning and constraint-aware attitude control will be implemented to ensure collision avoidance under state uncertainty.
  • Neuromorphic perception integration: You'll integrate event cameras, which asynchronously detect pixel-level brightness changes, to enable high-speed, low-latency perception with dramatically reduced data rates. Algorithms for event-based feature tracking, pose estimation, and optical flow will be investigated and compared against conventional frame-based methods in terms of accuracy, power consumption, and computational load.
  • Hardware-in-the-loop (HIL) validation: You'll build a representative spacecraft test model - including event camera, onboard embedded processor, and power unit - as a “partial” physical twin. Using our robotic arm, you will develop a simulation of 6-DOF orbital motion, covering realistic space conditions such as eclipse transitions, glare, sensor dropout, and processor throttling. You'll quantify system resilience, graceful degradation, and recovery performance as part of the verification.

Entry requirements

You must have a UK first-class Master's degree, or its international equivalent, in engineering, mathematics or a related field. A degree in astronautics or robotics is an advantage.

Demonstrated in-depth knowledge of at least one of orbital mechanics, robotics, or control theory is essential.

Desirable skills:

  • expertise in embedded programming, electronics development (e.g. ESP32, ARM, Arduino), and event cameras
  • prior research experience

This position is part of the research project "Robotics and Autonomous Systems for Defence", funded by DSTL (the Defence Science and Technology Laboratory).

Candidates must be UK nationals and pass DSTL vetting.

This is an in-person project that requires physical presence in the lab; it cannot be done remotely or part-time.

Fees and funding

This position is fully funded. Tuition fees will be paid and you'll receive an enhanced stipend of £25,000 per year (about 25% above the EPSRC rate), with a 5% annual increase in subsequent years.

How to apply

Apply now

You need to:

  • choose programme type (Research), 2026/27, Faculty of Engineering and Physical Sciences
  • select Full time
  • search for programme PhD Engineering & the Environment (7175)
  • add the name of the supervisor in section 2 of the application

Applications should include:

  • your CV (résumé)
  • 2 academic references
  • degree transcripts and certificates to date
  • English language qualification (if applicable)
