
Human–Wearable–Robot Interaction: Enabling Seamless Control of Mobile Manipulators via Biosensing Interfaces


Aberdeen, United Kingdom


About the Project

These projects are open to students worldwide, but have no funding attached. Therefore, the successful applicant will be expected to fund tuition fees at the relevant level (home or international) and any applicable additional research costs. Please consider this before applying.

Human–wearable–robot interaction is emerging as a frontier field that bridges human sensing, artificial intelligence, and robotic control. This PhD project aims to develop an intelligent biosensing-based interface that enables seamless and intuitive interaction between humans and mobile robotic manipulators. By integrating wearable health monitoring technologies (such as surface electromyography, inertial motion tracking, and physiological sensors) with AI-driven control frameworks, this research will establish a natural and adaptive communication channel between humans and robots.

The project will focus on the design, development, and evaluation of a multimodal wearable sensing system capable of interpreting human motion intentions and physiological states in real time. These biosignals will be processed through machine learning models to control a TurtleBot3 mobile platform equipped with an OpenMANIPULATOR-X robotic arm, enabling responsive and context-aware human–robot collaboration. The system will also feature bidirectional feedback mechanisms, allowing users to receive haptic or visual cues from the robot, thus achieving co-adaptive and safe interaction.
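To make the pipeline concrete, here is a minimal illustrative sketch of the sensing-to-control loop described above: a window of surface-EMG samples is reduced to an amplitude feature, decoded into a discrete motion intent, and mapped to a velocity command in the style of a ROS geometry_msgs/Twist message. All thresholds, channel names, and the rule-based decoder are hypothetical placeholders; the actual project would replace the decoder with a trained machine-learning model and publish the command to the TurtleBot3 over ROS.

```python
# Illustrative sketch only: maps a window of sEMG samples to a discrete
# motion intent, then to a robot velocity command. Thresholds, channel
# names, and the rule-based decoder are hypothetical placeholders.
import math


def rms(window):
    """Root-mean-square amplitude of one sEMG channel window."""
    return math.sqrt(sum(x * x for x in window) / len(window))


def classify_intent(flexor_rms, extensor_rms, rest_threshold=0.05):
    """Toy rule-based intent decoder; a real system would use a trained
    machine-learning model rather than fixed thresholds."""
    if flexor_rms < rest_threshold and extensor_rms < rest_threshold:
        return "rest"
    return "forward" if flexor_rms >= extensor_rms else "backward"


def intent_to_velocity(intent, speed=0.1):
    """Map a discrete intent to a (linear, angular) velocity pair, in the
    style of a ROS geometry_msgs/Twist command for the mobile base."""
    return {"rest": (0.0, 0.0),
            "forward": (speed, 0.0),
            "backward": (-speed, 0.0)}[intent]


if __name__ == "__main__":
    flexor = [0.4, -0.5, 0.45, -0.38]      # synthetic sEMG samples
    extensor = [0.02, -0.01, 0.03, -0.02]  # near-rest activity
    intent = classify_intent(rms(flexor), rms(extensor))
    print(intent, intent_to_velocity(intent))
```

In a full system this loop would run in real time, with the haptic or visual feedback channel closing the interaction loop back to the wearer.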

A distinctive aspect of this research is the integration of wearable health devices with mobile robotic manipulators, bridging human physiological sensing and robotic actuation. Traditional robot control methods often rely on manual commands or pre-defined motion sequences, limiting flexibility and intuitiveness. In contrast, this project envisions a human-centred, learning-enabled robotic framework capable of adapting to the user’s intent, behaviour, and environment. Experimental validation will involve tasks such as assisted manipulation, rehabilitation exercises, and collaborative navigation.

Students undertaking this PhD will gain interdisciplinary expertise in wearable sensor design, biosignal processing, artificial intelligence, and robotic systems engineering. The laboratory is equipped with TurtleBot3 mobile robotic platforms, OpenMANIPULATOR-X robotic arms, and wearable health monitoring devices, providing full hardware and software support for embedded AI, signal processing, and real-time robotic experimentation.

The candidate will have opportunities to engage in international collaboration and research exchange with partner institutions such as University College London (UCL), Imperial College London, and the University of Oxford. These collaborations will provide access to advanced robotic laboratories, joint supervision, and cross-institutional research seminars, supporting the student’s development as an independent researcher in human–robot interaction and intelligent healthcare technologies.

The expected outcomes of this research include:

  • A working prototype that demonstrates seamless wearable–robot interaction.
  • Novel AI algorithms for intent recognition and adaptive robotic control.
  • Experimental validation of human–robot co-adaptation in dynamic environments.

This project aligns with the global trend toward human-centered robotics and digital healthcare innovation, contributing to the next generation of intelligent systems that seamlessly integrate technology with human capability.

Informal enquiries can be directed to Dr Zhenhua Yu (zhenhua.yu@abdn.ac.uk), who is happy to discuss the project further with interested candidates.

Decisions will be based on academic merit. The successful applicant should have, or expect to obtain, a UK Honours Degree at 2.1 (or equivalent) in robotics, mechanical engineering, electronic and electrical engineering, biomedical engineering, computer science, or a closely related discipline.

Candidates should have a strong interest in human–robot interaction, wearable sensing, and intelligent control systems.

Prior experience with Python, MATLAB, or C/C++, and familiarity with ROS (Robot Operating System) will be highly desirable. A good understanding of experimental research methods and data analysis is expected.

We encourage applications from all backgrounds and communities, and are committed to having a diverse, inclusive team.

Application Procedure:

Formal applications can be completed online: https://www.abdn.ac.uk/pgap/login.php.

You should apply for the Degree of Doctor of Philosophy in Computing Science to ensure your application is passed to the correct team for processing.

Please clearly note the name of the lead supervisor and the project title on the application form. If you do not include these details, your application may not be considered for the project.

Your application must include: A personal statement, an up-to-date copy of your academic CV, and clear copies of your educational certificates and transcripts.

Please note: you do not need to provide a research proposal with this application.

If you require any additional assistance in submitting your application, or have any queries about the application process, please don't hesitate to contact us at researchadmissions@abdn.ac.uk.

Funding Notes

This is a self-funding project open to students worldwide. Our typical start dates for this programme are February or October.

Fees for this programme can be found on the Finance and Funding page of the University of Aberdeen website.
