The Algorithmic Mirror: Identity, Perception and Trust in the Age of AI (Ref: CO/OB-SF2/2026)
About the Project
Algorithms are no longer just tools; they are lenses that shape how we see the world and ourselves. Every recommendation, ranking or personalised feed reflects back fragments of our behaviour, creating what can be thought of as an algorithmic mirror. This PhD explores how algorithmic curation influences self-perception, trust, and decision-making in digital environments.
You will examine how individuals interpret and respond to algorithmic systems, how these systems reinforce or distort identity, and what this means for digital trust and autonomy. Depending on your background, the research could involve developing prototype recommender or visualisation systems, analysing social media data, or designing interactive AI interfaces that reveal how algorithms represent users.
As AI systems increasingly mediate everyday choices, from news to relationships, understanding their influence is vital. Yet few users appreciate how these models encode biases and feedback loops built from their own data. This project contributes to creating more transparent, ethical, and human-centred AI, informing both technology design and digital literacy initiatives.
Your work will have real-world relevance to cyber security, human-computer interaction, and digital ethics, supporting society’s shift from passive algorithmic consumption to active awareness and trust.
You will join Loughborough University’s interdisciplinary cyber security research community, working at the intersection of technology, human-computer interaction, and design. Supervision will be led by Professor Oli Buckley (cyber security, digital trust, and human interaction with emerging tech), supported by a cross-disciplinary team with expertise in human factors, AI ethics, and narrative design. You will gain transferable skills in computational and qualitative research, public engagement, and critical analysis of emerging technologies.
Name of primary supervisor/CDT lead:
Prof. Oli Buckley o.buckley@lboro.ac.uk
https://www.lboro.ac.uk/departments/compsci/staff/oli-buckley
Entry requirements:
Applicants should have, or expect to achieve, at least a 2:1 bachelor’s degree (or equivalent international qualification) in computer science, design, psychology, media, or a related field. A relevant master’s degree or experience with UX research, AI systems, or data visualisation would be an advantage.
English language requirements:
Applicants must meet the minimum English language requirements. Further details are available on the international website (http://www.lboro.ac.uk/international/applicants/english/).
Bench fees required: No
Closing date of advert: 1st July 2026
Start date: April 2026, October 2026
Full-time/part-time availability: Full-time 3 years, Part-time 6 years
Fee band: UK £5,006, International £28,600
How to apply:
All applications should be made online. Under ‘Campus’, please select ‘Loughborough’ and select the ‘Programme’ as ‘School of Science/Computer Science’. Please quote the advertised reference number, ‘CO/OB-SF2/2026’, in your application.
To avoid delays in processing your application, please ensure that you submit a CV and the minimum supporting documents.
The following selection criteria will be used by academic schools to help them make a decision on your application. Please note that these criteria are used for both funded and self-funded projects.
Please note that applications for this project are considered on an ongoing basis once submitted and the project may be withdrawn prior to the application deadline if a suitable candidate is chosen for the project.
Project search terms:
artificial intelligence, cyber security, ethics, human computer interaction, machine learning, trust, digital identity, algorithms, bias, transparency