LogicLLMs: Empowering LLMs with Logical Reasoning

Aberdeen, United Kingdom

About the Project

These projects are open to students worldwide, but have no funding attached. Therefore, the successful applicant will be expected to fund tuition fees at the relevant level (home or international) and any applicable additional research costs. Please consider this before applying.

The proliferation of LLM-based tools such as ChatGPT, GitHub Copilot and Adobe Firefly across application domains including education, healthcare, manufacturing, transport and finance has been met with excitement and caution in equal measure, particularly regarding the accuracy, reliability and trustworthiness of these tools and their alignment with established ethical, legal and social principles. LLM-based tools are advanced artificial intelligence (AI) systems trained on vast amounts of data to understand and generate original work, such as articles, code, paintings, poems or videos. They are used in i) healthcare, for medical diagnosis by analysing patient data, medical histories and literature; ii) finance, for fraud detection by analysing transaction data to detect unusual patterns in real time; iii) marketing and advertising, to create personalised marketing content, social media posts and advertisements tailored to target audiences; and iv) education and research, for AI-powered tutoring, content creation, literature review and study design, among others.

However, debate continues over whether these tools i) have thoughtful, humanlike reasoning abilities [1], ii) can produce trustworthy outputs in specific contexts [2], or iii) understand humans well enough to complement and cooperate with them. LLMs rely primarily on memorisation and pattern matching rather than actual reasoning; hence, they cannot be trusted to perform well on "out of distribution" tasks (i.e., tasks that differ from those the models were trained on). This has implications for the accuracy and trustworthiness of their outputs in specific contexts. To address these issues, LLM-based tools need robust, domain-independent reasoning capabilities. For example, there is a need to build LLM-based tools that can i) infer contexts that are not explicitly stated in users' prompts but might affect the outputs produced, ii) initiate dialogue with users or other tools to understand users' motivations and the contexts relevant to prompts and outputs, and iii) put forward arguments both for and against their outputs, providing a balanced account of the different factors that can affect them.

The main objective of this project is to design and develop a novel LLM-based AI agent that can reason over users' prompts and generate outputs based on that reasoning. The agent will use explainable AI and computational argumentation techniques to i) explain its outputs, ii) suggest possible compromise or counter solutions, and iii) initiate a dialogue with users or other tools to better understand users' contexts and generate outputs accordingly. For example, the agent will check whether a user's prompt can be interpreted in different ways and create a response for each interpretation. Reasoning in this context is an umbrella term encompassing induction, abduction, analogy, common sense, and other 'rational' or systematic methods for solving problems and arriving at a well-supported stance; it is often a process composed of multiple steps of inference. The project will advance the state of the art in explainable and responsible AI, user-centred AI design and trust-building, as well as AI-supported mediation and conflict-resolution techniques.
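As a minimal illustration only (this is not part of the project specification, and the names and ambiguity handling are hypothetical and hard-coded for demonstration), the multi-interpretation, argument-backed responses described above could be sketched in Python as follows. A real agent would replace the toy `interpret` function with an LLM plus a computational argumentation framework:

```python
from dataclasses import dataclass, field

@dataclass
class Interpretation:
    """One possible reading of a user prompt, with arguments attached."""
    reading: str
    pros: list = field(default_factory=list)  # arguments supporting this reading
    cons: list = field(default_factory=list)  # arguments against this reading

def interpret(prompt: str) -> list:
    """Toy interpreter: an ambiguous word ('bank') yields multiple readings.
    A real system would infer candidate contexts with an LLM instead."""
    if "bank" in prompt:
        river_hint = "river" in prompt
        return [
            Interpretation("financial institution",
                           pros=["common reading in everyday usage"],
                           cons=["prompt mentions a river"] if river_hint else []),
            Interpretation("river bank",
                           pros=["prompt mentions a river"] if river_hint else [],
                           cons=["less common reading"]),
        ]
    return [Interpretation(prompt)]

def respond(prompt: str) -> list:
    """Produce one answer per interpretation, each carrying its supporting
    and opposing arguments -- a balanced account rather than a single guess."""
    return [(i.reading, i.pros, i.cons) for i in interpret(prompt)]
```

For an ambiguous prompt such as "I sat by the bank of the river", `respond` returns two candidate readings, each with arguments for and against, rather than silently committing to one.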

Informal enquiries can be made by contacting Dr G Ogunniye (g.ogunniye@abdn.ac.uk).

Decisions will be based on academic merit. The successful applicant should have, or expect to obtain, a UK Honours Degree at 2.1 (or equivalent) in Computing Science.

We encourage applications from all backgrounds and communities, and are committed to having a diverse, inclusive team.

Application Procedure:

Formal applications can be completed online: https://www.abdn.ac.uk/pgap/login.php.

You should apply for the Degree of Doctor of Philosophy in Computing Science to ensure your application is passed to the correct team for processing.

Please clearly note the name of the lead supervisor and the project title on the application form. If you do not include these details, your application may not be considered for the project.

Your application must include: A personal statement, an up-to-date copy of your academic CV, and clear copies of your educational certificates and transcripts.

Please note: you do not need to provide a research proposal with this application.

If you require any additional assistance in submitting your application or have any queries about the application process, please don't hesitate to contact us at researchadmissions@abdn.ac.uk.

Funding Notes

This is a self-funding project open to students worldwide. Our typical start dates for this programme are February or October.

Fees for this programme can be found on the University of Aberdeen's Finance and Funding pages (Finance and Funding | Study Here | The University of Aberdeen).
