We are seeking a full-time Postdoctoral Research Assistant to join Torr Vision Group at the Department of Engineering Science, central Oxford. The post is funded by EPSRC and is fixed-term to the 4th February 2026.
You will be investigating the safety and security implications of large language model (LLM) agents, particularly those capable of interacting with operating systems and external APIs. This project explores the emerging risk landscape posed by the deployment of autonomous LLMs that can execute tasks, manipulate system settings, and interface with web services—moving beyond mere text generation to real-world actions. This is an exciting opportunity to contribute to foundational work at the intersection of AI safety and autonomous system deployment.
You will be responsible for designing experiments to evaluate and stress-test LLM agents in system-level environments, studying adversarial threats and hijack scenarios, developing practical safeguards and certification strategies to ensure safe execution of agentic behaviour, and publishing high-impact research.
Candidates should possess a PhD (or be near completion) in Computer Science, AI, Security, or a related field. You will have a strong background in machine learning and/or computer security, and experience working with LLMs or agent-based systems.
Informal enquiries may be addressed to [email protected]
For more information about working at the Department, see www.eng.ox.ac.uk/about/work-with-us/
Only online applications received before midday on the 22nd August 2025 can be considered. You will be required to upload a covering letter/supporting statement, including a brief statement of research interests (describing how past experience and future plans fit with the advertised position), CV and the details of two referees as part of your online application.
The Department holds an Athena Swan Bronze award, highlighting its commitment to promoting women in Science, Engineering and Technology.