MPLS Researcher Conference: AI & Ethics
Join us for a one-day, in-person conference hosted by the Mathematical, Physical and Life Sciences (MPLS) Division, bringing together researchers, technicians, and research enablers to explore how artificial intelligence is shaping scientific inquiry—and the ethical questions that arise.
This event is a showcase for cutting-edge research across the MPLS Division, with opportunities for interdisciplinary exchange, networking, and collaboration.
Date: Thursday, 15 January 2026
Time: 09:30 – 16:30
Location: Mathematical Institute, University of Oxford
Agenda: Click here to see the agenda for the day.
Deadline for registration: 19 December.
Don’t miss the chance to hear from an exceptional line-up of speakers.
The conference will open with a welcome address from the Vice-Chancellor, Professor Irene Tracey, and the programme includes talks from leading experts:
- Professor Tom Stoneham – University of York & UKRI AI CDT in Safe AI Systems
- Dr Heloise Stevance – Schmidt AI in Science Fellow, University of Oxford
- Professor Gemma Derrick – AI, Ethics & Research Culture, University of Bristol
- Dr Caroline Green – AI & Ethics Institute, University of Oxford
Plus, engage with the roundtable discussion on the future of responsible AI in science, hosted by Professor Jim Naismith, Head of MPLS Division.
Speaker spotlight
Professor Tom Stoneham
Tom Stoneham is Professor of Philosophy and Ethics Lead for the UKRI Centre for Doctoral Training in Safe AI Systems at the University of York. He has been Head of Department three times and was inaugural Dean of the York Graduate Research School. He also convenes the MA in Applied Ethics and Governance of Data Privacy. Tom is a regular speaker and expert advisor in the UK and Europe on the social, political and environmental issues arising from the current trajectory of AI implementation. He has published on many areas, from early modern philosophy to dreaming and trauma. His current research focuses on non-perfectionist ethical theory.
Dr Heloise Stevance
Dr Heloise Stevance is an astrophysicist whose research bridges sky surveys, stellar explosions and artificial intelligence. As a Schmidt AI in Science Fellow, Heloise now designs and builds automated systems to discover these cosmic events in real time, such as the Virtual Research Assistant for the ATLAS Sky Survey. Awarded the 2024 Caroline Herschel Lectureship Prize for their early career contributions to the field, she is now focusing on creating the automated systems of tomorrow for the forthcoming Legacy Survey of Space and Time. Their applied AI practice emphasises a science-driven, rather than market-driven, approach to ensure a robust scientific legacy for the dataset created with the help of machine learning systems.
Professor Gemma Derrick
Professor Gemma Derrick is a meta-research scholar at the University of Bristol’s School of Education and the Centre for Higher Education Transformations. Her work focuses on research culture, researcher behaviour, peer review, and assessing societal impact, with influential analyses of the UK’s Research Excellence Framework and other national audits. She has published widely and advised funders internationally. Gemma co-leads initiatives such as HiddenREF and the Embedding Trust in Evaluation programme, and serves as a Visiting Professor at the University of Oslo.
https://research-information.bris.ac.uk/en/persons/gemma-derrick
[Photograph © Ian Wallman]
Dr Caroline Green
Dr Caroline Green is Director of Research and Head of Public Engagement at the AI & Ethics Institute, University of Oxford, where she leads the Institute’s Accelerator Fellowship Programme. Caroline’s research focuses on AI and human rights, specifically in the fields of health and social care. She holds an LLB (Hons) from the University of Edinburgh, an MSc in Human Rights from the LSE, an MA in Investigative Journalism from City University and a PhD in Gerontology from King’s College London. Caroline is rapidly becoming one of the UK’s leading voices on the responsible use of artificial intelligence in adult social care. Whilst acknowledging that AI can benefit humanity, she maintains the importance of ensuring the responsible development, use and roll-out of the technology across people’s lives. The University of Oxford has proudly shared a profile of her and her work in this field: “Dr Caroline Green is keeping social care human in the age of AI”.
www.oxford-aiethics.ox.ac.uk/caroline-emmer-de-albuquerque-green
Agenda

| Time | Activity | Venue |
| --- | --- | --- |
| 09.30–10.00 | Arrival and Coffee | Mezzanine |
| 10.00–10.10 | Vice-Chancellor, Professor Irene Tracey, CBE – Welcome | Plenary: Lecture Theatre |
| 10.10–10.30 | Professor Tom Stoneham, Professor of Philosophy, University of York & UKRI AI CDT in Safe AI Systems – AI & Ethics | Plenary: Lecture Theatre |
| 10.30–10.50 | Dr Heloise Stevance, Schmidt AI in Science Fellow, Astrophysics, University of Oxford – AI and Astrophysics | Plenary: Lecture Theatre |
| 10.50–11.20 | Coffee Break | Mezzanine |
| 11.20–12.30 | Lightning Talks by Researchers | Lecture rooms |
| 12.30–14.00 | Lunch, networking and posters | Mezzanine |
| 14.00–14.20 | Professor Gemma Derrick, Centre for Higher Education Transformations, University of Bristol – AI, Ethics & Research Culture | Plenary: Lecture Theatre |
| 14.20–15.05 | Parallel Sessions | Lecture theatre/seminar rooms |
| 15.05–15.30 | Coffee break | Mezzanine |
| 15.30–16.30 | Roundtable discussion on the future of responsible AI in science, hosted by Professor Jim Naismith, Head of MPLS Division | Plenary: Lecture Theatre |
Take a look at the themes and speaker topics to decide which lightning talk session you would like to attend.
Session 1: Theme - Data Science & Quantum Technologies I; Methods & Modelling
Session 2: Theme - Data Science & Quantum Technologies II; Governance, Safety & Assurance
Session 3: Theme - AI, Creativity & Human-Centred Computing
Session 4: Theme - Health & Biomedical AI
Session 5: Theme - Food Security, Biodiversity & Evolution
Session 6: Theme - Climate, Materials & Space
Please upload your presentation slides through this lightning talk slides submission form by Tuesday, 6th January.
Take a look at the speaker abstracts to decide which parallel session you would like to attend; you can choose on the day.
Poster Guidelines for Presenters
Format and Mounting
- Size & orientation: A0 portrait (841 mm × 1189 mm)
- Mounting: Each board will have 4 pins; you may bring sticky Velcro dots if preferred
Essential Content
- Title & authors: Include all contributor names and affiliations
- Problem or objective: What you studied and why it matters
- Approach: A brief, high-level explanation of methods or framework
- Key findings: Prioritise visualisation over dense text
- Takeaway message: What an attendee should remember
Readability and Design
- Font size: Large enough to read from ~1-2 metres (title ~100 pt; text ≥28 pt)
- Layout: Use clear sections and logical flow
- Colour: 2-3 colours maximum; ensure strong contrast for accessibility
- Graphics: Use high-resolution figures; avoid clutter
- White space: Leave ample margins so the poster is not visually dense
Communicating Across Disciplines
- Audience: Aim to make your poster accessible to non-specialists
- Clarity: Use concise, straightforward language
- Jargon: Minimise specialised terms and define them when necessary
- Relevance: Highlight the broader significance of your work for audiences outside your discipline
During the Poster Session
- Prepare a short 1-2 minute overview of your project
- Be ready to discuss your methods at different levels of technical detail
- Consider adding a QR code to share supplementary materials
People’s Choice Poster Award
A People’s Choice Poster Award will be determined via online voting, with posters available on the conference website in advance. Please ensure your poster is clear, engaging and accessible to readers who may review it ahead of the event.
Click here to download a PDF of the guidelines.
Please upload a PDF version of your poster via this poster submission form by Tuesday, 6th January.



© Ian Wallman