A patient’s eye view of the future of surgery: inside a community engagement science fair

The Science of Surgery is the UCL Hawkes Institute’s flagship community event, taking place at Charles Bell House in London. Our OASIS Hub researchers shared their cutting-edge work with over 500 visitors.

We invited Shelly Pomeroy, a Patient Advisory Group member, to act as roving reporter and share her experience of the day.

On Friday April 10th, I attended The Science of Surgery at Charles Bell House in London, one of the homes of the UCL Hawkes Institute. It was an event to showcase the work of researchers at the Institute. As I wandered around the stalls, enjoying the dynamic Science Fair atmosphere, children and adults alike were trying out surgical tools, learning about the latest advancements in computer software and completing the many hands-on illustrative challenges set up by the scientists and engineers. It didn’t take long for me to roll up my sleeves and get involved. 

The sheer levels of innovation and excitement rose by the hour like an approaching electrical storm. In amongst the crackle of brainpower, I noticed several themes emerging of how the future of surgery is starting to look.

Improving the tools

Child engaging with laparoscopic surgery activity

The surgical world currently works hard to prioritise minimally invasive procedures, with laparoscopic instruments being the most recognisable advancement in recent times. This approach relies on small, neat entry points; however, once inside the body, surgeons are often still using outdated manual tools such as graspers, spades and forceps, which can cause a disproportionate amount of complications and post-operative suffering. This seems unnecessary in 2026.

Several stalls were focused on this conundrum. One team is designing a tent-like structure that, once inflated inside the body, could gently push unaffected organs to one side, keeping them out of harm’s way and creating an area where surgical instruments can move and mend afflicted parts of the body without damaging perfectly healthy neighbours.

Along with mitigating collateral damage, several projects are focused on moving around inside the body with more accuracy. This can mean increasing dexterity by using robotics to add features like swivel actions and the ability to steer around corners, vastly improving on the aforementioned manual tools.

I visited multiple displays explaining how researchers are developing computer software that transmits real-time images and annotations, giving surgeons increasingly accurate, on-the-spot information not visible to the naked eye. Another was looking at removing barriers, in particular using software to reduce specular highlights on images, which might best be understood as 'anti-glare' technology.

A particularly interactive stall deepened my understanding of the challenges of depth and space perception in surgery. The researchers in question are working on AI technology helping to bridge the gap between what the surgeon can see and the action of the hands.

Researchers showcasing project

Exciting advancements are also happening outside of the operating theatre.

Improving scans and interpretation

Researcher showcasing project

At the moment, medical scans generally produce a 'still image' that will be examined and interpreted by a trained professional, and quite often more than one. This is time-consuming and subject to human interpretation.

I visited several stalls that showed advancements using Deep Learning and Augmented Reality to enhance CT, Ultrasound and MRI scans to the point where they can give real-time interpretation of their content: raising possible red flags, identifying what is healthy and what isn’t, reducing the risk of human bias, predicting the trajectories of things like tumours, and highlighting risks of recurrence. At present, all of this can take several years to play out as a series of conversations and explorations involving patients and their medical team, and so opportunities for early intervention are all too often missed.

My own diagnosis and treatment experience bears testament to this current state of play. It was wonderful to see something that could so directly improve the patient experience.

Another stall illustrated the difficulty of manually identifying embryo developmental stages from 2D morphology images, highlighting the motivation for using AI models to assist embryo selection for IVF. This could potentially reduce the current rollercoaster of multiple rounds of IVF, making the whole process more effective.

One particularly innovative display showed a breathing sensor that counts how many times you breathe per minute - currently a job done by a rather inaccurate finger clip (if you've been in hospital, you'll have met this little gadget all too often). By comparison, this sensor is a tiny, single fibre that counts breaths continuously whilst simultaneously talking to the PC. It is bio-derived - made from textiles and sustainable to boot!

Researcher showcasing breathing sensor

Using Virtual and Augmented Reality

The burgeoning use of Virtual Reality is also very exciting for the world of surgery and surgical training. I visited stalls where simulation software was on display, helping trainee surgeons further develop their skills before operating on patients.

On our tour of the basement laboratories I had my first ever go with a VR headset. It helped to enhance an ultrasound image and definitively showed how technology can sit comfortably alongside human experience and expertise.

Child engaging with obstetric ultrasound activity

I found it to be a fascinating and utterly absorbing day. I left feeling much more knowledgeable and optimistic about big concepts such as Artificial Intelligence and the future of Surgery, having seen it in action.

Congratulations to everyone involved!