AR system to engage patients with interactive models of themselves
Image credit: Birmingham City University
Researchers at Birmingham City University’s Digital Media Technology Lab have developed an augmented reality system that lets clinicians interact with virtual models of body parts and patient data.
The system displays organs, bones and other body parts, and allows interaction with a simple swipe of the hand.
“We are developing this system as a platform to allow medical professionals to interact with genuine patient data and manipulate it by hand to educate and inform patients,” said Professor Ian Williams, subject lead for image and video technology in the Digital Media Technology Lab.
“The real advantages this brings are being able to visually demonstrate parts of the anatomy, using virtual models which can be customised for each patient and show how they have been impacted by lifestyle choices, or how they may be changed following treatments or surgery.”
The system allows users to navigate and manipulate 3D images and patient data using freehand motions and gestures. This could be used to demonstrate medical problems and procedures to patients, as well as to show the effects on specific patients of lifestyle choices – such as smoking – and of various treatments, using their medical records.
The AR system uses motion-detection sensors and integrates expertise in freehand interaction in mixed reality in order to create a reliable, user-friendly experience.
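The article does not describe the system's implementation, but the freehand "swipe" interaction it mentions can be illustrated with a minimal sketch. The function below is purely hypothetical: it assumes a motion sensor (such as a Leap Motion or Kinect) supplies a stream of hand x-coordinates, and classifies a swipe from the net displacement over a short window. The threshold values are illustrative, not taken from the researchers' system.

```python
# Hypothetical sketch of freehand swipe detection from a motion sensor's
# stream of hand positions. Sensor API and thresholds are assumptions,
# not details of the Birmingham City University system.

def detect_swipe(x_positions, min_distance=0.30, max_samples=15):
    """Classify a swipe from recent hand x-coordinates (in metres).

    Returns "right" or "left" if the hand moved at least `min_distance`
    across the last `max_samples` samples, otherwise None.
    """
    if len(x_positions) < 2:
        return None
    window = x_positions[-max_samples:]      # only consider recent motion
    displacement = window[-1] - window[0]    # net horizontal travel
    if displacement >= min_distance:
        return "right"
    if displacement <= -min_distance:
        return "left"
    return None
```

In a real application this classifier would be fed continuously from the sensor's tracking loop, and a recognised swipe would trigger an action such as rotating the virtual organ or paging through patient records.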
AR and virtual reality (VR) are emerging as key technologies in healthcare, particularly in training clinicians to perform surgery and demonstrating procedures to patients. VR experiences are also being trialled to help patients relax in stressful situations, such as dental procedures, to introduce parents-to-be to their unborn babies, and to recreate traumatic experiences in a safe, controlled way for people suffering from post-traumatic stress disorder.
As the Birmingham City researchers continue to develop the AR system, they hope to extend it to replicate injuries, illnesses and disabilities, and to use customisable medical models to demonstrate how treatments or lifestyle choices could improve a patient’s quality of life. They hope this could boost patients’ engagement with their treatment choices.
A future iteration of the system could also display patient data in an array of settings, and allow surgeons to manipulate visualisations of patients’ bodies during surgery without the need to remove their sterile gloves and scrubs.