Digital Surgery introduces mixed reality

Virtual reality offering a new dimension to surgery

Image credit: Osso VR

From gaming technology to the operating table, virtual reality and augmented reality are becoming valuable tools in surgical procedures and healthcare.

When delivering bad news, you may not expect a neurosurgeon to ask you to wear a headset and grab a gaming controller – but doing so is proving an effective part of pre-surgical consultations. Virtual reality (VR) is easing anxiety for patients facing neurosurgery.

Immersed in a virtual world of 3D, computer-generated graphics, neurosurgeons can take a ‘tour’ of an individual’s brain, using the headset to view a 3D representation compiled from computerised tomography (CT) scans, which are themselves built up from X-ray images. The consultant can explain what they are seeing as they explore the VR model.

Part of the reassurance comes from the fact that the neurosurgeon has already used the same VR technology to plan and prepare for the surgery. A typical MRI (magnetic resonance imaging) scan can only show a 2D ‘slice’ of the brain and cannot show the location of blood vessels, for example. As this differs from person to person, it is vital to know where the vessels are when planning the initial incision.

“This is easy to visualise using VR,” says Abdul Hamid Halabi, global business development lead for health and life sciences at Nvidia, which designs graphics processing units (GPUs) and chips used in gaming and imaging systems.

As well as putting patients at ease before an operation and helping surgeons plan the most efficient surgical procedure, VR is being used to train surgeons.

Several companies have developed VR software as a medical training tool. One is Osso VR. The company’s eponymous product enables registrars, medical students and surgeons to practise a typical, straightforward procedure efficiently until it becomes second nature.

Using a commercially available Oculus Rift headset and two controllers to track hand motions, students can simulate parts of a procedure, such as a knee replacement, and accelerate them to familiarise themselves with new techniques. Students can also set aside 20 to 30 minutes of VR training, sometimes covering multiple ‘scenarios’ (procedures), says co-founder and CEO Dr Justin Barad.

Osso VR uses a library of images, some created in-house and some that are reconstructions of case studies.

“VR can help surgeons learn common procedures and then throw in some anomalies, for example a dysmorphic sacrum, to practise on a challenging anatomy,” explains Barad, who originally wanted to be a game developer before entering medicine.

In a validation study, where one group received traditional training and another had 15 to 20 minutes of training using Osso VR, the VR group performed better in tests measuring skills such as instrument handling, the flow of the operation and knowledge of the specific procedure.

A report from the World Health Organization found that “the primary reasons for adverse events with new technologies are the result of improper training and long learning curves”. However, conventional off-site training is time-consuming and expensive, so being able to train on new equipment or techniques in VR will encourage innovation as well as improve skills.

Barad, a qualified orthopaedic surgeon, saw that many resident doctors, despite years of study, still could not operate confidently because new techniques and procedures are continually being introduced without enough time to learn them. “VR allows you to get your hands into a simulation to walk through the surgical procedure in team training,” he enthuses.

Studies show that VR’s immersive training improves performance by 230 per cent compared with text- and video-based learning, he adds.

System choices

Pick your reality

Virtual reality is a 3D, computer-generated environment. Using a headset or gloves fitted with sensors, the user can see, touch and interact with their surroundings. VR primarily uses sight and sound and increasingly touch, with haptic systems.

To view virtual objects from all angles, frame refresh rates should be a minimum of 30 frames per second. Many early VR systems ran at 60 frames per second, but even at that rate the display lag led to motion sickness: when the user stands up, for example, the virtual environment cannot track the movement quickly enough. Clive Brooks, micro-LED applications director at Plessey, describes the resulting ‘smear’ effect, which causes the user to lose their balance, as “the scenery wobbling”.
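The trade-off between refresh rate and lag can be seen with some back-of-envelope arithmetic: each extra frame per second shrinks the time available to render a frame. The figures below are illustrative, not specifications of any particular headset.

```python
# Per-frame rendering budget at a given display refresh rate.
def frame_budget_ms(fps: float) -> float:
    """Time available to render one frame, in milliseconds."""
    return 1000.0 / fps

# Rates mentioned in the article: a 30 fps minimum, early 60 fps
# systems, and the 90 fps used by modern VR renderers.
for fps in (30, 60, 90):
    print(f"{fps} fps -> {frame_budget_ms(fps):.1f} ms per frame")
```

At 90 frames per second the renderer has barely 11 ms to produce each frame, which is why VR systems lean so heavily on GPU performance.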

Augmented reality adds virtual objects, sounds and text to the real-world environment and allows the user to interact with an object or, in the case of medical AR, a patient. ‘Pokémon Go’ is an example of AR. More complex than a screen-based VR system, AR systems can incorporate small cameras that feed signals to the software to adjust the field of view, or observable area. This takes a lot of processing power.

Google Glass was an early example of AR, and last year Vuzix introduced Vuzix Blade, a pair of AR smart glasses with a ‘floating’ screen that adds extra information, so there is no need to hold a mobile phone while using them.

The company has ditched OLEDs (organic light-emitting diodes) in favour of micro-LEDs from Plessey Semiconductor, which offer higher resolution, contrast and luminance than OLEDs while consuming half the power. This allows the power source to fit comfortably inside the frames.

Mixed reality also blends the physical and virtual worlds but uses both real and virtual objects in a real environment, for example using a real operating theatre but a virtual patient.

Some surgical teams are using VR ahead of a procedure to familiarise themselves with the operation and with each other’s working practices. Typically, there are six or more people in an operating theatre and they will often not have had the chance to work together beforehand. VR allows the assembled team to run through a simulation quickly; Barad describes it as “a refresher – to warm up – in VR”.

Specialist teams may not meet at all but liaise from different locations, and this can also be achieved using VR.

Digital Surgery’s training product, Touch Surgery, uses augmented reality (AR) and VR simulations of surgical procedures. The company is a Microsoft Mixed Reality Partner (MMRP) and, as part of that programme, delivers content that system integrators and developers will use in VR systems.

“Using mixed reality, we use a real operating theatre but a virtual patient,” explains Digital Surgery’s co-founder and CEO, Dr Jean Nehme.

Last year, Digital Surgery announced that it will use Microsoft’s HoloLens headset and artificial intelligence (AI), with a database of digital surgical processes, to train a computer to understand procedures and predict the next steps in the operating room.

Touch Surgery software is currently used to prepare and train surgeons and to test procedures via PCs and mobile phones.

Nvidia’s Halabi says AR is emerging for medical use, building on the use of VR and robotic surgery. He believes the first use cases will be for AR to apply a scan on top of the patient during surgery. “Switching in and out of VR is not practical during surgery – I see more applications of AR over VR,” he explains.

Robotic surgery, popular because it is minimally invasive, is an early example of AR in the operating theatre. Miniaturised instruments on the ends of robotic arms are operated by the surgeon, who watches magnified images from a 3D camera and controls three or more arms via a console in the operating theatre.

The image viewed is the one the surgeon would see if standing at the operating table looking down on the patient, but AR allows data to be overlaid on this image to augment what the surgeon sees. To progress development, Intuitive Surgical, which makes the da Vinci robotic system, has announced funding grants for 2019 to support research into surgical robotics, enhanced visualisation or rendering, and AR.

Canadian start-up Aris MD is developing an AR and VR system to process 2D scan images and convert them to 3D images, which are overlaid on the patient during surgery. Using diagnostic images, from MRI, CT, X-ray or ultrasound scans, the technology provides a map of the patient’s anatomy and injury. Such a ‘map’ can save time in the operating theatre and reduce the time the patient is under anaesthetic.

VR library

Creating a ‘flight simulator’ for brain surgeons

Brain surgeons at St Bartholomew’s Hospital, London, have launched project Brainbook, a library of 360-degree VR films of brain surgery. The first VR film went online in September 2017 and shows a patient undergoing surgery for a brain aneurysm.

A combination of 360-degree cameras shows the operating theatre as if the viewer were the patient being wheeled into surgery. Head-mounted HD cameras show the operation from the surgeon’s point of view.

The medical team worked with FundamentalVR, a UK-based VR simulation company that specialises in clinical training. Its Fundamental Surgery software as a service (SaaS) is described as a flight simulator for surgery. It combines VR images of tissue with haptic feedback for orthopaedic surgeons.

With the procedure recorded in this way, FundamentalVR can recreate the surgery in a VR simulation. Co-founder Chris Scattergood says: “Trainee surgeons can practise the key stages of the procedure in a safe virtual operating room and actually feel in their hands the textures of all the different tissue types using FeelReal VR [for haptic feedback].”

The system works with HTC Vive and Oculus Rift headsets and a PC or tablet.

Fundamental Surgery hopes to extend its simulations into laparoscopic, general, cardiovascular and otolaryngology (ear, nose and throat, or ENT) surgery.

The next step is remote surgery, with teams of surgeons located around the world working together on the same procedure. Already, a consultant can use an AR platform such as Proximie to indicate with a pen where to make an incision, or to highlight an area for the surgical team in the theatre, without leaving their desk.

“The technology and ability to have the surgeon sit in the next room is there,” says Halabi. “But the challenge for remote surgery is turnaround time,” he acknowledges. “If a surgeon makes a move, you need to be able to show it in about 70 milliseconds. That is doable if you have a solid internet connection – it has been done remotely across the globe, using robotic surgery,” he adds.
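That 70-millisecond figure can be put in context with a rough propagation-delay estimate: light travels through optical fibre at roughly 200,000 km/s (about two-thirds of its vacuum speed), which sets a hard floor on round-trip time. The city-pair distances below are illustrative straight-line figures; real routes are longer and add switching and processing delay on top.

```python
# Back-of-envelope minimum round-trip delay through optical fibre.
FIBRE_LIGHT_SPEED_KM_S = 200_000  # approximate speed of light in fibre

def round_trip_ms(distance_km: float) -> float:
    """Minimum round-trip propagation time, in milliseconds."""
    return 2 * distance_km / FIBRE_LIGHT_SPEED_KM_S * 1000

# Illustrative distances, not measured network routes.
for route, km in [("London-Paris", 340), ("London-Mumbai", 7_200)]:
    print(f"{route}: at least {round_trip_ms(km):.0f} ms round trip")
```

A continental link sits comfortably inside the 70 ms budget, while an intercontinental one approaches it on propagation delay alone, which is why the quality of the connection matters so much for remote surgery.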

Professor Shafi Ahmed is a colorectal surgeon at St Bartholomew’s Hospital (Barts) in London and has earned the nickname ‘the virtual surgeon’ for his pioneering work in virtual technology for medical collaboration.

In October 2017, he operated on a patient to remove bowel cancer, and was joined, in avatar form, by Professor Shailesh Shrikhande, a cancer surgeon at Tata Memorial Hospital in Mumbai, India, and Hitesh Patel, consultant colorectal surgeon at the London Independent Hospital.

Each surgeon wore a HoloLens headset running the Aetho Thrive app, which placed 3D avatars in the operating theatre. Via these avatars, each specialist could point to 3D holograms of the tumour, created from the patient’s scans, to assist with or study the operation.

The technology could allow specialists to guide or participate in operations when they are called away to another location, or in emergencies that demand their specialist skills when they cannot be physically present.

Software and hardware companies are working to use VR and AR in all areas of healthcare. It is estimated that around 400 healthcare VR start-ups raised $4bn in funding in 2017, up from approximately $2.5bn the year before. Physicians are keen to adopt the technology, with VR and AR centres in hospitals experimenting and adapting specialist systems.

There are limitations today: VR headsets can only be worn for short periods before they become heavy or uncomfortably hot. It is unlikely that headsets in their present form will be used in the operating theatre, although hand tracking could replace controllers there. Exoskeleton and haptic gloves are available today, but there is a time lag between manoeuvre and execution, and they are not yet precise enough for surgical work.

Despite these limitations, industry insiders like Jean Nehme believe that the hardware and software issues will be resolved to bring VR and AR into the operating room within two years.


Picture perfect

Researchers at the University of Basel, Switzerland, are developing ways to use VR and AR for minimally invasive robotic bone surgery.

The university’s MIRACLE (Minimally Invasive Robot-Assisted Computer-guided LaserosteotomE) project is developing a robotic endoscope that uses laser light, enabling precise, small cuts and contact-free minimally invasive surgery for orthopaedics, otolaryngology, traumatology, spinal, neuro- and cranio-maxillofacial surgery.

The planning and navigation team is developing a real-time system to pinpoint the position of the laser for the incision. Today’s passive and optical markers do not offer the line of sight required to monitor the position of the laser tip.

While visualisation of CT data is not new, Balázs Faludi and Marek Zelechowski are leading the team’s project to accelerate it by taking a ‘raw’ CT image and constructing a 3D image directly from the dataset of 2D scan slices.

Current visualisation techniques segment the CT image data to create a 3D mesh object. This stage can be eliminated using commercially available graphics cards and the project’s custom software.

Voxels (the 3D equivalent of pixels in 2D images) are rendered directly from the DICOM (Digital Imaging and Communications in Medicine) file. No pre-processing or segmentation, manual or automatic, is required. “A couple of seconds after the scan is saved, we can visualise it in VR,” says Faludi.
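The general idea of rendering straight from voxels, with no mesh-extraction step, can be sketched with a simple maximum-intensity projection. This is not SpectoVive’s renderer, which has not been published; it only illustrates how a voxel grid can be turned into an image directly. Real CT data would be read from DICOM files (for example with the pydicom library) into a similar array; here a synthetic volume stands in for the scan.

```python
# Minimal sketch of direct volume rendering on a voxel grid:
# a maximum-intensity projection (MIP) along one axis, with no
# segmentation or mesh generation.
import numpy as np

# Synthetic 64^3 volume standing in for a stack of 2D CT slices.
volume = np.zeros((64, 64, 64), dtype=np.float32)
volume[20:40, 20:40, 20:40] = 0.5   # soft "tissue" block
volume[28:34, 28:34, 28:34] = 1.0   # dense "bone" core

def mip(vol: np.ndarray, axis: int = 0) -> np.ndarray:
    """Project the brightest voxel along one axis onto a 2D image."""
    return vol.max(axis=axis)

image = mip(volume)
print(image.shape)   # one projected value per viewing ray
print(image.max())   # the densest structure dominates the image
```

A production renderer would instead ray-march the volume on the GPU and map voxel densities to colours through a transfer function, but the key point is the same: the image comes straight from the voxel data, with no intermediate 3D mesh.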

Algorithms optimise the VR rendering to run at 90 frames per second and assign colour values to render tissue, blood vessels and bone.

Part of the HTC Vive Tracker development program, the system – called SpectoVive – uses HTC Vive glasses. The imaging objects can be scaled, moved and cut in real time, for use in planning, diagnostics and teaching.
