Supercomputers: getting inside invasive surgery
Increased access to high-performance computing power means physicians and technologists can reshape the frontiers of surgical procedure.
The spread of IT into medical applications has been steady over the last 40 years, but within the operating theatre it has encountered some profound barriers to entry. The use of computer-assisted technology to aid or augment human procedures has long been a theoretical possibility, but the massive back-end processing power required has not been available with the necessary flexibility, versatility, and affordability.
Supercomputers - the performance vehicles of the computer hardware world - have been around for decades, but the cost and complexity of these highly sophisticated platforms, engineered for applications where the highest processing speeds were an absolute prerequisite, precluded them from all but the most specialised and well-funded disciplines, such as meteorological modelling.
The emergence in recent years of easier access to supercomputing resources - and even so-called ‘personal supercomputers’ - is changing the situation. Computation in surgery has two distinct applications that can be categorised as extending the eyes and the hands of the surgeon, to reach otherwise inaccessible parts of the body, and perform operations remotely in the field. There is also a fast-emerging third area in pre-operative planning, where high-performance computing (HPC) is having a growing impact either by modelling possible courses of treatment, or in some cases calculating the shape of implants (see ‘Replacing The Parts’, p50).
Robotics comes into the category of extending the surgeon’s hands, although usually also involving imaging of some form to extend the eyes as well - but at present still under some form of master-slave relationship with the surgeon controlling the movement via a console.
Robotic surgery was pioneered in the military sphere, with Nasa and the US Defence Department leading the way in the early 1990s with the development of the Da Vinci ‘tele-presence surgery’ system, designed to operate on the battlefield with the surgeon safely remote in a hospital at home. Now a commercial system, Da Vinci comprises articulating instruments, including cameras, with the surgeon viewing the field of operation through binoculars providing a 3D video image and controlling the system via a console. The role of computation is to modulate both the image and the surgeon’s instructions; one advantage is that it captures, to some extent, the skill of the surgeon within the system and damps out small errors, in particular hand tremors.
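The tremor-damping idea can be illustrated with the simplest possible low-pass filter on the surgeon's console input - a sketch of the general principle only, not of Da Vinci's actual (proprietary) signal processing, and with purely illustrative values:

```python
# Sketch: damping hand tremor with an exponential moving average.
# A deliberately simple stand-in for real surgical motion filtering.

def smooth(samples, alpha=0.2):
    """Low-pass filter: high-frequency tremor is attenuated while the
    deliberate underlying motion passes through (with some lag)."""
    filtered, value = [], samples[0]
    for s in samples:
        value = alpha * s + (1 - alpha) * value
        filtered.append(value)
    return filtered

# A steady advance of about 1 mm per step, with tremor superimposed:
raw = [1.0, 2.3, 2.7, 4.3, 4.7, 6.3]
print(smooth(raw))  # the jitter between successive steps is reduced
```

The same principle, in far more sophisticated form, lets the system pass on the surgeon's intended motion while filtering out involuntary oscillation.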
The idea of extended dexterity has been taken further in other systems with the development of tiny robots that are in effect just micro-instruments too small for surgeons to operate directly, particularly for procedures in parts of the body where the proximity of delicate tissues or systems leaves little margin for error. This is the case with neural surgery, and for the head and neck in general, with such tiny robots close to being ready for clinical use in middle ear operations (see ‘Critical Criterion’, right).
Important though the role of IT is for controlling and navigating robots, this is not the cutting edge as far as HPC goes; however, HPC is closely involved in the imaging that makes the use of robots and novel small surgical instruments possible, as HP’s global HPC technology programme manager Frank Baetke points out: “3D rendering and image reconstruction from scans are by far the most demanding applications computationally,” he says, “compared to navigation and robotic control.”
Computer-aided imaging is having the strongest impact on existing procedures, particularly tumour surgery, where the challenge is to remove all malignant tissue in order to avoid recurrence of the cancer, while minimising collateral damage to healthy tissue. Even the process of taking biopsies (tissue samples) to test for cancer in a particular region can be dangerous, but the risks will soon be reduced through computer-enhanced imaging, using either CT (Computerised Tomography, i.e., 3D reconstruction from slices) or MRI (magnetic resonance imaging).
One such system has been developed by a German/Polish team. “Our research was to make biopsies safer with respect to neighbouring tissues,” says Matthias Helbig, an otolaryngologist (ear, nose, and throat) at the University Hospital of Frankfurt, involved in the project. “Though this system is not yet used in daily surgical work, it soon will be. It will be used initially for head and neck biopsies where accuracy is particularly critical to avoid damaging vital components.”
Another operation where great accuracy is needed for similar reasons lies in removal of shrapnel from injuries on the field of battle, requiring removal of all debris to avoid infection or damage to tissues, while leaving neighbouring structures intact. Israel has led the way here, developing technology that relies on computation to integrate two different imaging systems.
First, the location of the shrapnel is determined approximately in advance through a surgical navigation system which performs scanning before the operation. As the shrapnel can move slightly during the operation, the scans are updated by data generated by a metal detection system, enabling the exact real-time position of the debris to be superimposed on the background scan of the area of operation.
“Combining a metal detector probe and a surgical navigation system in this way significantly decreases the operative time and increases the surgeon’s confidence, especially where migration of the metal fragment occurs during searching and extracting,” says Rami Mosheiff, professor of Orthopaedic Surgery at the Hadassah-Hebrew University in Jerusalem.
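In outline, the fusion of the two imaging sources amounts to correcting the pre-operative fix with the displacement the metal detector reports. A minimal sketch, with purely illustrative coordinates and no relation to the actual system's software:

```python
# Sketch: overlaying a live metal-detector fix on a pre-operative scan.
# Coordinates are illustrative (mm, scanner frame), not real clinical data.

def fuse_position(preop_position, detector_offset):
    """Real-time fragment position: the pre-operative scan location
    corrected by the displacement the metal detector reports."""
    return tuple(p + d for p, d in zip(preop_position, detector_offset))

scan_fix = (42.0, -17.5, 88.0)   # where the pre-operative CT placed the fragment
migration = (1.2, -0.4, 0.0)     # displacement reported during the operation

live_fix = fuse_position(scan_fix, migration)
print(live_fix)  # the position superimposed on the background scan
```

The real system must of course also register the two coordinate frames against each other, which is where the serious computation lies.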
The idea of integrating different imaging or visualisation systems leads to one of the potentially most exciting avenues opened up by HPC, called virtual endoscopy. Like flight simulation technology, this mimics the effect of navigating through a patient, creating a powerful new tool for training, planning of surgery, and demonstrating to patients what they are about to have done to them.
Most exciting of all is the potential for combining the virtual endoscopy with real-time imaging, enabling for example the location of important regions to avoid, or that must be removed such as a tumour, to be superimposed onto the live image.
“The real image is directly comparable with the virtual view,” says Florian Schulze, a researcher at the Medical Visualization Centre in Vienna, Austria, specialising in virtual endoscopy: “At the same time the virtual view can be augmented with additional information.” Schulze and colleagues have described a procedure where this additional information comprises virtual images of blood vessels and nerves as well as tumour tissue itself that would not show up directly on the real images.
The effect is to combine different sources of imaging information including visual and also deep-scanning data to create a much more comprehensive overall view providing surgeons with vital clinical and navigational information in real time. But this is still work in progress, with Schulze admitting there is still work to do ensuring the virtual view of the anatomy is kept totally synchronised with the real one as details change, for example as the result of swelling caused by the surgery itself.
The tumour taker
A glimpse of how the surgical procedure of the future might operate came in April 2008, when a supercomputer-based system at the Texas Advanced Computing Center (TACC) in Austin destroyed prostate cancer tissue in a dog by directing lasers externally, dispensing not just with the surgeons but with their traditional tools as well. The supercomputer used was TACC’s Lonestar, a Dell Linux cluster with 11.6 Terabytes of memory and 5,840 processor cores (within 1,460 Dell PowerEdge blades, 16 PowerEdge 1850 compute-I/O server nodes, and two PowerEdge 2950 login/management nodes), with a peak performance of 62 Teraflops (trillion floating-point operations per second). Many of those cores were occupied by this procedure, analysing location information obtained by thermal imaging to direct the lasers accurately.
This could be applied to a substantial number of procedures, including many types of cancer treatment, where tissue has to be removed or destroyed; this can be done via external focusing of laser, ultrasound, or other types of radiation, controlled in response to feedback from various imaging systems. The process involves a pre-procedure phase that also makes heavy use of the supercomputer’s power, reports TACC science and technology writer Aaron Dubrow: “Several days before the surgery, the patient received an initial MRI that provides the topography of the medical region of interest.”
Using the data from the MRI and software available at TACC and ICES, a hexahedral mesh representing the biological domain as a 3D model is created and laser parameter pre-optimisation begins. In cancer treatment, optimisation means more than just determining where to point the laser and for how long. Doing maximum damage to the tumour must be balanced with protecting healthy tissue, while simultaneously minimising heat-shock proteins, whose expression can prevent tumour eradication.
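The balancing act described here can be thought of as a weighted objective that the optimiser scores each candidate laser setting against. The scoring function and weights below are illustrative assumptions, not the actual ICES optimisation code:

```python
# Sketch of the trade-off as a weighted score: reward tumour destruction,
# penalise collateral damage and heat-shock protein (HSP) expression.
# Weights and the 0-1 damage units are illustrative assumptions.

def treatment_score(tumour_damage, healthy_damage, hsp_expression,
                    w_healthy=2.0, w_hsp=1.0):
    """Higher is better; an optimiser would search laser settings
    (aim point, power, duration) to maximise this."""
    return tumour_damage - w_healthy * healthy_damage - w_hsp * hsp_expression

# Compare two candidate power profiles:
gentle = treatment_score(tumour_damage=0.70, healthy_damage=0.05, hsp_expression=0.30)
aggressive = treatment_score(tumour_damage=0.95, healthy_damage=0.25, hsp_expression=0.10)
best = max([("gentle", gentle), ("aggressive", aggressive)], key=lambda t: t[1])
```

The real optimisation is vastly harder because each candidate setting must be evaluated through a full thermal simulation over the patient-specific mesh.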
The treatment itself is in four stages:
1. Lonestar instructs the laser to heat the domain with a non-damaging calibration pulse.
2. The thermal MRI acquires baseline images of the heating and cooling of the patient’s tissue for model calibration.
3. Lonestar inputs this patient-specific information and re-computes the optimal power profile for the rest of the treatment.
4. Surgery begins, with remote visualisations and evolving predictions continuing throughout the procedure.
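The four stages can be sketched as a control loop; every class and method name below is a hypothetical stand-in for the real Lonestar, thermal-MRI, and laser integration, showing the shape of the procedure rather than its actual software:

```python
# The four treatment stages as a control loop, with mock hardware.

class Laser:
    def __init__(self):
        self.events = []
    def fire_calibration_pulse(self):          # stage 1: non-damaging pulse
        self.events.append("calibration")
    def apply(self, profile):                  # deliver the current power profile
        self.events.append(profile)

class ThermalMRI:
    def acquire_baseline(self):                # stage 2: baseline thermal images
        return "baseline-images"
    def acquire_frame(self):                   # per-step thermal feedback
        return "thermal-frame"

class TreatmentModel:
    def __init__(self, steps):
        self.steps_left = steps
    def calibrate(self, baseline):             # stage 3: patient-specific recalibration
        self.baseline = baseline
    def optimal_power_profile(self):
        return "profile-0"
    def treatment_complete(self):
        return self.steps_left == 0
    def update_prediction(self, frame):        # stage 4: evolving prediction
        self.steps_left -= 1
        return "profile-updated"

def run_treatment(laser, mri, model):
    laser.fire_calibration_pulse()             # 1. calibration pulse
    model.calibrate(mri.acquire_baseline())    # 2-3. baseline images, recalibration
    profile = model.optimal_power_profile()
    while not model.treatment_complete():      # 4. treat with live feedback
        laser.apply(profile)
        profile = model.update_prediction(mri.acquire_frame())

laser, mri, model = Laser(), ThermalMRI(), TreatmentModel(steps=3)
run_treatment(laser, mri, model)
```

The essential point is the closed loop: imaging feeds the model, the model re-optimises, and the laser is re-directed, all within the tight real-time window.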
“We had a 15-minute window in which a million things had to go right for this treatment to be successful,” David Fuentes, post-doctoral student at the University of Texas at Austin’s Institute for Computational Engineering and Sciences (ICES), and central developer of the project, told ZDNet. “There had to be no flaw, no silly bug, everything had to go perfectly.”
This is a highly computationally-intensive process, with many complex application stages performing in real time and demanding as much processing power as is available, with absolutely no margin for error or latency. In this case the canine patient died, but the operation was still judged a success because it proved the principle, even if outcomes must improve before such procedures are applied to humans.
The procedure demonstrated not just the application of computational power, but also accompanying infrastructure and software designed to maximise availability and safety (see ‘Critical Criterion’, p48).
HPC-based imaging will also be used for more conventional procedures involving instruments or robotic devices, where one of its primary roles will be to increase accuracy of navigation. Memory and storage performance are critical for such applications, and IBM has promoted its System x and Power Systems servers here, along with its BladeCenter servers integrated with the General Parallel File System, which is suited to accessing huge unstructured data sets.
Similar high performance is required for some forms of orthopaedic surgery, where the power is needed to construct accurate models for implants. Another emerging field for HPC is in modelling blood flow, which is required for best results in kidney dialysis. Although a long established procedure, kidney dialysis benefits from HPC through calculations of blood flow to optimise the size and positioning of the grafts that connect up the patient’s blood circulation to the machine.
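Why blood-flow calculations matter for graft sizing can be seen even in the simplest physics. Real HPC models solve the full three-dimensional haemodynamics, but Poiseuille's law for an idealised cylindrical vessel already shows why the graft radius dominates - flow scales with its fourth power. All figures below are illustrative assumptions:

```python
import math

# Illustrative physics only: laminar Poiseuille flow through an idealised
# cylindrical graft. Real dialysis models solve full 3D haemodynamics.

def poiseuille_flow(radius_m, pressure_drop_pa, viscosity_pa_s, length_m):
    """Volumetric flow rate (m^3/s) for laminar flow in a cylinder."""
    return (math.pi * radius_m**4 * pressure_drop_pa
            / (8 * viscosity_pa_s * length_m))

# Assumed figures: 2 mm vs 4 mm graft radius, 1 kPa pressure drop,
# blood viscosity ~3.5 mPa.s, 5 cm graft length.
q_small = poiseuille_flow(2e-3, 1000.0, 3.5e-3, 0.05)
q_large = poiseuille_flow(4e-3, 1000.0, 3.5e-3, 0.05)
print(q_large / q_small)  # doubling the radius gives 16x the flow
```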
MIMD
Multiple instruction stream, multiple data stream - a technique employed to achieve parallelism. Machines using MIMD have a number of processors that function asynchronously and independently.
SIMD
Single instruction, multiple data - a class of parallel computers in Flynn’s taxonomy of computer architectures, describing computers with multiple processing elements that perform the same operation on multiple data items simultaneously.
SMP
Symmetric multiprocessing - a multiprocessor computer hardware architecture where two or more identical processors are connected to a single shared main memory.
NUMA
Non-uniform memory access - a computer memory design used in multiprocessors, where the memory access time depends on the memory location relative to a processor. A processor under NUMA can access its own local memory faster than non-local memory.
NOTES
Natural orifice transluminal endoscopic surgery - an experimental surgical technique whereby ‘scarless’ abdominal operations can be performed with an endoscope passed through a natural bodily orifice, then through an internal incision in the stomach, vagina, bladder, or colon.
The critical criterion
While ICT has the potential to make surgery safer, for example by computerising best-practice, there are also potential risks resulting from system failure or software bugs.
To an extent we have been here before, over two decades ago, with so-called ‘fly-by-wire’ systems, when many people for the first time depended for their lives on computer systems working properly. Many of the key principles established then, such as redundant paths through software, and avoidance of single (or even dual) points of failure in hardware, can in principle be applied to ICT systems in the surgery, particularly to control robotic instruments. A good starting point for the software development is to follow Albert Einstein’s famous maxim, to make the system ‘as simple as possible, but no simpler’.
“We tried to reduce the complexity of the software to a minimum, for safety reasons, yes,” says Thomas Maier from the Technical University of Munich, co-developer of a new micro-manipulator for middle ear surgery - a procedure that is particularly challenging because the small, delicate, and critical nature of the structures there amplifies the impact of any mistake or hand tremor on the part of the surgeon.
The same principle applies to hardware, with Maier and colleagues implementing real-time functions in a single microcontroller to reduce the overall number of electronic components, and then developing ‘watchdog’ software to make sure the microcontroller itself operates only within specified safety parameters.
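The watchdog pattern is straightforward in outline: independent code checks that the controller's telemetry stays within hard safety limits and trips a halt when it does not. The parameter names and limits here are illustrative assumptions, not those of Maier's system:

```python
# Sketch of a watchdog: independent checks that the controller stays
# within hard safety limits. Parameter names and limits are illustrative.

SAFE_LIMITS = {
    "position_um": 500.0,    # maximum allowed displacement
    "velocity_um_s": 50.0,   # maximum allowed speed
    "current_ma": 20.0,      # maximum actuator drive current
}

def watchdog_check(telemetry):
    """Return the list of violated parameters; empty means all is well."""
    return [name for name, limit in SAFE_LIMITS.items()
            if abs(telemetry.get(name, 0.0)) > limit]

reading = {"position_um": 120.0, "velocity_um_s": 75.0, "current_ma": 5.0}
violations = watchdog_check(reading)
if violations:
    print("halt instrument:", violations)  # here the velocity is out of range
```

In practice the watchdog runs independently of the main control code, so a fault in one cannot silently disable the other.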
Indeed, computation must play a role in its own ‘policing’ in surgery, just as in the fly-by-wire technique, by preventing surgical tools from doing anything too hazardous in any event. This has led to the idea of the ‘safety kernel’, which would cut in if, for example, a robotic instrument deep within the body stepped outside its allowed range of positions or other operating parameters. This kernel is a layer between the application software and the instruments it controls, imposing well-defined constraints, recognising that it is impossible to be sure all bugs have been rooted out or all exceptions catered for. But there is a way to go yet before computer-driven robotic systems will be trusted entirely unaided for even straightforward procedures.
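A safety kernel of the kind described might, in skeletal form, look like the following - a minimal sketch under assumed limits, in which only commands inside the permitted envelope ever reach the instrument:

```python
# Minimal sketch of a safety kernel: a layer between application software
# and instrument that vets every command against hard constraints.
# The envelope values are assumed for illustration.

class SafetyKernel:
    def __init__(self, instrument_log, max_position_mm, max_step_mm):
        self.instrument_log = instrument_log   # stands in for the real instrument
        self.max_position_mm = max_position_mm
        self.max_step_mm = max_step_mm
        self.position = 0.0

    def move_to(self, target_mm):
        """Pass only safe commands through; reject everything else."""
        if abs(target_mm) > self.max_position_mm:
            return False                       # outside the allowed positions
        if abs(target_mm - self.position) > self.max_step_mm:
            return False                       # single step too large
        self.instrument_log.append(target_mm)  # vetted command reaches the instrument
        self.position = target_mm
        return True

commands_sent = []
kernel = SafetyKernel(commands_sent, max_position_mm=5.0, max_step_mm=0.5)
kernel.move_to(0.4)   # accepted
kernel.move_to(3.0)   # rejected: a 2.6 mm jump exceeds the 0.5 mm step limit
kernel.move_to(0.8)   # accepted
```

Because the kernel sits below the application, a buggy planner can request whatever it likes; the instrument still never receives an out-of-envelope command.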
Replacing the parts that old surgery couldn't reach
IT has recently started to open up some radically new treatments, often involving high-performance computing (HPC) for imaging or modelling. In some cases the new applications are live in the surgery, involving real-time imaging feedback either from instruments inside the patient or scanning systems nearby. In other cases the computationally intensive part of the procedure takes place in advance during planning, or occasionally in design of parts or implants used during the surgery.
The latter is happening for advanced hip-replacement surgery in treating complex bone tumours where previously complete amputation of the whole joint and limb beneath was the only way to save the patient. Now it is possible instead to replace the excised bone and other tissue with a titanium implant shaped exactly to fit the excavated area. First, a scan of the area is made using computer tomography, enabling a 3D representation of the required implant to be constructed, which is then submitted to a tooling centre to make the part. Computation is also required to insert the implant correctly and ensure it moves optimally afterwards, according to Rainer Burgkart, one of the pioneers of this procedure at the Technical University of Munich in Germany.
“Computation overcomes the major difficulty we had in getting the perfect position of the acetabulum (the concave surface of the pelvis where the ball of the femur sits to form the rotatable hip joint),” Burgkart explains, adding that he was also applying the technology to knee reconstruction after tumour removal. In that case the system also helps align the new joint properly; without this, the joint would cause pain and arthritic problems.
Another major frontier being breached largely because of advances in computation and imaging is ‘scarless surgery’, known as NOTES (Natural Orifice Transluminal Endoscopic Surgery), in which the instruments are inserted through one of the body’s openings. In fact scars still result, but only internally, and these are usually only minimal if the operation is done properly. NOTES in effect combines two traditional techniques, endoscopy, in which instruments are inserted through an orifice, and laparoscopy, otherwise known as keyhole surgery, where operations are performed with minimal cutting. With NOTES, the keyhole cut is made internally rather than on the surface, but this means that the endoscopic imaging must now be much more accurate, enabling the surgeon to manipulate instruments either remotely via a robotic control system, or directly but often without line of sight.
In some cases the images will be obtained optically by instruments inserted alongside the surgical tools, and in other cases via an external scanning system such as ultrasound - or different systems may be combined.