Tuesday, August 2, 2011

Supercomputer Performs Laser Cancer Surgery

Increased access to high-performance computing power means physicians and technologists can reshape the frontiers of surgical procedure.

The spread of IT into medical applications has been steady over the last 40 years, but within the operating theatre it has encountered some profound barriers to entry. The use of computer-assisted technology to aid or augment human procedures has long been a theoretical possibility, but the massive back-end processing power required has not been available with the necessary flexibility, versatility, and affordability.

Supercomputers - the performance vehicles of the computer hardware world - have been around for decades, but the cost and complexity of these highly sophisticated platforms, engineered for applications where the highest processing speeds were an absolute prerequisite, restricted their use to the most specialised and well-funded disciplines, such as meteorological modelling.

The emergence in recent years of easier access to supercomputing resources - and even so-called ‘personal supercomputers’ - is changing the situation. Computation in surgery has two distinct applications, which can be categorised as extending the eyes and the hands of the surgeon: to reach otherwise inaccessible parts of the body, and to perform operations remotely in the field. There is also a fast-emerging third area in pre-operative planning, where high-performance computing (HPC) is having a growing impact, either by modelling possible courses of treatment or, in some cases, by calculating the shape of implants.

Robotics falls into the category of extending the surgeon’s hands, although it usually also involves imaging of some form to extend the eyes as well - at present still under a master-slave relationship, with the surgeon controlling the movement via a console.

Robotic surgery was pioneered in the military sphere, with Nasa and the US Defence Department leading the way in the early 1990s with the development of the Da Vinci ‘tele-presence surgery’ system, designed to operate on the battlefield with the surgeon safely remote in a hospital at home. Now a commercial system, Da Vinci comprises articulating instruments, including cameras, with the surgeon viewing the field of operation through a binocular viewer that provides a 3D video image and controlling the system via a console. The role of computation is to modulate both the image and the surgeon’s instructions; one of the advantages is that it to some extent captures the surgeon’s skill within the system and damps out small errors, in particular hand tremor.
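To make the damping idea concrete, the sketch below shows one simple way a tele-presence console could scale down hand motion and low-pass-filter tremor before passing commands to the instruments. It is purely illustrative and is not the Da Vinci implementation; the class name, cut-off frequency, sample rate, and motion scale are all assumptions chosen for the example.

```python
# Illustrative sketch only: minimal motion scaling and tremor damping of the
# kind a tele-surgery console might apply to raw hand-position samples.
# Not the Da Vinci implementation; all parameter values are hypothetical.
import math

class ConsoleFilter:
    def __init__(self, motion_scale=0.2, cutoff_hz=2.0, sample_hz=100.0):
        # motion_scale < 1 maps large hand movements to small tool movements;
        # the low-pass cut-off suppresses tremor while passing slower,
        # deliberate motion.
        self.scale = motion_scale
        rc = 1.0 / (2.0 * math.pi * cutoff_hz)
        dt = 1.0 / sample_hz
        self.alpha = dt / (rc + dt)   # first-order IIR smoothing factor
        self.state = None             # last filtered position (x, y, z)

    def step(self, hand_pos):
        """Map one raw console sample (x, y, z in mm) to a tool command."""
        if self.state is None:
            self.state = hand_pos
        # Exponential smoothing: new = old + alpha * (raw - old)
        self.state = tuple(s + self.alpha * (h - s)
                           for s, h in zip(self.state, hand_pos))
        # Scale down the filtered motion for fine control.
        return tuple(self.scale * c for c in self.state)

f = ConsoleFilter()
print(f.step((10.0, 0.0, 0.0)))   # smoothed, scaled tool command
```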

The idea of extended dexterity has been taken further in other systems with the development of tiny robots that are in effect micro-instruments too small for surgeons to operate directly, particularly for procedures in parts of the body where the proximity of delicate tissues or systems leaves little margin for error. This is the case with neural surgery, and for the head and neck in general, with such tiny robots close to being ready for clinical use in middle-ear operations.

Computer-aided imaging

Important though the role of IT is for controlling and navigating robots, this is not the cutting edge as far as HPC goes; however, HPC is closely involved in the imaging that makes the use of robots and novel small surgical instruments possible, as HP’s global HPC technology programme manager Frank Baetke points out: “3D rendering and image reconstruction from scans are by far the most demanding applications computationally,” he says, “compared to navigation and robotic control.”

Computer-aided imaging is having the strongest impact on existing procedures, particularly tumour surgery, where the challenge is to remove all malignant tissue in order to avoid recurrence of the cancer while minimising collateral damage to healthy tissue. Even the process of taking biopsies (tissue samples) to test for cancer in a particular region can be dangerous, but the risks will soon be reduced through computer-enhanced imaging, using either CT (computerised tomography, i.e. 3D reconstruction from slices) or MRI (magnetic resonance imaging).
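As a rough illustration of what ‘3D reconstruction from slices’ means in practice, the sketch below stacks a series of 2D slices into a volume and reads off the value at a planned biopsy coordinate. It is a toy example, not clinical software; the array sizes, spacings, and target point are invented for the purpose.

```python
# Minimal sketch, not a clinical tool: assembling CT slices into a 3D
# volume and sampling the voxel nearest a planned biopsy coordinate.
# Slice spacing, pixel spacing and the target point are made-up numbers.
import numpy as np

n_slices, rows, cols = 40, 128, 128
slice_thickness_mm = 2.0
pixel_spacing_mm = 0.5

# Stand-in for slices loaded from a scanner (e.g. via a DICOM reader).
slices = [np.random.rand(rows, cols) for _ in range(n_slices)]
volume = np.stack(slices, axis=0)          # shape: (slice, row, col)

def voxel_at(x_mm, y_mm, z_mm):
    """Return the voxel value nearest a physical (x, y, z) position in mm."""
    k = int(round(z_mm / slice_thickness_mm))
    i = int(round(y_mm / pixel_spacing_mm))
    j = int(round(x_mm / pixel_spacing_mm))
    return volume[k, i, j]

# Planned biopsy target (hypothetical coordinates).
print(voxel_at(20.0, 15.0, 36.0))
```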

One such system has been developed by a German/Polish team. “Our research was to make biopsies safer with respect to neighbouring tissues,” says Matthias Helbig, an otolaryngologist (ear, nose, and throat specialist) at the University Hospital of Frankfurt who is involved in the project. “Though this system is not yet used in daily surgical work, it soon will be. It will be used initially for head and neck biopsies, where accuracy is particularly critical to avoid damaging vital components.”

Another operation where great accuracy is needed for similar reasons is the removal of shrapnel from injuries sustained on the field of battle, which requires removal of all debris to avoid infection or damage to tissues while leaving neighbouring structures intact. Israel has led the way here, developing technology that relies on computation to integrate two different imaging systems.

First, the location of the shrapnel is determined approximately in advance through a surgical navigation system which performs scanning before the operation. As the shrapnel can move slightly during the operation, the scans are updated by data generated by a metal detection system, enabling the exact real-time position of the debris to be superimposed on the background scan of the area of operation.
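A crude sketch of that fusion idea is given below: a pre-operative position estimate is nudged by successive metal-detector readings so that the displayed fragment position tracks any migration. The blend weight and all coordinates are hypothetical, and the real system will use far more sophisticated registration.

```python
# Rough sketch of the data-fusion idea described above: a pre-operative
# scan gives an initial shrapnel position, and intra-operative metal-
# detector readings nudge that estimate as the fragment migrates.
# The blend weight and all coordinates are hypothetical.
def fuse(preop_pos, detector_readings, weight=0.6):
    """Blend the pre-op estimate with successive detector fixes.

    weight is the trust placed in each new detector reading relative
    to the current estimate (a crude complementary filter).
    """
    est = list(preop_pos)
    for reading in detector_readings:
        est = [(1 - weight) * e + weight * r for e, r in zip(est, reading)]
    return tuple(est)

preop = (42.0, 17.5, 88.0)                       # mm, from the navigation scan
live = [(43.1, 17.2, 87.4), (43.6, 17.0, 87.1)]  # metal-detector fixes
print(fuse(preop, live))   # position to superimpose on the background scan
```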

“Combining a metal detector probe and a surgical navigation system in this way significantly decreases the operative time and increases the surgeon’s confidence, especially where migration of the metal fragment occurs during searching and extracting,” says Rami Mosheiff, professor of Orthopaedic Surgery at the Hadassah-Hebrew University in Jerusalem.

The idea of integrating different imaging or visualisation systems leads to one of the potentially most exciting avenues opened up by HPC, called virtual endoscopy. Like flight simulation technology, this mimics the effect of navigating through a patient, creating a powerful new tool for training, planning of surgery, and demonstrating to patients what they are about to have done to them.

Most exciting of all is the potential for combining virtual endoscopy with real-time imaging, enabling, for example, the location of important regions to avoid, or of tissue that must be removed such as a tumour, to be superimposed onto the live image.

“The real image is directly comparable with the virtual view,” says Florian Schulze, a researcher at the Medical Visualization Centre in Vienna, Austria, specialising in virtual endoscopy. “At the same time the virtual view can be augmented with additional information.” Schulze and colleagues have described a procedure where this additional information comprises virtual images of blood vessels and nerves, as well as the tumour tissue itself, that would not show up directly on the real images.

The effect is to combine different sources of imaging information, including visual and deep-scanning data, to create a much more comprehensive overall view, providing surgeons with vital clinical and navigational information in real time. This remains work in progress, however; Schulze admits that more needs to be done to keep the virtual view of the anatomy fully synchronised with the real one as details change, for example as a result of swelling caused by the surgery itself.
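The augmentation step itself can be illustrated very simply: assuming the virtual and real views are already registered, structures invisible to the camera can be alpha-blended onto the live frame. The sketch below is a toy version of that overlay, with arbitrary frame sizes, colours, and a hand-drawn ‘tumour’ mask; it is not the Vienna group’s code.

```python
# Toy illustration of the augmentation step: alpha-blending a coloured
# mask of structures that are invisible in the camera image (vessels,
# nerves, tumour) onto the live endoscopic frame, assuming the two views
# are already registered. Frame sizes, colours and masks are arbitrary.
import numpy as np

def augment(frame, mask, colour=(255, 0, 0), alpha=0.35):
    """Overlay a boolean mask onto an RGB frame (H x W x 3, uint8)."""
    out = frame.astype(np.float32)
    overlay = np.array(colour, dtype=np.float32)
    out[mask] = (1 - alpha) * out[mask] + alpha * overlay
    return out.astype(np.uint8)

frame = np.zeros((480, 640, 3), dtype=np.uint8)   # stand-in live image
mask = np.zeros((480, 640), dtype=bool)
mask[200:260, 300:380] = True                     # 'tumour' region from the virtual model
print(augment(frame, mask).shape)
```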


The tumour taker

A glimpse of how the surgical procedure of the future might operate came in April 2008, when a supercomputer-based system at the Texas Advanced Computing Center (TACC) in Austin destroyed prostate cancer tissue in a dog by directing lasers externally, dispensing not just with the surgeons but with their traditional tools as well. The supercomputer used was TACC’s Lonestar, a Dell Linux cluster with 11.6 Terabytes of memory and 5,840 processor cores (within 1,460 Dell PowerEdge blades, 16 PowerEdge 1850 compute-I/O server nodes, and two PowerEdge 2950 login/management nodes), many of which were occupied during this procedure analysing location information obtained by thermal imaging so that the lasers could be directed accurately. Lonestar has a peak performance of 62 teraflops (trillion floating-point operations per second).

This could be applied to a substantial number of procedures, including many types of cancer treatment, where tissue has to be removed or destroyed; this can be done via external focusing of laser, ultrasound, or other types of radiation, controlled in response to feedback from various imaging systems. The process involved a pre-procedure phase that also made heavy use of the supercomputer’s power, reports TACC science and technology writer Aaron Dubrow: “Several days before the surgery, the patient received an initial MRI that provided the topography of the medical region of interest.”

Using the data from the MRI and software available at TACC and ICES, a hexahedral mesh representing the biological domain as a 3D model is created and laser-parameter pre-optimisation begins. In cancer treatment, optimisation means more than just determining where to point the laser and for how long: doing maximum damage to the tumour must be balanced with protecting healthy tissue, while simultaneously minimising the expression of heat-shock proteins, which can prevent tumour eradication.
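To give a flavour of that balancing act, the sketch below scores a candidate temperature field with a weighted cost that penalises under-treated tumour, overheated healthy tissue, and the sub-lethal temperature band associated with heat-shock protein expression. The thresholds, weights, and random test data are illustrative assumptions, not the ICES/TACC model.

```python
# Schematic of the trade-off described above, not the ICES/TACC code:
# a weighted cost that rewards heating tumour voxels into the ablation
# range, penalises overheating healthy tissue, and penalises the
# sub-lethal temperatures associated with heat-shock protein expression.
# Thresholds and weights are illustrative guesses.
import numpy as np

ABLATE_C = 57.0                       # assumed coagulation threshold, deg C
HSP_LOW_C, HSP_HIGH_C = 43.0, 50.0    # assumed sub-lethal 'heat-shock' band
SAFE_C = 44.0                         # assumed damage threshold for healthy tissue

def cost(temp, tumour_mask, w_tumour=1.0, w_healthy=2.0, w_hsp=0.5):
    """temp: temperature field (deg C); tumour_mask: boolean array."""
    under_treated = np.sum(tumour_mask & (temp < ABLATE_C))
    healthy_damage = np.sum(~tumour_mask & (temp > SAFE_C))
    hsp_zone = np.sum(tumour_mask & (temp >= HSP_LOW_C) & (temp < HSP_HIGH_C))
    return w_tumour * under_treated + w_healthy * healthy_damage + w_hsp * hsp_zone

temp = 37.0 + 25.0 * np.random.rand(32, 32, 32)   # fake temperature field
tumour = np.zeros((32, 32, 32), dtype=bool)
tumour[12:20, 12:20, 12:20] = True
print(cost(temp, tumour))   # lower is better; an optimiser would tune laser power and aim
```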


The treatment itself is in four stages (a simplified code sketch follows the list):

1. Lonestar instructs the laser to heat the domain with a non-damaging calibration pulse.

2. The thermal MRI acquires baseline images of the heating and cooling of the patient’s tissue for model calibration.

3. Lonestar takes in this patient-specific information and re-computes the optimal power profile for the rest of the treatment.

4. Surgery begins, with remote visualisations and evolving predictions continuing throughout the procedure.
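The following sketch mirrors those four stages as a bare-bones control loop. Every function body is a placeholder with made-up numbers; it shows only the shape of the workflow, not the real Lonestar pipeline.

```python
# A highly simplified pseudo-workflow mirroring the four stages above;
# every function body here is a placeholder, not the real Lonestar pipeline.
import random

def calibration_pulse():
    """Stage 1: fire a non-damaging pulse and return its power setting."""
    return 2.0  # watts, hypothetical

def acquire_thermal_baseline():
    """Stage 2: thermal MRI of heating/cooling for model calibration."""
    return [37.0 + random.random() for _ in range(10)]   # fake temperature samples

def recompute_power_profile(baseline):
    """Stage 3: refit the bio-heat model to the baseline and re-optimise."""
    mean_response = sum(baseline) / len(baseline)
    return max(1.0, 15.0 - 0.2 * mean_response)          # toy power schedule, watts

def treat(power, steps=5):
    """Stage 4: run the treatment, updating predictions from live imaging."""
    for step in range(steps):
        live_temp = 37.0 + power * random.random()       # stand-in thermal MRI feedback
        print(f"step {step}: power={power:.1f} W, measured {live_temp:.1f} C")
        if live_temp > 60.0:                             # crude safety clamp
            power *= 0.8

calibration_pulse()
baseline = acquire_thermal_baseline()
treat(recompute_power_profile(baseline))
```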

“We had a 15-minute window in which a million things had to go right for this treatment to be successful,” David Fuentes, a post-doctoral researcher at the University of Texas at Austin’s Institute for Computational Engineering and Sciences (ICES) and a central developer of the project, told ZDNet. “There had to be no flaw, no silly bug; everything had to go perfectly.”

This is a highly computationally intensive process, with many complex application stages running in real time and demanding as much processing power as is available, with absolutely no margin for error or latency. In this case the canine patient died, but the operation was still judged a success because it proved the principle, even if improvements in outcome are required before such procedures are applied to humans.

The procedure demonstrated not just the application of computational power, but also accompanying infrastructure and software designed to maximise availability and safety.

HPC-based imaging will also be used for more conventional procedures involving instruments or robotic devices, where one of its primary roles will be to increase the accuracy of navigation. Memory and storage performance are critical for such applications, and IBM has promoted its System x and Power Systems servers here, along with its BladeCenter servers integrated with the General Parallel File System, which is suited to accessing huge unstructured data sets.

Similar high performance is required for some forms of orthopaedic surgery, where the power is needed to construct accurate models for implants. Another emerging field for HPC is in modelling blood flow, which is required for best results in kidney dialysis. Although dialysis is a long-established procedure, it benefits from HPC through calculations of blood flow to optimise the size and positioning of the grafts that connect the patient’s circulation to the machine.
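As a back-of-the-envelope indication of why graft geometry matters, the sketch below applies the Hagen-Poiseuille relation for laminar flow in a straight tube. Real HPC models solve full 3D, pulsatile, non-Newtonian flow; the viscosity, dimensions, and pressure drop used here are only indicative.

```python
# Back-of-the-envelope sketch of why graft geometry matters: the
# Hagen-Poiseuille relation for laminar flow, Q = pi * r^4 * dP / (8 * mu * L).
# Real HPC models solve full 3D, pulsatile, non-Newtonian flow; the
# numbers below (pressure drop, viscosity, dimensions) are only indicative.
import math

def poiseuille_flow(radius_m, length_m, delta_p_pa, viscosity_pa_s=3.5e-3):
    """Volumetric flow rate (m^3/s) through a straight cylindrical graft."""
    return math.pi * radius_m**4 * delta_p_pa / (8.0 * viscosity_pa_s * length_m)

# A 6 mm diameter, 15 cm graft with an assumed ~275 Pa pressure drop:
q = poiseuille_flow(radius_m=3e-3, length_m=0.15, delta_p_pa=275.0)
print(f"{q * 6e7:.0f} ml/min")   # convert m^3/s to ml/min (~1000 ml/min here)
```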
