"Never doubt that a small group of thoughtful, committed citizens can change the world. Indeed, it is the only thing that ever has."

Margaret Mead
Technical report
peer-reviewed

Surgeon at a Workstation: Information Age Surgery



Abstract

A new model of surgery is evolving to take full advantage of the improving technology available in the operating room. Surgeons have an increasing wealth of data available to them but currently cannot effectively access that information during surgery. The next stage in Information Age surgery will see the relocation of the surgeon to a workstation from which data can be accessed and manipulated without interrupting the surgical procedure. This workstation will allow the surgeon to control a surgical robotic system with audio, visual, and haptic sensory capabilities in order to replicate, and eventually augment, the sounds, sights, and tactile sensations of surgery. The potential benefits of a workstation include case rehearsal, individualized case profiles, real-time imaging, and immediate connectivity to a network of global medical data, resulting in greater efficiency and better outcomes. Surgical robotic systems offering these features are currently being developed and are well suited to the treatment of brain tumors. NeuroArm, a robotic technology developed at the University of Calgary, Canada, includes a sensory immersive workstation and has now been applied to over 30 neurosurgical patients.

Introduction

Surgeon at a workstation: information age surgery

We live in a world of information.  Technology allows data to accumulate faster than ever before.  In medicine, this Information Age has resulted in an increasing need for data manipulation during the planning and performance of surgery [1].  The scope of Information Age surgery includes the growing field of surgical robotics, which is leading towards the integration of the multiple technologies present in the operating room into a single console.  Human-machine interfaces (HMI) have been present in the operating room since electrocautery was introduced by Cushing and Bovie in 1926 at Harvard's Peter Bent Brigham Hospital.  Many other technologies have since converged on the surgical suite, including microscopes, endoscopes, surgical navigation, computed tomography (CT) scanners, magnetic resonance (MR) imaging devices, ultrasonic aspirators, and electrophysiological monitoring with electrode placement.  These devices produce an array of sophisticated data that, if optimally utilized, could greatly facilitate the performance and safety of surgery.  Although not limited to this population, such integration could greatly augment the care of patients with brain tumors.  It is currently difficult for surgeons to operate the machines, navigate the surgical space, and perform surgery while taking full advantage of the data provided.  The next step in technological integration represents a fundamental shift in the way surgery will be performed: the surgeon will relocate away from the operative site [2].

A remote workstation will enable the surgeon to conduct an operation, while manipulating the technology present in the operating room and the data it produces (Figure 1).

For surgeons who have grown up in the Information Age, this workstation will provide the connection to digital knowledge that they have come to expect.  Immediate access to online information will allow surgeons to retrieve knowledge held in a database rather than rely on personal memory, much as the calculator shifted the focus of mathematics from the process of calculation to the meaning of the result.  This will move the art of surgery towards a reproducible science.  The workstation also makes case rehearsal possible.  Rehearsal increases the experience of each surgeon and provides the ability to practice seldom-used procedures.  In tandem with rehearsal, health care personnel could use the workstation to compile a personalized case card that outlines the requirements of a particular procedure, so that the specific instrument selection for each individual case can be prepared in advance.  This ensures that the correct tools, many of which are disposed of after a single use, are in place for the procedure, optimizing resource use in this time of fiscal restraint.
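As a minimal illustration of how such a case card might be represented in software, the sketch below defines a hypothetical per-procedure profile and flags single-use items for restocking.  The structure, field names, and instrument list are assumptions made for illustration and are not drawn from any existing hospital or neuroArm system.

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Instrument:
        name: str
        single_use: bool = False  # disposable items must be restocked for each case

    @dataclass
    class CaseCard:
        """Hypothetical per-procedure profile compiled at the workstation in advance."""
        procedure: str
        surgeon: str
        instruments: List[Instrument] = field(default_factory=list)

        def disposables(self) -> List[str]:
            """Items that must be supplied fresh for this case."""
            return [i.name for i in self.instruments if i.single_use]

    # Example profile for an individual case, prepared ahead of surgery.
    card = CaseCard(
        procedure="craniotomy for tumor resection",
        surgeon="surgeon A",
        instruments=[
            Instrument("bipolar forceps"),
            Instrument("ultrasonic aspirator handpiece", single_use=True),
            Instrument("microscissors"),
        ],
    )
    print(card.disposables())  # -> ['ultrasonic aspirator handpiece']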

The human brain has evolved to respond to an incomplete data set based on sensory information, a function that computers cannot yet replicate [3].  Human senses are interconnected; to remove one significantly alters perception, which in surgeons could result in a decrease in surgical performance.  Therefore, a robotic system should respond to the workings of the human brain, and the workstation should reproduce human senses as closely as possible, so that surgeons can continue to use their past experience to respond to subtle sensory cues.  While current technologies preclude the exact replication of the surgical environment at the workstation, advances in visual, audio, and haptic components indicate that eventually the sensory feedback produced will enhance human capabilities, so that surgeons will be able to see what cannot be seen, hear what cannot be heard, and touch what cannot be felt. 
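The cited idea that the brain combines noisy, incomplete sensory cues in a statistically near-optimal way [3] can be pictured with a simple two-cue fusion sketch, in which each estimate is weighted by its reliability (inverse variance); losing one cue leaves a less certain percept.  The quantities and numbers below are arbitrary illustrations, not measurements.

    # Minimal sketch of reliability-weighted (Bayesian) fusion of two sensory cues,
    # e.g. a visual and a haptic estimate of tissue stiffness.  Numbers are arbitrary.

    def fuse(mu_a, var_a, mu_b, var_b):
        """Combine two Gaussian estimates, weighting each by its inverse variance."""
        w_a, w_b = 1.0 / var_a, 1.0 / var_b
        mu = (w_a * mu_a + w_b * mu_b) / (w_a + w_b)
        var = 1.0 / (w_a + w_b)  # the fused estimate is more reliable than either cue alone
        return mu, var

    # Visual cue: stiffness ~4.0 (noisier); haptic cue: stiffness ~5.0 (more reliable).
    print(fuse(4.0, 2.0, 5.0, 0.5))  # -> (4.8, 0.4); dropping either cue raises the variance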

Technical Report

Collaboration between science, engineering, and medicine has resulted in the production of neuroArm, an MR-compatible, image-guided robot capable of both microsurgery and stereotaxy [4].  NeuroArm was developed by the University of Calgary (Calgary, Alberta, Canada) in conjunction with MacDonald, Dettwiler and Associates (Brampton, Ontario, Canada), the aerospace company that created the Canadarm for NASA's Space Shuttle program and the Special Purpose Dexterous Manipulator for the International Space Station.  The HMI for neuroArm consists of a 3D MR display, a virtual robot display, cameras that observe the entire operative site, a stereoscopic display fed by high-definition cameras projecting to a 3D monitor, and modified PHANTOM haptic hand controllers that provide translational forces in the X, Y, and Z directions to the surgeon.  This HMI is operated from a remote workstation designed to maximize the surgeon's ability to coordinate surgery with imaging data.  NeuroArm, together with its sensory immersive workstation, has now been successfully integrated into neurosurgical practice, with over 30 cases completed.  In an orchestrated fashion, the surgeon, seated at the workstation, was able to interact with the intraoperatively acquired MR imaging data, communicate effectively with the surgical team, and conduct microsurgery.
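A highly simplified sketch of the kind of master-slave mapping such an interface performs is shown below: hand-controller displacements are scaled down into fine tool motion, and measured tool-tip forces are passed back for display on the hand controllers.  The gains and loop structure are assumptions for illustration only and do not describe the actual neuroArm control scheme.

    import numpy as np

    # Illustrative master-slave mapping; the gain is assumed, not a neuroArm parameter.
    MOTION_SCALE = 0.2  # 10 mm of hand motion -> 2 mm of tool motion

    def command_tool(tool_pos, hand_delta):
        """Advance the tool position by a scaled copy of the hand displacement (x, y, z)."""
        return np.asarray(tool_pos, dtype=float) + MOTION_SCALE * np.asarray(hand_delta, dtype=float)

    def feedback_force(measured_force):
        """Translational force (x, y, z) returned to the hand controller."""
        return np.asarray(measured_force, dtype=float)  # passed through unmodified here

    tool = np.zeros(3)
    tool = command_tool(tool, hand_delta=[10.0, 0.0, -5.0])  # millimetres
    print(tool)                              # -> [ 2.  0. -1.]
    print(feedback_force([0.05, 0.0, 0.1]))  # newtons measured at the tool tip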

NeuroArm’s haptic feedback, while sufficient, is currently limited to 3 degrees of freedom (DOF), while the manipulators operate with 7 DOF.  Since the construction of neuroArm, emerging technologies have become able to deliver force feedback in 7 DOF, reflecting significant progress in the field of haptics propelled by advances in electronics.  Beyond additional degrees of freedom, haptic feedback needs to develop further to convey hardness and temperature and to fully recreate proprioception.  If force sensors can return these sensations to the surgeon, the use of surgical robots will become more intuitive.  Once touch has been replicated, sensations could be enhanced beyond unaided human capabilities [5].  For instance, some elite surgeons can currently join vessels that are too small for most surgeons to feel.  If all surgeons are given the ability to feel those vessels through an augmented sense of touch, the procedure will become easier and standards of practice will rise.
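Two of these points can be sketched numerically: a 3-DOF display can render only the force part of a measured wrench (force plus torque), and amplifying very small forces can bring sub-threshold contact, such as a tiny vessel, above what the unaided hand can feel.  The threshold, gain, and wrench values below are assumptions chosen for illustration, not measured properties of any device.

    import numpy as np

    PERCEPTION_THRESHOLD_N = 0.1  # assumed minimum force a hand reliably feels
    GAIN = 20.0                   # assumed amplification factor for augmented touch

    def render_3dof(wrench):
        """wrench = [Fx, Fy, Fz, Tx, Ty, Tz]; a 3-DOF display can show only the force part."""
        return np.asarray(wrench[:3], dtype=float)  # torque cues are dropped

    def augment(force):
        """Scale up faint forces so that sub-threshold contact becomes perceptible."""
        return GAIN * np.asarray(force, dtype=float)

    wrench = [0.02, 0.0, 0.04, 0.01, 0.0, 0.0]  # newtons / newton-metres at the tool tip
    force = render_3dof(wrench)
    print(np.linalg.norm(force) >= PERCEPTION_THRESHOLD_N)           # False: too faint to feel
    print(np.linalg.norm(augment(force)) >= PERCEPTION_THRESHOLD_N)  # True after amplification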

Discussion

Robotic systems may also include no-go zones, adapted from aerospace technology, to prevent the manipulators from leaving the surgical corridor.  No-go zones significantly decrease the risk of uncontrolled tool movement and exemplify the interdisciplinary, globally collaborative approach that drives technological progress in medicine.  However, there are still obstacles to the widespread adoption of robots in surgery.  Robots remain less dexterous than the human hand.  Surgeons are also faster than robots, because the delay between command and execution can impede a robot's ability to respond to a changing operative site.  Both of these problems will be addressed by developments in technology: advances in dexterity are occurring rapidly alongside improvements in haptics, and the rate of information exchange between the HMI and the robot is increasing.  As the HMI progresses, Information Age surgery will become an increasingly desirable model.  Surgical workstations featuring instantaneous access to knowledge, an immersive sensory environment, and enhanced surgical performance will improve the practice of medicine.
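One way to picture the no-go zones described above is as a geometric constraint applied to every commanded tool position.  The sketch below confines the tool tip to a cylindrical surgical corridor and clamps any command that would leave it; the cylindrical geometry and radius are assumptions for illustration, not the boundary logic of neuroArm or any particular system.

    import numpy as np

    CORRIDOR_RADIUS_MM = 10.0  # assumed radius of the permitted corridor about the z-axis

    def clamp_to_corridor(target):
        """Return the closest point to `target` that remains inside the corridor."""
        x, y, z = (float(v) for v in target)
        r = np.hypot(x, y)
        if r <= CORRIDOR_RADIUS_MM:
            return np.array([x, y, z])       # command is inside the corridor; pass through
        scale = CORRIDOR_RADIUS_MM / r       # pull the lateral component back to the wall
        return np.array([x * scale, y * scale, z])

    print(clamp_to_corridor([4.0, 3.0, 50.0]))   # inside  -> unchanged
    print(clamp_to_corridor([12.0, 9.0, 50.0]))  # outside -> clamped to [8., 6., 50.]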



References

  1. Lang MJ, Sutherland GR: Informatic surgery: the union of surgeon and machine. World Neurosurg. 2010, 74:118-120. 10.1016/j.wneu.2010.04.005
  2. Marescaux J, Leroy J, Gagner M, et al.: Transatlantic robot-assisted telesurgery. Nature. 2001, 413:379-380.
  3. Beck JM, Ma WJ, Kiani R, et al.: Probabilistic population codes for Bayesian decision making. Neuron. 2008, 60:1142-1152. 10.1016/j.neuron.2008.09.021
  4. Sutherland GR, Latour I, Greer AD, et al.: An image-guided magnetic resonance-compatible surgical robot. Neurosurgery. 2008, 62:286-292. 10.1227/01.neu.0000315996.73269.18
  5. Crowder R: Toward robots that can sense texture by touch. Science. 2006, 312:1478-1479.

Author Information

Garnette R. Sutherland (Corresponding Author)

Department of Neurosurgery, University of Calgary


Ethics Statement and Conflict of Interest Disclosures

Human subjects: Consent was obtained from all participants in this study. The University of Calgary Conjoint Health Research Ethics Board of the Faculties of Medicine, Nursing and Kinesiology, together with Health Canada approval for Class IV testing, issued approval # 210125. Animal subjects: All authors have confirmed that this study did not involve animal subjects or tissue. Conflicts of interest: In compliance with the ICMJE uniform disclosure form, all authors declare the following: Payment/services info: This work was supported by grants from the Canada Foundation for Innovation, Western Economic Diversification Canada, and Alberta Advanced Education and Technology. Financial relationships: Garnette Sutherland declares a financial relationship with IMRIS, Inc. (consultant). Intellectual property info: The author holds multiple patents for both iMRI and neuroArm technology; the rights to these patents have been transferred to IMRIS, Inc. (Minnetonka, MN, USA), which manufactures, sells, and distributes intraoperative MRI and neuroArm technology worldwide, and the author holds shares in IMRIS, Inc. Other relationships: The university and investigators receive shares, as declared above.

