"Never doubt that a small group of thoughtful, committed citizens can change the world. Indeed, it is the only thing that ever has."

Margaret Mead

Original article

Augmented Reality in Spinal Surgery: Highlights From Augmented Reality Lectures at the Emerging Technologies Annual Meetings



Background

Augmented reality (AR) is an advanced and emerging technology that has been adopted in spine surgery to enhance care and outcomes. AR superimposes a three-dimensional computer-generated image over the anatomy of interest, facilitating visualization of deep structures that cannot be seen directly.


Objective

To summarize the latest literature and highlight lectures on the development and use of augmented reality in spinal surgery presented at the annual “Spinal Navigation, Emerging Technologies and Systems Integration” meetings of the Seattle Science Foundation (SSF).


Methods

We performed a comprehensive literature review on PubMed covering 2016 to 2020 to correlate with lectures given at the annual “Emerging Technologies” conferences. After excluding papers from non-spine surgery specialties, 54 papers concerning AR in spinal applications remained. The articles were then categorized by content and focus.


Results

The 54 papers were divided into six major topics: training, proof of concept, feasibility and usability, clinical evaluation, state of technology, and nonsurgical applications. The greatest number of papers was published in 2020. The papers covered varied topics such as patient rehabilitation, proof of concept, workflow, applications in neurological and orthopedic spine surgery, and outcomes data.


Conclusion

The recent literature and SSF lectures on AR provide a solid base and demonstrate the emergence of an advanced technology whose chief advantage is that it allows the operating surgeon to focus directly on the patient rather than on a guidance screen.


Introduction

The use of navigation to perform more accurate cranial and spinal procedures extends back to the novel application of fluoroscopy and television technology to perform a stereotactic cordotomy in the 1960s [1]. Cranial neuronavigation has become the standard of care and has gained widespread usage in spine surgery. The incorporation of image guidance has been transformative in spine surgery and has increased the accuracy and safety of complex spinal procedures [2,3]. One of the earliest feasibility studies evaluating the use of augmented reality (AR) in a spine phantom demonstrated reproducibility and the usefulness of displaying projections to assist in navigating complex anatomy [4]. The concept of AR stems from the desire to enhance surgical efficacy and safety and to improve outcomes. Early clinical studies demonstrated efficacy and safety in areas of spine surgery such as intradural tumor surgery [5] and degenerative spine pathology [6]. AR uses integrated computer and camera technology to superimpose a three-dimensional computer-generated image over the anatomy of interest, facilitating procedures by highlighting structures that cannot be directly visualized in the working field. This gives the surgeon the advantage of focusing on the area of surgery rather than looking at a separate screen or projection. Interest has grown further with the recent FDA approval of the first AR system in spinal surgery, xVision (Augmedics, Chicago, IL).

The Seattle Science Foundation’s (SSF) annual “Spinal Navigation, Emerging Technologies and Systems Integration” meeting highlights emerging and advancing technologies across multiple disciplines of spinal surgery, including intraoperative navigation and robotic surgery. The conference, now in its fifth year, has historically brought together surgeons inspired to incorporate novel technologies in the operating room and surgeons reporting the progress of these technologies in their own clinical practices. The goal of the meeting is to discuss the benefits of new technologies for surgeons and patients, encourage further development of these technologies, and discuss adoption across multiple institutions. This series of highlights examines the use of augmented reality throughout the last five years of the conferences and offers insights into the future direction of these technologies.

Materials & Methods

A PubMed search was carried out with the following terms: “augmented reality spine surgery,” “augmented reality neurosurgery,” “augmented reality orthopedic surgery,” “augmented reality surgery,” and “mixed reality spine surgery.” Results were limited to 2016 through 2020 to correlate with lectures given at the SSF annual “Emerging Technologies” conferences. The search yielded 132 papers. Inclusion criteria were papers pertaining directly to neurosurgical or orthopedic spine surgery and published within the 2016-2020 time frame. After excluding papers related to non-spine surgery specialties, 54 papers concerning AR in the field of spinal surgery remained. Each article was then categorized by content and focus.

Following the literature review, the agendas of the annual “Spinal Navigation, Emerging Technologies, and Systems Integration” meetings were reviewed for lectures focused on AR. Six such lectures were given across the 2016-2020 period, ranging from introductions to the technology to direct clinical applications and outcomes using FDA-approved AR systems. The lectures were reviewed and annotated into a series of highlights documenting the growth and evolution of AR over the past five years.


Results

Of the 54 papers identified in the literature review that pertained to spine surgery, the number of AR papers published increased steadily over the queried years: 2 in 2016, 4 in 2017, 6 in 2018, 10 in 2019, and 31 in 2020 (Figure 1).

The published AR papers were assessed and categorized by primary focus into the themes of (1) Training, (2) Proof of Concept, (3) Feasibility & Usability, (4) Clinical Evaluation, (5) State of Technology, and (6) Nonsurgical Applications (Figure 2). Nonsurgical papers made up 2% of the literature review and focused on using AR to aid the rehabilitation of orthopedic patients [7]. Nine percent of the reviewed papers detailed the use of AR to enhance surgeon training [8] and to aid simulation of spinal instrumentation [9]. More recent training-focused papers were bolstered by the FDA clearance of AR platforms; these papers established a workflow for the use of AR in the operating room and compared the benefits and pitfalls of AR with traditional techniques [10]. Eleven percent of the reviewed papers were categorized as State of Technology; these provided an assessment of the current technological environment of AR and offered future directions for creating a compact, versatile, portable AR system that can be applied broadly in neurosurgery and orthopedic surgery [11]. Twenty-one percent of the results were proof-of-concept studies, including testing efficacy in cadaveric models [12-14] and comparing the efficacy of AR-assisted pedicle screw instrumentation with navigation-assisted or freehand techniques [15,16]. Feasibility/usability studies, defined as investigations into the practicality, strengths, and weaknesses of AR in the operating room (OR), accounted for 26% of the query; these studies centered on ergonomics, OR footprint, and accuracy of instrumentation while using AR platforms [12,17-19]. The remaining 30%, the majority published in 2020 (after FDA clearance of the xVision AR system), emphasized surgeons’ direct clinical experience with AR and surgical outcomes [5,6,18-20].


Discussion

AR development

With the advent of intraoperative navigational technology, the accuracy of pedicle screw placement has increased compared with traditional freehand techniques [21]. Augmented reality, defined as a computer-generated image superimposed onto a real-world field of view (FOV), aims to improve upon existing surgical technologies. As AR continues to gain popularity in surgery, preliminary data have begun to show that it is equivalent to traditional non-AR operative methods at a lower cost [22].

The early years of AR

AR, originally adapted from military fighter pilot displays, allowed computer-generated images to be superimposed onto the operator’s field of view [4]. In 2016, Dr. Kris Siemionow reviewed this emerging technology and the foothold it was finding in spinal surgery. His review discussed the limitations of AR intraoperative navigation, including line-of-sight disruptions between the navigation screen and the operative field, the unnatural eye-hand coordination that must be developed to use navigation successfully, the disruptive workflows of different tools, and the overall limitation of applying 2-D images to 3-D anatomy.

Within spinal surgery, AR has been shown to improve surgeons’ accuracy, precision, and confidence while decreasing implant time and tissue dissection [17,23]. In 2016, several AR systems were being prototyped as proofs of concept for adaptation into spine surgery. Google Glass (Google, Mountain View, CA) was among the first generation of AR lenses; Yoon et al. showed that although the system was reviewed favorably by surgeons, operative time did not change, and the limited FOV, lack of image overlay, and head tracking had a nauseating effect on surgeons [23]. The “projector approach” to AR, in which an image was projected onto the patient’s body to visualize underlying anatomy, had limitations as well: projecting 2-D images onto a 3-D surface creates line-of-sight (LOS) issues obstructing the image, and the effect of the surgeon’s positioning on the view has been well reported [23]. The “reflective mirror” technique, in which a computer-generated overlay was displayed on reflective glass above the operative field, was also in development. This technology was promising because of its accurate anatomical localization in three dimensions and tracking of the surgeon’s movements, but the system was deemed too burdensome to introduce into an OR. Chen et al. adapted the reflective mirror technique into a head-mounted system, but the result was an unsuccessful multi-unit system whose multiple sensors, calibration steps, cameras, and tools summed to a large navigational error, a lag effect that nauseated the surgeon, difficult calibration, and a limited focal length while operating [24]. The “tablet-based” approach, in which a tablet camera displayed AR-pertinent information, was flawed by the lack of 3-D imaging, the tablet display detracting from the operative field, and the difficulty of manipulating instruments while viewing through the tablet.
Finally, the HoloLens (Microsoft Corp., Redmond, WA), although capable of 3-D imaging and head tracking, was limited in navigational accuracy, processing power, FOV, and comfort [25]. Dr. Siemionow concluded that during these early developmental years the technology was promising, but AR was not yet a viable tool for surgeons’ regular repertoires.

Recent developments

In 2018, Dr. Camilo Molina discussed a comprehensive AR system that had shown promising results. He highlighted the xVision AR headset, which offered multiple features: a tracking camera mounted directly on the headset (to minimize LOS interruptions), a wireless battery-operated design, an independent navigational system, and a series of tool mounts that allowed xVision to remain agnostic to spinal instrumentation systems from other manufacturers (Figure 3). Preliminary cadaveric data were collected using xVision in five cadaver torsos instrumented from T6 to L5, with breach rates graded by the extent and direction of pedicle screw breach. The study showed a thoracic screw accuracy rate of 97.1% and a lumbar screw accuracy rate of 96.6% [12]. Compared with the reported accuracies of 89% for traditional freehand screws and 96.6% for manually navigated screws, the xVision system demonstrated equivalent or superior performance [19,20]. The xVision AR system had begun initial clinical use in Israel and was under review by the FDA in the United States.

At the 2019 conference, Dr. Timothy Witham updated the spine community on developments within the field of AR. At the time of this talk, the FDA had not yet approved the xVision system for commercial use. Dr. Witham began with the Carl et al. study, which had shown the use of AR to superimpose bony landmarks or tumor outlines in the operating microscope [26]. It was concluded that AR-assisted microscopic surgery successfully augmented the surgeon’s ability to localize anatomy and displayed multiple modes of information to the surgeon [26]. He also referenced Elmi-Terander et al., who compared the accuracy of pedicle screw placement with a proprietary AR headset against the freehand technique [27]. They concluded that AR significantly increased precision in screw placement while decreasing breach rates [27]. A discussion followed on how surgeons had favorably reviewed the ease of use and intuitiveness of AR hardware, suggesting that the learning curve for AR systems would be shorter than for traditional navigation [12]. Additional studies with the xVision system included nine clinical cases in Israel and a cost analysis suggesting AR as a more affordable alternative to traditional navigation.

xVision received FDA clearance for clinical use in 2020, and Dr. James Lynch spoke of his experience fitting AR into his practice. Dr. Lynch was the first private-practice surgeon to use AR in a community hospital setting in the U.S., with the first documented case on June 25, 2020. The workflow, briefly discussed, was similar in setup to spinal navigation, with a small ergonomic footprint. At the time of his “Emerging Technologies” lecture, Dr. Lynch had performed over 50 AR-assisted cases and demonstrated that AR had led to accurate pedicle screw placement across a variety of instrumentation systems. In his experience, the learning curve was straightforward, with proficiency attained after approximately five cases. Given the small ergonomic footprint and relatively affordable cost, he concluded that the xVision AR system would best find a place in surgery centers where traditional navigation may be too bulky or cost-prohibitive.

The learning curve

With any new technology, the speed of adoption is limited by novelty and by the training required to become competent and integrate the system into everyday practice. Dr. Timur Urakov addressed this process at the 2020 “Emerging Technologies” conference, discussing how AR could serve as an adjunct to existing technologies rather than as an entirely new system workflow (Figure 4). At the University of Miami, Dr. Urakov compared the accuracy of pedicle screw placement in a cadaveric model instrumented from T1 to the pelvis, with one side instrumented under fluoroscopy and the contralateral side with AR assistance. Of the 38 screws placed, the fluoroscopy side had no breaches, while the AR side had three major medial breaches and four major inferior breaches [14]. The study showed the limitations of current AR systems and suggested that the technology should not be used independently but as an adjunct to existing technology. His group then demonstrated this concept by 3D-printing adaptors for multiple instrumentation systems and using them to place screws in sawbones models. Their preliminary data showed only two low-grade pedicle breaches across 60 screws, with the improved accuracy attributed to the addition of instrument tracking to the AR system.

The adoption of AR is also more intuitive for surgeons who already use spinal navigation, as the two share several visual similarities. With spinal navigation, the surgeon views the patient’s bony anatomy and instrumentation on a computer workstation screen, typically placed at the head or foot of the bed, away from the patient. AR displays are heads-up displays (HUDs) that present the same information directly in the surgeon’s headset. Dr. Urakov demonstrated this in a reported endoscopic lumbar discectomy in which the endoscopic images were displayed directly in the surgeon’s lenses [28]. With a direct line of sight to navigation and imaging, surgeons can also apply this technology to more complex cases. Dr. Urakov demonstrated a single-stage lateral/posterior surgery using the Medtronic Stealth navigation system with its images projected to an AR headset (Figure 4). This allowed the surgeon to focus fully on the operative field instead of repeatedly shifting focus to a distant screen during the case. He concluded that surgeons could maximize workflow and efficiency by using AR-assisted HUDs to supervise an assistant’s navigation trajectories while working on exposure for a second stage or on instrumentation.

Application outside of the OR

Both Drs. Lynch and Urakov emphasized that although AR is promising in the OR, it need not apply only to the spine surgeon placing instrumentation. There has been a growing push in health literacy to use tools such as AR to help patients fully understand the details of their care [29]. In nephrology and urology, 3D visualization using AR has already produced a higher level of patient understanding of tumor location, size, and surgical options [30]. Dr. Urakov has been using AR in his practice to show patients their anatomy and discuss the details of their cases. Given the intuitiveness and ease of use of most AR systems, the approach has been very popular among patients, who gain a better understanding, in 3D space, of their pathology and the indicated surgical procedures. At the time of his 2020 presentation, Dr. Urakov, with the help of the University of Miami and Magic Leap™, was working on a patient consultation platform that would convey information primarily through AR. Simulation and education are further areas in which AR could help train the next generation of surgeons: Stefan et al. discuss surgical staff training with traditional and AR-based simulations as the next step forward in medical education [9]. At several residency training centers around the country, surgical residents have been using AR-projected HUDs to develop and master their surgical coordination and technique.

Because of its reduced cost compared with navigation, AR may also play a role for surgeons operating in rural parts of the U.S. or on mission trips in resource-limited countries. Traditional spinal navigation requires its computers, cameras, and sensors to be calibrated and present in the operating room to function properly, which becomes a feasibility issue when sending these systems to areas with limited infrastructure. With further development of AR, a surgeon would simply need to take the AR headset, integrate any imaging modality, and use the cross-compatibility seen in multiple AR systems to deliver precise spinal surgery anywhere in the world.

Future directions 

In 2018, Dr. Molina suggested that AR would be used for minimally invasive applications; by 2020, direct clinical application in Dr. Lynch’s practice had demonstrated that the system lent itself extremely well to accurate pedicle screw placement in minimally invasive surgery (MIS) cases.

Dr. Molina also expected AR technology to continue advancing to the point where it would be used in complex cases such as atrophic/dysplastic bony anatomy, scoliosis with significant coronal and rotational deformities, tumor resection, transarticular screws, and even cranial neurosurgery (Figure 5). Dr. Juan Uribe went one step further in his 2020 lecture, describing AR as a customizable screen for whatever data the surgeon requires. Moving beyond AR, Dr. Uribe stated that “enhanced reality,” with real-time improvement and modification of surgical images while operating, would soon be on the horizon. He also estimated that the AR market would triple by 2022, with the pioneering xVision system competing against multiple other AR platforms for a spot in future ORs.


Conclusions

AR is an emerging technology that has developed rapidly and continues to evolve. Its key advantage over robotic and navigated spine surgery is that the surgeon never has to take their focus off the patient; superimposing images directly onto the surgical field offers immediate safety and procedural advantages. As the technology matures, AR should be considered by surgeons as a tool parallel to spinal navigation.


References

  1. Fox JL, Green RC: Stereotaxic surgery using a television guidance system. II. Percutaneous cordotomy. Acta Neurochir. 1969, 21:31-42. 10.1007/BF01405208
  2. Hanna G, Kim TT, Uddin SA, Ross L, Johnson JP: Video-assisted thoracoscopic image-guided spine surgery: evolution of 19 years of experience, from endoscopy to fully integrated 3D navigation. Neurosurg Focus. 2021, 50:E8. 10.3171/2020.10.FOCUS20792
  3. Johnson JP, Drazin D, King WA, Kim TT: Image-guided navigation and video-assisted thoracoscopic spine surgery: the second generation. Neurosurg Focus. 2014, 36:E8. 10.3171/2014.1.FOCUS13532
  4. Weiss CR, Marker DR, Fischer GS, Fichtinger G, Machado AJ, Carrino JA: Augmented reality visualization using Image-Overlay for MR-guided interventions: system description, feasibility, and initial evaluation in a spine phantom. AJR Am J Roentgenol. 2011, 196:W305-7. 10.2214/AJR.10.5038
  5. Carl B, Bopp M, Saß B, Pojskic M, Nimsky C: Augmented reality in intradural spinal tumor surgery. Acta Neurochir. 2019, 161:2181-93. 10.1007/s00701-019-04005-0
  6. Carl B, Bopp M, Saß B, Voellger B, Nimsky C: Implementation of augmented reality support in spine surgery. Eur Spine J. 2019, 28:1697-711. 10.1007/s00586-019-05969-4
  7. Berton A, Longo UG, Candela V, et al.: Virtual reality, augmented reality, gamification, and telerehabilitation: psychological impact on orthopedic patients' rehabilitation. J Clin Med. 2020, 9:10.3390/jcm9082567
  8. Pfandler M, Lazarovici M, Stefan P, Wucherer P, Weigl M: Virtual reality-based simulators for spine surgery: a systematic review. Spine J. 2017, 17:1352-63. 10.1016/j.spinee.2017.05.016
  9. Stefan P, Pfandler M, Wucherer P, et al.: Team training and assessment in mixed reality-based simulated operating room: current state of research in the field of simulation in spine surgery exemplified by the ATMEOS project. Unfallchirurg. 2018, 121:271-7. 10.1007/s00113-018-0467-x
  10. Cho J, Rahimpour S, Cutler A, Goodwin CR, Lad SP, Codd P: Enhancing reality: a systematic review of augmented reality in neuronavigation and education. World Neurosurg. 2020, 139:186-95. 10.1016/j.wneu.2020.04.043
  11. Hussain I, Cosar M, Kirnaz S, Schmidt FA, Wipplinger C, Wong T, Härtl R: Evolving navigation, robotics, and augmented reality in minimally invasive spine surgery. Global Spine J. 2020, 10:22S-33S. 10.1177/2192568220907896
  12. Molina CA, Phillips FM, Colman MW, et al.: A cadaveric precision and accuracy analysis of augmented reality-mediated percutaneous pedicle implant insertion. J Neurosurg Spine. 2020, 1-9. 10.3171/2020.6.SPINE20370
  13. Molina CA, Theodore N, Ahmed AK, et al.: Augmented reality-assisted pedicle screw insertion: a cadaveric proof-of-concept study. J Neurosurg Spine. 2019, 1-8. 10.3171/2018.12.SPINE181142
  14. Urakov TM, Wang MY, Levi AD: Workflow caveats in augmented reality-assisted pedicle instrumentation: cadaver lab. World Neurosurg. 2019, 126:e1449-55. 10.1016/j.wneu.2019.03.118
  15. Buch VP, Mensah-Brown KG, Germi JW, et al.: Development of an intraoperative pipeline for holographic mixed reality visualization during spinal fusion surgery. Surg Innov. 2021, 28:427-37. 10.1177/1553350620984339
  16. Drouin S, Kochanowska A, Kersten-Oertel M, et al.: IBIS: an OR ready open-source platform for image-guided neurosurgery. Int J Comput Assist Radiol Surg. 2017, 12:363-78. 10.1007/s11548-016-1478-0
  17. Elmi-Terander A, Burström G, Nachabe R, et al.: Pedicle screw placement using augmented reality surgical navigation with intraoperative 3D imaging: a First In-Human Prospective Cohort Study. Spine. 2019, 44:517-25. 10.1097/BRS.0000000000002876
  18. Edström E, Burström G, Nachabe R, Gerdhem P, Elmi Terander A: A novel augmented-reality-based surgical navigation system for spine surgery in a hybrid operating room: design, workflow, and clinical applications. Oper Neurosurg. 2020, 18:496-502. 10.1093/ons/opz236
  19. Aoude AA, Fortin M, Figueiredo R, Jarzem P, Ouellet J, Weber MH: Methods to determine pedicle screw placement accuracy in spine surgery: a systematic review. Eur Spine J. 2015, 24:990-1004. 10.1007/s00586-015-3853-x
  20. Burström G, Nachabe R, Homan R, et al.: Frameless patient tracking with adhesive optical skin markers for augmented reality surgical navigation in spine surgery. Spine. 2020, 45:1598-604. 10.1097/BRS.0000000000003628
  21. Elmi-Terander A, Skulason H, Söderman M, et al.: Surgical navigation technology based on augmented reality and integrated 3D intraoperative imaging: a spine cadaveric feasibility and accuracy study. Spine. 2016, 41:E1303-11. 10.1097/BRS.0000000000001830
  22. Vávra P, Roman J, Zonča P, et al.: Recent development of augmented reality in surgery: a review. J Healthc Eng. 2017, 2017:4574172. 10.1155/2017/4574172
  23. Yoon JW, Chen RE, Han PK, Si P, Freeman WD, Pirris SM: Technical feasibility and safety of an intraoperative head-up display device during spine instrumentation. Int J Med Robot. 2017, 13:10.1002/rcs.1770
  24. Chen X, Xu L, Wang Y, et al.: Development of a surgical navigation system based on augmented reality using an optical see-through head-mounted display. J Biomed Inform. 2015, 55:124-31. 10.1016/j.jbi.2015.04.003
  25. Deib G, Johnson A, Unberath M, et al.: Image guided percutaneous spine procedures using an optical see-through head mounted display: proof of concept and rationale. J Neurointerv Surg. 2018, 10:1187-91. 10.1136/neurintsurg-2017-013649
  26. Carl B, Bopp M, Saß B, Pojskic M, Voellger B, Nimsky C: Spine surgery supported by augmented reality. Global Spine J. 2020, 10:41S-55S. 10.1177/2192568219868217
  27. Elmi-Terander A, Burström G, Nachabé R, et al.: Augmented reality navigation with intraoperative 3D imaging vs fluoroscopy-assisted free-hand surgery for spine fixation surgery: a matched-control study comparing accuracy. Sci Rep. 2020, 10:707. 10.1038/s41598-020-57693-5
  28. Liounakos JI, Urakov T, Wang MY: Head-up display assisted endoscopic lumbar discectomy-A technical note. Int J Med Robot. 2020, 16:e2089. 10.1002/rcs.2089
  29. Adapa K, Jain S, Kanwar R, Zaman T, Taneja T, Walker J, Mazur L: Augmented reality in patient education and health literacy: a scoping review protocol. BMJ Open. 2020, 10:e038416. 10.1136/bmjopen-2020-038416
  30. Wake N, Rosenkrantz AB, Huang R, et al.: Patient-specific 3D printed and augmented reality kidney and prostate cancer models: impact on patient education. 3D Print Med. 2019, 5:4. 10.1186/s41205-019-0041-3


Author Information

Syed-Abdullah Uddin

School of Medicine, University of California Riverside, Riverside, USA

George Hanna

Neurosurgery, Cedars-Sinai Spine Center, Los Angeles, USA

Lindsey Ross

Neurology and Neurosurgery, Cedars-Sinai Medical Center, Los Angeles, USA

Camilo Molina

Neurological Surgery, Washington University School of Medicine, St. Louis, USA

Timur Urakov

Neurological Surgery, University of Miami, Miami, USA

Patrick Johnson

Neurological Surgery, Cedars-Sinai Medical Center, Los Angeles, USA

Terrence Kim

Orthopedic Surgery, Cedars-Sinai Medical Center, Los Angeles, USA

Doniel Drazin (Corresponding Author)

Medicine, Pacific Northwest University of Health Sciences, Yakima, USA

Ethics Statement and Conflict of Interest Disclosures

Human subjects: Consent was obtained or waived by all participants in this study. Animal subjects: All authors have confirmed that this study did not involve animal subjects or tissue. Conflicts of interest: In compliance with the ICMJE uniform disclosure form, all authors declare the following: Payment/services info: All authors have declared that no financial support was received from any organization for the submitted work. Financial relationships: Terrence Kim M.D., Timur Urakov M.D. declare(s) personal fees from Medtronic. Consultants. Terrence Kim M.D., Timur Urakov M.D. declare(s) personal fees from DePuy Synthes. Consultant. Camilo Molina M.D. declare(s) personal fees from Augmedics. Consultant . Other relationships: All authors have declared that there are no other relationships or activities that could appear to have influenced the submitted work.


Acknowledgements

We would like to acknowledge the Seattle Science Foundation staff, who have developed a strong academic presence and platform that has facilitated our development and dissemination of technological advances in spine surgery.
