With the introduction of new wearables designed for intraoperative use, such as Microsoft's HoloLens head-up display, interest in bringing augmented reality into surgical procedures has grown considerably in recent years. At the North American Spine Society (NASS) annual meeting (24–29 September, Los Angeles, USA), Jang Yoon (University of Miami Miller School of Medicine, Miami, USA) enthused about the future of wearable technology in spine surgery, arguing that it would transform current practice.
“I felt like a kid in a candy store or a nerd in an electronics store while strolling through the exhibit hall at NASS,” he wrote in a recent editorial.
Yoon spoke about his journey into augmented reality through his interest in wearable computing devices and their application in spinal navigation. He noted that current spine navigation technology remains screen-based, and described its drawback: “If you are placing a pedicle screw, you [spine surgeons] get the perfect starting point anatomically, but you take your eyes off to look at the screen and the instrument slips.”
He elaborated, “In surgical procedures, especially in procedures where image guidance is utilised, the surgeons have to look back and forth between the surgical site (patient) and the navigation screen, which is situated away from the patient. This creates an uncomfortable and disconnected workflow for the surgeons. In order for surgeons to use navigation information, they have to look away and take their attention away from the surgical procedure. This happens in cases where fluoroscopy is used as well. To solve these distraction issues during the surgical procedures, there has been much interest in utilising head-up displays to display information of interest directly in the line of surgeon’s sight. The use of head-up displays eliminates a break in surgeon’s sight to check imaging during surgical procedures.”
To address this, Yoon and colleagues set out to create a device that could present the same navigation information broadcast to a phone or tablet. His company, MedCyclops, has developed a device that connects to a navigation system and can capture and transfer these images to a head-up display, such as Google Glass. Following some software modifications, Google Glass is able to stream the video in real time, with a latency of less than 0.4 seconds. The streaming software is initiated by voice alone, meaning the surgeon does not have to break sterility. Yoon added that this allows the surgeon to remain focused during procedures.
According to Yoon, screw placement was 15% faster when using the wearable device compared with open navigation, although he acknowledged a notable amount of bias in this finding.
Furthermore, Yoon pointed to a particular limitation of the glasses: because the screen is very small and transparent, it will require further development. However, addressing the NASS audience on what the future holds for this technology, Yoon said: “There is a lot of interest in applying holograms and actually overlaying the navigation information onto the patient themselves.
“This is just a glimpse of what can be possible in the future. For neurosurgery and spine surgery, sub-millimetre accuracy is required,” Yoon continued, explaining that it first starts with making “anatomically super-accurate maps” by implementing a machine learning algorithm to create holograms based on 2D MRIs.
Reflecting on where the technology stands now, Yoon concluded that augmented reality may play a role in the next generation of navigation, with advances in the gaming industry likely driving its evolution. “Eventually, it will cross over into the surgical field and disrupt this landscape,” he said.