Image: The 3D zSpace display with stylus interaction (Photo courtesy of Harish Mandalika/ the University of Canterbury).
A new study describes a radiology display interface that provides simultaneous two-dimensional (2D) and three-dimensional (3D) visualization of medical images.
Developed at the University of Canterbury (Christchurch, New Zealand), the University of Otago (Christchurch, New Zealand), and other institutions, the 2D/3D desktop virtual reality hybrid user interface aims to improve the 3D manipulation required in some radiologic diagnostic tasks. The system combines a zSpace (Sunnyvale, CA, USA) AIO stereoscopic virtual reality device with a standard 2D display, mouse, and keyboard, all connected to a single workstation computer.
The 3D component includes a stereoscopic display embedded with motion-tracking cameras, polarized glasses, and a 3D stylus. The user first opens an image dataset in the 2D part of the interface. Once the dataset is loaded, a marching cubes algorithm automatically extracts a 3D model from the images in real time and displays it on the 3D interface of the virtual reality device. Using the stylus, the model can then be rotated, annotated, measured, and marked for regions of interest. The position and orientation of the object in the 2D and 3D displays are synchronized.
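The synchronization described above can be pictured as both views sharing one pose (position and orientation) for the model. The sketch below is a minimal, hypothetical illustration of that idea, not the authors' actual implementation: a shared pose object notifies every subscribed view whenever the stylus (or mouse) changes the transform, so the 2D and 3D renderings never drift apart. All class and function names here are invented for illustration.

```python
import numpy as np


class SharedPose:
    """Hypothetical shared model pose: the 2D and 3D views both read
    from and write to one transform, keeping them synchronized.
    (A sketch only; not the system described in the study.)"""

    def __init__(self):
        self.rotation = np.eye(3)    # 3x3 rotation matrix
        self.position = np.zeros(3)  # translation in model space
        self._listeners = []         # view callbacks to notify on change

    def subscribe(self, callback):
        """Register a view; it is called with (rotation, position)."""
        self._listeners.append(callback)

    def rotate(self, axis, angle_rad):
        """Apply a rotation about a unit axis (Rodrigues' formula),
        then push the new pose to every subscribed view."""
        axis = np.asarray(axis, dtype=float)
        axis /= np.linalg.norm(axis)
        k = np.array([[0.0, -axis[2], axis[1]],
                      [axis[2], 0.0, -axis[0]],
                      [-axis[1], axis[0], 0.0]])
        r = (np.eye(3) + np.sin(angle_rad) * k
             + (1.0 - np.cos(angle_rad)) * (k @ k))
        self.rotation = r @ self.rotation
        self._notify()

    def _notify(self):
        for callback in self._listeners:
            callback(self.rotation, self.position)


if __name__ == "__main__":
    pose = SharedPose()
    # Stand-ins for the 2D slice viewer and the 3D stereoscopic view.
    pose.subscribe(lambda r, p: print("2D view updated"))
    pose.subscribe(lambda r, p: print("3D view updated"))
    # A stylus gesture rotates the model; both views are notified.
    pose.rotate([0, 0, 1], np.pi / 2)
```

Because every interaction, whether it originates from the stylus on the 3D display or the mouse on the 2D display, goes through the same shared state, each view always renders the same orientation of the model.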
An evaluation of the system by medical students and radiology residents who examined CT scans on the hybrid interface showed that they were able to diagnose scoliosis more accurately than when using a 2D or 3D interface alone. Compared to traditional 2D interfaces, the 2D/3D hybrid interface was also more efficient for novice users and more accurate for both novice and experienced users; moreover, the diagnostic accuracy of the medical students improved to match that of the residents. The study was published in the February 2018 issue of the Journal of Digital Imaging.
“The interface can offer significant advantages to a subset of diagnostic radiology tasks that rely on some level of 3D manipulation, such as measuring the angle/displacement of broken bone segments, measuring brain aneurysms, etc.,” said lead author graduate student Harish Mandalika, MSc, of the Human Interface Technology Lab (HITL) at the University of Canterbury. “I also believe that the hybrid interface would largely benefit novice users and can be used as a learning/training tool.”