Presentation 1: SterileAR: Evaluation of Augmented Reality and Computer Vision Approaches for Real-Time Feedback in Sterile Compounding Training (Full Paper #50)
Authors: Dmitriy Babichenko, Ravi Patel, Lorin Grieve, Patrick Healy, Eliza Littleton, Nicole Donnellan, Edward Andrews and Stephen Canton
This paper presents preliminary work on the development of SterileAR, an augmented reality platform for providing real-time feedback to pharmacy students during training in sterile pharmaceutical compounding procedures. Simulation pedagogy requires that the learner receive immediate feedback in order to recognize procedural errors and internalize the learning objectives. Instructors and task trainers are often used in tandem to give novices deliberate practice in specialized clinical procedures. These methods, however, are limited by scarce institutional resources and by the limited adaptability of existing training systems. In this work we explore the possibility of augmenting sterile compounding training with real-time feedback using augmented reality (AR), machine learning, and computer vision technologies. We present our approaches to developing SterileAR across four iterations, describing specific successes and failures, as well as methodologies for capturing hand and object spatial position and motion data using a headset-mounted mobile phone, a webcam, and an infrared camera. We also describe training machine learning models for object recognition and tracking, and using these models to provide learners with real-time feedback.
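The real-time feedback loop described above (tracked hand and object positions checked against procedural rules) might be sketched roughly as follows. This is a minimal illustration, not the paper's implementation: the sterile-zone rectangle, the class labels, and the `check_frame` helper are all hypothetical.

```python
# Hypothetical sketch: flag frames in which a tracked non-sterile object
# (e.g., a bare hand) enters a defined sterile work zone. The zone geometry
# and object classes are illustrative, not taken from the SterileAR paper.

STERILE_ZONE = (100, 100, 400, 300)  # (x_min, y_min, x_max, y_max) in pixels
NON_STERILE = {"bare_hand"}          # object classes that must stay outside

def in_zone(box, zone):
    """True if the detection box (x_min, y_min, x_max, y_max) overlaps the zone."""
    bx0, by0, bx1, by1 = box
    zx0, zy0, zx1, zy1 = zone
    return bx0 < zx1 and bx1 > zx0 and by0 < zy1 and by1 > zy0

def check_frame(detections):
    """detections: list of (class_label, box) pairs from an object detector.
    Returns feedback messages for any rule violations in this frame."""
    return [
        f"Warning: {label} entered the sterile zone"
        for label, box in detections
        if label in NON_STERILE and in_zone(box, STERILE_ZONE)
    ]

# Example frame: one compliant vial outside the zone, one violating hand.
detections = [("vial", (420, 120, 460, 180)), ("bare_hand", (150, 150, 220, 230))]
print(check_frame(detections))  # one warning for the bare hand
```

In a real-time system, `detections` would be produced per frame by the trained object-recognition models the abstract mentions, and the returned messages would drive the AR feedback overlay.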
Presentation 2: Magnetic Resonance Imaging Visualization in Fully Immersive Virtual Reality (Short Paper #24)
Authors: Hubert Cecotti, Quentin Richard, Joseph Gravellier and Michael Callaghan
The availability of commercial fully immersive virtual reality systems enables new applications that offer novel ways to visualize and interact with multidimensional neuroimaging data. We propose a system for visualizing and interacting with MRI scans in a fully immersive virtual reality environment. The system extracts the individual slices from a DICOM file and presents them in a 3D environment where the user can display and rotate the MRI scan and select the clipping plane in any orientation. The 3D environment includes two parts: 1) a cube that displays the MRI scan in 3D, and 2) three panels corresponding to the three views, axial, sagittal, and coronal, where it is possible to directly access a desired slice. In addition, the environment includes a representation of the brain through which the slices can be accessed and browsed directly with the controller. This application can be used both as an immersive learning tool for educational purposes and by neuroscience researchers as a more convenient way to browse through an MRI scan and better analyze 3D data.
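The three views the abstract names correspond to slicing a volumetric image along each of its three axes. A minimal sketch of that indexing, assuming the per-slice DICOM pixel data has already been stacked into a NumPy volume indexed (axial slice, row, column); the volume shape here is illustrative:

```python
import numpy as np

# Assume the DICOM slices have been stacked into a 3D volume indexed
# (axial slice, row, column). The shape below is illustrative only.
volume = np.zeros((60, 256, 256), dtype=np.int16)

def axial(vol, k):
    """Slice perpendicular to the head-foot axis (one original DICOM slice)."""
    return vol[k, :, :]

def coronal(vol, j):
    """Slice perpendicular to the front-back axis."""
    return vol[:, j, :]

def sagittal(vol, i):
    """Slice perpendicular to the left-right axis."""
    return vol[:, :, i]

print(axial(volume, 30).shape)     # (256, 256)
print(coronal(volume, 128).shape)  # (60, 256)
print(sagittal(volume, 128).shape) # (60, 256)
```

An arbitrary clipping plane, as described in the abstract, generalizes this axis-aligned indexing to resampling the volume along a plane with any orientation.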