Aviv Elor

BioLumin: An Immersive Mixed Reality Experience for Interactive Microscopic Visualization

What if we could let users shrink down to the microscopic level, walk around cells, and analyze them for research discovery? This project introduces BioLumin, an immersive mixed reality experience that enables interactive visualization and annotation of 3D-reconstructed microscopic images. The project explores the implications of immersive extended reality for research discovery, the design of a pipeline for autonomous reconstruction of microscopic data into mixed reality environments, and a 3DUI interaction system for transforming and annotating biomedical data. The prototype was evaluated through a multi-stage mixed-methods study with four experts and nineteen non-expert users to assess the usability and feasibility of BioLumin. Results suggest that experiences like BioLumin may substantially accelerate the analysis of 3D microscopic models for users of all expertise levels.


Demo Video:


Motivation:

The mission of the National Center for Advancing Translational Sciences (NCATS) is to accelerate the discovery of new healthcare interventions for treating sickness and disease. Computational hardware has never been more powerful or more affordable than it is today. Conversely, Eroom's law observes the opposite trend in medicine: the discovery of new clinical interventions is slowing down and growing more expensive over time [Hall et al]. Newly affordable emerging technologies such as extended reality may accelerate the discovery of new treatments and help break Eroom's law.


Extended reality and immersive experiences have demonstrably improved perception, intuition, and retention of information when working with big data [Donalek et al]. A flexible virtual world opens up a variety of new input modalities through which users can manipulate and experience a visualization. Research has shown that extended reality provides a robust environment for collaborative problem solving, informative learning, and cognitive engagement [Dunleavy et al].


Emerging game engines such as Unity3D are becoming ever more flexible in enabling visualization and interaction for industries such as construction, entertainment, government, and healthcare, with the ability to target nearly any operating system and software stack [Unity]. Research scientists have reported that a flexible, interactive game engine such as Unity is a powerful tool for tackling biomolecular visualization challenges [Lv et al]. There is therefore clear potential in leveraging the immersive capabilities of extended reality to speed up the visualization and analysis of biological data.



Goals:

While much work has explored extended reality for biological visualization, existing experiences rarely (1) enable user annotation and data manipulation beyond visual transformation, (2) leverage the immersive capabilities of augmented reality, or (3) allow cross-platform use across different mediums of extended reality. Analyzing biological data in the virtual world benefits from letting users annotate data through the unique input modalities that extended reality devices provide. Augmented reality, which extends virtual content into the physical world through spatial tracking and occlusive virtual placement, has the potential to integrate fully into the lab workflow but remains underexplored in past research. Finally, enabling cross-platform communication between extended reality and flexible tools such as WebGL and MATLAB will strengthen both collaborative research analysis and data review. Allowing researchers to annotate data in the 3D virtual world may also help train smarter, faster machine learning models that minimize the need for human intervention to correct data.


This project aims to leverage the immersive capabilities of the Magic Leap One spatial computing device [Magic Leap] to accelerate the analysis and discovery of complex biological data. Specifically, the goals of this study are three-fold:

  • 1. Interaction: Create a mixed reality platform that enables the annotation, transformation, and visualization of biological data through intuitive input modalities.

    • 1a. The ability to manipulate biological data through placement, translation, rotation, and scaling (see the first sketch after this list).

    • 1b. The ability to annotate data by labeling, scaling, inserting, deleting, and confirming regions of interest within a reconstructed surface model (see the annotation sketch after this list).

    • 1c. The flexibility to interact with the virtual world through custom controllers, hand gestures, gaze position, eye tracking, and traditional computer-based input modalities.

  • 2. Data Acquisition: Process biological data so that it can be manipulated within the virtual world.

    • 2a. An autonomous pipeline to convert nuclear imaging and raw biological data into 3D manipulable surface models for Unity.

    • 2b. An autonomous pipeline to convert user annotations of surface models into a universal format for other visualization platforms and raw data analysis (the annotation sketch below illustrates one such export).

  • 3. Collaboration and Flexibility: Enable cross-platform capabilities for collaborative and flexible virtual interaction.

    • 3a. A web-based Unity WebGL instance for collaborative viewing, analysis, and review of biological annotations.
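
To make goal 1a concrete, below is a minimal, platform-agnostic Unity C# sketch of the transform manipulation layer. This is a sketch under assumptions, not the shipped BioLumin code: the ModelManipulator name and the speed fields are hypothetical, and the Magic Leap controller bindings are omitted, since any input source that supplies direction vectors and scalar deltas could drive these methods.

```csharp
using UnityEngine;

// Hypothetical sketch: applies placement, translation, rotation, and scale
// to a reconstructed surface model. Input bindings (e.g., the Magic Leap
// controller) are assumed to call these methods with per-frame values.
public class ModelManipulator : MonoBehaviour
{
    [SerializeField] private float translateSpeed = 0.5f; // meters per second
    [SerializeField] private float rotateSpeed = 45f;     // degrees per second
    [SerializeField] private float minScale = 0.01f;
    [SerializeField] private float maxScale = 100f;

    // Place the model at a world-space point, e.g., a raycast hit on a surface.
    public void Place(Vector3 worldPosition) => transform.position = worldPosition;

    // Translate along a normalized direction (e.g., a touchpad axis).
    public void Translate(Vector3 direction) =>
        transform.Translate(direction * translateSpeed * Time.deltaTime, Space.World);

    // Rotate about the world up axis by a signed input value in [-1, 1].
    public void Rotate(float axisValue) =>
        transform.Rotate(Vector3.up, axisValue * rotateSpeed * Time.deltaTime, Space.World);

    // Uniformly scale by a multiplicative factor, clamped to a sane range.
    public void Scale(float factor)
    {
        float next = Mathf.Clamp(transform.localScale.x * factor, minScale, maxScale);
        transform.localScale = Vector3.one * next;
    }
}
```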
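
Goals 1b and 2b together suggest a serializable annotation record that can round-trip out of the headset. The following is a hedged sketch using Unity's built-in JsonUtility; the RoiAnnotation and AnnotationSet types and their fields are assumptions rather than the project's actual schema, with JSON standing in for the "universal format" the pipeline targets (MATLAB can read it with jsondecode, and a WebGL viewer can parse it natively).

```csharp
using System;
using System.Collections.Generic;
using UnityEngine;

// Hypothetical annotation record for one region of interest (ROI).
[Serializable]
public class RoiAnnotation
{
    public string label;      // user-assigned label, e.g., "nucleus"
    public Vector3 position;  // center of the region in the model's local space
    public float radius;      // user-scaled extent of the region
    public bool confirmed;    // whether the user confirmed this ROI
}

// JsonUtility cannot serialize a bare list, so annotations are wrapped
// in a container object before export.
[Serializable]
public class AnnotationSet
{
    public string modelId;
    public List<RoiAnnotation> annotations = new List<RoiAnnotation>();
}

public static class AnnotationExporter
{
    // Serialize the set to pretty-printed JSON for downstream tools.
    public static string ToJson(AnnotationSet set) =>
        JsonUtility.ToJson(set, prettyPrint: true);
}
```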

This project is one of the first to leverage spatial computing and mixed reality for interactive biological visualization and review through the Magic Leap One (MLO) and the Unity game engine. With emerging devices such as the MLO, new input modalities enable more intuitive interaction and more perceptive viewing of data. The platform's flexible interchange between the MLO operating system, the world wide web, and MATLAB-like tools provides modular utilities that can integrate with future extended reality devices for biomedical analysis.
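
One lightweight way to achieve this kind of interchange in Unity is conditional compilation, so a single codebase routes input differently on the headset, in a WebGL build, and in the editor. The sketch below illustrates the idea only and is not BioLumin's actual implementation; UNITY_WEBGL and UNITY_EDITOR are standard Unity define symbols, while the device branch is left abstract to avoid assuming specific Magic Leap SDK calls.

```csharp
using UnityEngine;

// Hypothetical input router: one codebase, per-platform input sources.
public class PlatformInputRouter : MonoBehaviour
{
    void Update()
    {
#if UNITY_WEBGL && !UNITY_EDITOR
        // Browser build: fall back to mouse-based review controls.
        if (Input.GetMouseButton(0))
            Debug.Log("WebGL: mouse drag drives model rotation");
#elif UNITY_EDITOR
        // In-editor testing with keyboard and mouse.
        if (Input.GetKeyDown(KeyCode.Space))
            Debug.Log("Editor: simulate controller trigger");
#else
        // Device build (e.g., Magic Leap One): controller and gesture
        // input would be polled through the vendor SDK here.
#endif
    }
}
```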


Figures:

Demos:


WebGL Instance of BioLumin


Exportable Annotations from Magic Leap


Contributions and Collaborators:

Independent Contributions

  • Lead and independent developer, prototype designer, and user-testing evaluator for BioLumin.

  • Lead pipeline developer for the MATLAB-to-Unity autonomous configuration of microscopic data.

  • Lead data analyst for gameplay behavior and runtime motion capture data from Unity.

Contributions of Co-authors

  • Professor Sri Kurniawan: advisor and mentor; guidance on HCI-related protocols and testing.

  • Sam Michael, CIO, NIH NCATS: advisor and mentor; guidance on biomedical analytics.

  • Nathan Hotaling, Data Scientist, NIH NCATS: mentor; guidance on input interaction for NIH scientists.

  • Ty Voss, Imaging Specialist, NIH NCATS: mentor; guidance on microscopic image analytics.

