iSAM: Personalizing an Artificial Intelligence Model for Emotion in Immersive Virtual Reality
How could healthcare be personalized if we could understand how patients feel as they undergo rehabilitation? This paper explores such a new method of healthcare by prototyping a playable experience that combines immersive virtual reality, affective computing, and artificial intelligence. By adapting the NIMH CSEA International Affective Picture Database, we fine-tune an AI-driven pleasure-arousal-dominance emotion model around the user as they inspect a gallery of “lost memories.” The model learns from their interactions and eventually concludes the game by intelligently producing a final image representing a “rediscovered” memory within the happiness emotion group. We evaluate our playable experience, the Untitled Memory Project (UMP), with an initial pilot study of four users. Our results suggest that UMP is a successful medium for inducing the happy emotion through AI-curated images while also learning from user emotional responses. We conclude this paper with a discussion of affective games such as UMP and share the implications of this work for other researchers interested in AI-driven immersive healthcare experiences.
Through the Untitled Memory Project, we presented a novel playable experience that employed AI and an immersive virtual environment to learn from and adapt to the PAD emotional model. This early prototype demonstrated a pipeline enabling both user and AI analysis of the International Affective Picture Database through a virtual reality interface that transported users into a mind museum to help SAM recover its memories. Our initial play-testing indicated that the AI model improved its PAD emotional predictions for the majority of users through ten training photos from the IAPS and a final curated photo from Google Images that was intelligently selected based on the user’s PAD responses. The majority of users indicated that they found UMP exciting and that the final image felt happy. While emotional states are often considered crucial elements of mental health, they are rarely explored or monitored in the modern healthcare context. Consequently, this work indicates that it may be possible to bridge runtime emotional models into a virtual environment, which may have substantial implications for the VR exposure therapy community and for any researchers interested in translating affective models into virtual environments through game engines and AI. While we feel that this is a step in the right direction, more work must clearly be done to verify the efficacy of UMP and of AI-informed experiences built on the Pleasure-Arousal-Dominance emotional model. In the road ahead, there are far more galleries of the mind to explore and memories to recover.
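The final-image selection described above can be sketched as a nearest-neighbor lookup in PAD space. This is a hypothetical illustration, not the paper's implementation: the target coordinates, candidate images, and predicted PAD values below are all made up for the example.

```python
import math

# Assumed PAD (pleasure, arousal, dominance) coordinates for the
# "happiness" target region; illustrative values only.
HAPPY_TARGET = (0.8, 0.5, 0.6)

def pad_distance(a, b):
    """Euclidean distance between two PAD vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def select_final_image(candidates):
    """Return the candidate whose predicted PAD vector lies nearest the happy target.

    `candidates` maps image identifiers to predicted (P, A, D) tuples,
    e.g. as produced by an emotion-prediction model.
    """
    return min(candidates, key=lambda name: pad_distance(candidates[name], HAPPY_TARGET))

# Illustrative usage with made-up model predictions:
predictions = {
    "beach.jpg": (0.9, 0.4, 0.6),
    "storm.jpg": (0.2, 0.8, 0.3),
    "puppy.jpg": (0.85, 0.55, 0.65),
}
print(select_final_image(predictions))  # prints "puppy.jpg", the closest to the target
```

In the actual system the candidate PAD values would come from the trained model's predictions, personalized by the user's responses during the gallery phase; the selection step itself remains a simple distance minimization.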
Elor, A., & Song, A. (2020). iSAM: Personalizing an Artificial Intelligence Model for Emotion with Pleasure-Arousal-Dominance in Immersive Virtual Reality. In 2020 15th IEEE International Conference on Automatic Face and Gesture Recognition (FG 2020) (pp. 583–587).
Contributions and Collaborators:
Lead experience developer, prototype 3DUI designer, and user testing evaluator for Untitled Memory Project.
Designed the rule-based inference system for the PAD emotional model, used alongside the AI model.
Lead data analyst for gameplay behavior and runtime motion capture data from Unity.
Contributions of Co-authors
Asiiah Song, Computational Media Ph.D. Student: Lead AI developer for the Python server, ResNet model, and database handling.