Project Butterfly: Controlled Virtual Reality Experience for Physical Therapy and Wearable Robotics
Updated: Mar 1
With new capabilities advancing in virtual reality, we have developed an immersive virtual reality game and therapy tool, titled Project Butterfly (PBF), which exploits the capabilities of commercially available virtual reality to increase the accessibility, affordability, and accuracy of physical therapy during stroke rehabilitation. PBF can be used remotely to motivate the user to perform precise and accurate rehabilitative motion while capturing behavioral and positional data in real time. PBF has been integrated with our CRUX (Compliant Robotic Upper-extremity eXosuit) for upper-limb physical therapy. PBF is currently being adapted for machine learning protocols targeting game difficulty and evaluator adjustments, with the hope of capitalizing on PBF’s data generation to develop an Artificial Intelligence agent that optimizes stroke rehabilitation outcomes with Constraint-Induced Therapy and Mirror Visual Feedback Therapy.
Can assistive wearable robotics be used to help patients perform therapy using immersive Virtual Reality? How can the game calibrate the level of assistance with the suit, and vice versa?
Is long-term iVR physical therapy feasible beyond the VR novelty effect? Will users remain engaged after months of physical therapy with Project Butterfly?
Can emotional response be quantified through biofeedback during physical rehabilitation with iVR? How can iVR stimuli be crafted to both react to and induce an optimal emotional response to elicit greater success in therapeutic goals?
A synergistic meshing of the physical and virtual worlds to optimize therapeutic outcomes in physical therapy may be in the near future. Games can inform robotic assistance, and biofeedback can inform games, tailoring each experience to the user's individual emotional and physical response. The future of healthcare may be one augmented by a cybernetic physical-virtual experience aimed at healing each user according to their own individuality and response.
Specific Computational System-Building Goals:
The goal of Project Butterfly is to create a controlled immersive media environment for adaptable and translatable therapeutic movement, incorporating runtime data feedback on player movement performance and behavioral analysis. It aims to bridge the gap between therapists and evaluators on one side and users undergoing repetitive exercise and physical therapy on the other, by mapping motion-captured movement to gamified scenarios such as protecting a virtual butterfly and catching crystals. Expanding upon the work of Project Star Catcher by Elor et al. 2018 TACCESS, this experience translates Mirror Visual Feedback Therapy into an immersive virtual reality environment that requires users to protect a virtual butterfly. The game was designed using the Unity3D Game Engine, HTC Vive, and Git over three iterations in the course of a year, informed by user testing:
Iteration One: A user protects a butterfly from heavy rain by covering the butterfly with an umbrella. Movements are pre-scripted to follow basic motion primitives for bicep curls, shoulder rotations, forward arm raise, and lateral arm raise. This iteration was tested and iteratively designed through weekly focus groups with Hope Services California over the course of one month.
Iteration Two: A user protects a butterfly from heavy rain by encasing the butterfly within a spherical orb. Movements are pre-scripted to follow the same basic motion primitives as iteration one. Logfile data collection was implemented using the Microsoft .NET I/O framework in C# to stream player pose, butterfly pose, time, and score to a CSV file during runtime gameplay from the Unity Engine. An evaluator interface was created to set the butterfly's speed, the length of the user's arm, the username, and the sample rate of logfile data collection. This iteration was tested with Hope Services, Elderday Retirement Home, and the Cabrillo College Stroke and Disability Learning Center. Additionally, a pilot study was run with the CRUX tensegrity exosuit to assist users with limited movement in gameplay; these results can be seen in Elor et al. 2019 IEEE VR.
Iteration Three: A user protects a butterfly from projectile crystals by encasing the butterfly within a spherical orb. The crystals help indicate the butterfly's path to the user and add more dynamic objects to the environment. Movements can be customized and recorded before gameplay by holding down the trigger of the HTC Vive controller and performing the desired motion to prescribe. A calibration is run in the main menu to determine user arm length by comparing pose between the controllers and the HMD. An "Auto Mode" was enabled to allow users to play at home without research evaluators: the game begins when the user places their controller on the butterfly for 5 seconds, which loads a set of customized movements that can be remotely set by the therapist or research evaluator. Exercises switch after a default of ten repetitions and a default 60-second rest period. All variables pertaining to repetitions, exercise type, speed, order, and movement can be customized locally or remotely through the evaluator interface and prescribed for at-home or clinical use. This iteration has just completed long-term testing of bi-weekly sessions with five users undergoing upper-extremity rehabilitation over the course of two months. Additionally, one remote user was tested in their personal home through remote deployment using the "Auto Mode." Results and dissemination of iteration three are currently under review at an international conference.
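The arm-length calibration above compares controller and HMD poses. A minimal sketch of one way to compute it is shown below in Python (the actual Unity implementation differs); the assumed shoulder position, the `shoulder_drop` offset value, and the function name are assumptions for illustration, not the project's real parameters.

```python
import math


def calibrate_arm_length(hmd_pos, controller_pos, shoulder_drop=0.25):
    """Estimate arm length from the HMD and outstretched-controller poses.

    Illustrative sketch: the shoulder is approximated as a fixed vertical
    offset (in meters) below the HMD, and arm length is the straight-line
    distance from that approximate shoulder to the controller.
    Positions are (x, y, z) tuples in meters, y up.
    """
    shoulder = (hmd_pos[0], hmd_pos[1] - shoulder_drop, hmd_pos[2])
    return math.dist(shoulder, controller_pos)
```

For a user with the HMD at (0, 1.70, 0) and the controller held out at (0.70, 1.45, 0), the estimate is simply the 0.70 m horizontal reach from the assumed shoulder height.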
In sum, Project Butterfly requires the user to closely follow physical movements prescribed by therapists by encasing the butterfly in a bubble. The design and evaluation of this experience have gone through multiple iterations informed by local disability learning centers in Santa Cruz, California, and pilot results have been reported in Elor et al. 2019 IEEE VR to analyze its synergy with wearable assistive robotics. An additional parsing and visualization pipeline was implemented in MathWorks MATLAB 2018b using the statistical and visualization toolboxes: this enabled analysis of mean, median, standard deviation, standard error, Wilcoxon significance, ANOVA, and Kruskal-Wallis results over player pose, behavior, and survey data.
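The descriptive half of that pipeline can be sketched in a few lines. This is an illustrative Python equivalent, not the project's MATLAB code; the function name and output keys are assumptions, and the significance tests (Wilcoxon, ANOVA, Kruskal-Wallis) are only noted in a comment since they would come from a stats library such as scipy.stats.

```python
import statistics as stats


def summarize(samples):
    """Descriptive summary comparable to the MATLAB pipeline's outputs:
    mean, median, standard deviation, and standard error of the mean.
    (Wilcoxon, ANOVA, and Kruskal-Wallis would come from a stats
    library, e.g. scipy.stats; only descriptive stats are sketched.)"""
    n = len(samples)
    sd = stats.stdev(samples) if n > 1 else 0.0  # sample std (n-1 denominator)
    return {
        "mean": stats.fmean(samples),
        "median": stats.median(samples),
        "std": sd,
        "sem": sd / n ** 0.5,  # standard error of the mean
    }
```

Run per metric (e.g. one list of per-session scores or pose errors per user), this produces the summary rows that feed comparison plots and significance testing.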
The long-term use of this system for upper-extremity physical rehabilitation, from a game-performance and biofeedback perspective, as well as pipelines for high-level movement analysis and muscle-force estimation through the OpenSim biomechanics software, is under review. As a result, Project Butterfly serves as an immersive media sandbox for testing mapped movements and player adversity in physical recovery.
Project Butterfly CITRIS User Testing:
Early CRUX / Project Butterfly Alpha Gameplay Demo
Lead developer of Project Butterfly, including Unity development for 3DUI interaction, game behavior, and runtime data collection.
Lead designer of user experience testing, including protocol design, experimental evaluation, and qualitative observation.
Independently performed data analysis on user testing and gameplay; lead author for dissemination and reporting of Project Butterfly results.
Dr. Steven Lessard, Lead Developer of the CRUX Tensegrity Soft Exosuit
Professor Sri Kurniawan: Advisor and Mentor, guidance on HCI-related protocols and testing.
Professor Mircea Teodorescu: Advisor and Mentor, guidance on systems-engineering architecture and framework.
Elor, A., & Kurniawan, S. (2020, August). Deep reinforcement learning in immersive virtual reality exergame for agent movement guidance. In 2020 IEEE 8th International Conference on Serious Games and Applications for Health (SeGAH) (pp. 1-7). IEEE.
Elor, A., Lessard, S., Teodorescu, M., & Kurniawan, S. (2019, March). Project Butterfly: Synergizing Immersive Virtual Reality with Actuated Soft Exosuit for Upper-Extremity Rehabilitation. In 2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR) (pp. 1448-1456). IEEE.