This project addresses communication and team training through the development of a set of virtual simulations built on the HyperSkill platform developed by SimInsights. HyperSkill makes it possible to replicate the clinical environment in virtual reality without writing code. Furthermore, the immersive training presents users with ideal verbal responses to challenging situations through a real-time feedback panel.
Our aim was to evaluate the usability and performance of the platform as compared to in-person training.
Communication and teamwork failures between healthcare team members are responsible for up to 70% of medical errors, in part because team members come from many different disciplines and train in isolated education programs. Training in communication and teamwork skills has been shown to improve teamwork, clinical performance, and patient outcomes. Healthcare team training is effective across a variety of outcomes, including trainees' perceptions of the usefulness of team training, acquisition of knowledge and skills, demonstration of trained knowledge and skills on the job, and patient and organizational outcomes.
Traditional communication and team training sessions have consisted of classroom-based didactic presentations and/or resource-intensive, immersive simulator-based programs requiring in-person attendance and facilitated debriefing. These methods pose logistical challenges, such as assembling individual team members to train in person. Furthermore, there are limited opportunities to apply newly acquired skills in relevant contexts, to repeat practice with feedback, and to follow up to assess skill acquisition and retention.
SimInsights used existing recorded media to create new storyboards and flowcharts, which were implemented in HyperSkill as three immersive 3D simulations. The clinical environment was replicated in VR by visiting a UCLA clinic and capturing images with a 360° camera, which were provided as references to 3D artists. Additionally, a feedback panel was developed to debrief learners by presenting ideal responses, scoring learner responses, and measuring learner confidence.
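To make the debriefing idea concrete, here is a minimal, hypothetical sketch of the kind of record such a feedback panel could keep for each prompt: the ideal response, the learner's actual response, a score, and a self-reported confidence rating. The field names and scoring scale are assumptions for illustration, not HyperSkill's actual implementation.

```python
# Hypothetical sketch of a debrief feedback record; not the HyperSkill API.
from dataclasses import dataclass

@dataclass
class FeedbackEntry:
    prompt: str            # the challenging situation presented to the learner
    ideal_response: str    # the scripted ideal verbal response
    learner_response: str  # what the learner actually said
    score: float           # 0.0-1.0 rating of how close the response was (assumed scale)
    confidence: int        # learner's self-reported confidence, e.g. 1-5 (assumed scale)

def debrief(entries: list[FeedbackEntry]) -> None:
    """Print a simple end-of-scenario debrief summary."""
    for e in entries:
        print(f"Prompt:   {e.prompt}")
        print(f"Ideal:    {e.ideal_response}")
        print(f"Learner:  {e.learner_response}")
        print(f"Score:    {e.score:.0%}  Confidence: {e.confidence}/5\n")

# Example usage with an invented prompt and responses.
entry = FeedbackEntry(
    prompt="The physician skips introducing the chaperone to the patient.",
    ideal_response="Excuse me, may I introduce myself as the chaperone for this exam?",
    learner_response="Hi, I'm the chaperone and I'll be present during the exam.",
    score=0.8,
    confidence=4,
)
debrief([entry])
```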
For each of the three scenarios, Natural Language Processing (NLP) intent detection classifiers were trained to map user speech to a fixed set of utterance labels. The transfer learning capabilities of the Nvidia TAO toolkit were used to fine-tune a pre-trained Bidirectional Encoder Representations from Transformers (BERT) model for intent classification.
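The project's pipeline uses the Nvidia TAO toolkit; as a rough illustration of the underlying technique only, the sketch below fine-tunes a pre-trained BERT model for intent classification with the Hugging Face Transformers library instead. The utterances, intent labels, and hyperparameters are hypothetical placeholders, not the project's training data or configuration.

```python
# Minimal sketch: fine-tune BERT for intent classification (Hugging Face
# Transformers stand-in for the TAO pipeline; all data below is invented).
import torch
from torch.utils.data import DataLoader, Dataset
from transformers import BertTokenizerFast, BertForSequenceClassification

# Hypothetical training pairs: user speech mapped to fixed utterance labels.
UTTERANCES = [
    ("I am not comfortable with how the exam is being done", "speak_up_concern"),
    ("Could you please explain the procedure to the patient?", "request_explanation"),
    ("Everything looks fine, please continue", "no_concern"),
]
LABELS = sorted({label for _, label in UTTERANCES})
LABEL2ID = {label: i for i, label in enumerate(LABELS)}

class IntentDataset(Dataset):
    def __init__(self, examples, tokenizer):
        self.examples = examples
        self.tokenizer = tokenizer

    def __len__(self):
        return len(self.examples)

    def __getitem__(self, idx):
        text, label = self.examples[idx]
        enc = self.tokenizer(text, truncation=True, padding="max_length",
                             max_length=64, return_tensors="pt")
        return {
            "input_ids": enc["input_ids"].squeeze(0),
            "attention_mask": enc["attention_mask"].squeeze(0),
            "labels": torch.tensor(LABEL2ID[label]),
        }

tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=len(LABELS))

loader = DataLoader(IntentDataset(UTTERANCES, tokenizer), batch_size=2, shuffle=True)
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)

model.train()
for epoch in range(3):  # a few epochs is enough for this tiny demo set
    for batch in loader:
        optimizer.zero_grad()
        out = model(**batch)   # returns loss when "labels" are provided
        out.loss.backward()
        optimizer.step()

# Inference: map new user speech to the most likely intent label.
model.eval()
enc = tokenizer("I think we should pause the exam", return_tensors="pt")
with torch.no_grad():
    pred = model(**enc).logits.argmax(dim=-1).item()
print(LABELS[pred])
```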
A total of 815 users completed the training program and provided feedback via the survey questions, covering different aspects of the training software, as shown in the chart. Based on these metrics, the majority of learners felt that the training program provided them with the necessary tools to speak up and that they would be able to apply the lessons in practice.
This case study was accepted at several conferences, where representatives of UCLA Health or SimInsights presented the findings.
Date: 27-29 April 2020
Title: 3D Immersive, Interactive and Intelligent Training for Clinicians
Presenter: Rajesh Jha, CEO, SimInsights, Inc.
Los Angeles, CA
Date: January 15-19, 2022
Title: Medical Chaperone E-Learning Module with Virtual Reality Simulations
At IMSH 2022, representatives from the UCLA School of Medicine presented AI-powered clinician training experiences authored using HyperSkill.
UCLA School of Medicine representatives Yue Ming Huang and Miguel Drayton presenting HyperSkill-based chaperone simulations at IMSH 2022
Date: October 17-20, 2021
Title: Simulation Training to Empower Medical Chaperones in Speaking Up for Patient Safety
Presenters: Yue Ming Huang, EdD, MHS, UCLA Health; Miguel Drayton, MFA, MS, UCLA Health