Sage Publishing is a global leader in educational and academic publishing. Sage has been exploring the potential of immersive technologies in education. Working with immersive technology consultants Somewhere Else, Sage identified a specific challenge in psychiatric nursing training: students need a new way of practising their soft skills and building their self-confidence before transitioning from the classroom to the hospital.
This innovative work required a company with design and technical expertise in immersive technology and a can-do mindset – and that’s where Mbryonic came in as a delivery partner.
Building on a body of research on virtual embodiment by the likes of Mel Slater’s EventLab and Stanford University’s Virtual Human Interaction Lab, we opted for a body-swapping format. This body-swapping role-playing exercise, co-written with Dr Sue Barker from Cardiff University, recreates a realistic interaction between Susan, a patient suffering from clinical depression, and a mental health nurse.
BodySwaps lets the user embody both the nurse and the patient, allowing them to understand and adapt to a patient’s situation from both sides, build empathy and improve their knowledge of patient care.
Even though this was a proof of concept, an important design criterion was that we could easily build and adapt new scenarios later. This meant making the app data-driven, allowing us to easily interchange environments, characters and scripts.
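A data-driven approach like this typically means scenarios live in external data files rather than in code. As a rough illustration only (the actual file format, field names and `load_scenario` helper below are assumptions, not the project's real schema), a scenario definition might be loaded like this:

```python
import json
from dataclasses import dataclass


@dataclass
class Scenario:
    """A single role-play scenario: where it happens, who is in it, what is said."""
    environment: str       # name of the 3D environment to load
    characters: list       # character identifiers, e.g. patient and nurse
    script: list           # ordered dialogue lines for the exercise


def load_scenario(path: str) -> Scenario:
    """Read a scenario from a JSON file so new exercises need no code changes."""
    with open(path) as f:
        data = json.load(f)
    return Scenario(
        environment=data["environment"],
        characters=data["characters"],
        script=data["script"],
    )
```

Swapping in a new training exercise then only requires authoring a new data file and pointing the app at it.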
The user interacts with the Unity application through both movement and voice. As the student is not wearing a motion-capture suit during use, we used the Oculus Rift’s Touch controllers and built-in microphone to capture their responses, which drove their avatar’s body and facial movements automatically.
The characters were created using an off-the-shelf avatar design package, with the voices performed by professional actors. They were then animated using our own AI, which blends motion-capture data for the body (captured using the Perception Neuron performance capture system) in response to user input such as eye contact, emotion and interjections during speech. The user additionally gets a warning if they aren’t paying attention.
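The selection logic described above can be sketched as a simple mapping from live user signals to an animation choice plus an attention warning. This is a minimal illustration under assumed inputs (the signal names, clip names and `select_response` function are hypothetical, not the app's actual code):

```python
# Hypothetical animation clip names keyed by a detected emotion label.
CLIPS = {
    "calm": "idle_listen",
    "distressed": "withdraw",
    "engaged": "lean_forward",
}


def select_response(emotion: str, eye_contact: bool, interjecting: bool):
    """Pick an animation clip for the avatar and decide whether to warn the user.

    emotion      -- label inferred from the user's voice/behaviour
    eye_contact  -- whether the user is currently looking at the character
    interjecting -- whether the user spoke over the character
    """
    clip = CLIPS.get(emotion, "idle_listen")  # fall back to a neutral pose
    if interjecting:
        clip = "pause_acknowledge"  # the character pauses and acknowledges the user
    warn_user = not eye_contact     # nudge the student if attention drifts
    return clip, warn_user
```

In the real application this kind of decision would run every frame inside the Unity engine; the sketch only shows the shape of the mapping.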
Finally, we integrated Google’s Natural Language and Beyond Verbal’s Speech APIs to provide the user with a detailed scorecard of their performance, taking into account the semantic content of their responses, their attention and even their tone of voice.
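Once each signal has been scored, the scorecard comes down to combining them into a single mark. A minimal sketch, assuming normalised 0–1 scores from each source and illustrative weights (the `build_scorecard` function and the weighting itself are assumptions, not the product's actual formula):

```python
def build_scorecard(semantic: float, attention: float, tone: float,
                    weights=(0.5, 0.3, 0.2)) -> dict:
    """Combine per-signal scores (each 0.0-1.0) into a weighted overall mark.

    semantic  -- score from language analysis of what the student said
    attention -- score derived from eye contact / focus tracking
    tone      -- score from vocal tone analysis
    weights   -- illustrative relative importance; must sum to 1.0
    """
    assert abs(sum(weights) - 1.0) < 1e-9, "weights must sum to 1"
    overall = (semantic * weights[0]
               + attention * weights[1]
               + tone * weights[2])
    return {
        "semantic": semantic,
        "attention": attention,
        "tone": tone,
        "overall": round(overall, 2),
    }
```

The per-signal scores remain visible alongside the overall mark, so a student can see whether it was their wording, attention or tone that needs work.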
UCL Medtech students had the opportunity to experience BodySwaps – The Susan Project as part of their medical research on virtual embodiment. Their research aim was to measure the effect of virtual embodiment on participants’ self-reported empathy, self-awareness and likelihood of behaviour change in similar future situations.
Feedback was positive, with participants stating “it will be a useful experience for students before actually seeing the patient”, “a really good training experience for future nurses, therapists or doctors” and “I feel like VR technology helps you develop empathy towards people you don’t know”.
Want to learn how VR or AR can benefit your organisation, or have a brief you'd like a quote on?
Our friendly experts are here to help. Fill in your details and we'll get right back to you.