Virtual Avatars
Project Description:
Virtual Avatars is an interactive virtual experience designed to explore mitigating implicit biases by combining variations in a virtual avatar's appearance with variations in its virtual environment.
Our application opens on a start screen; pressing the start button begins the virtual avatar experience.
The user is placed in a scenario where they order from a barista (the virtual avatar) in a coffee shop. The barista mispronounces the user's name, and the user chooses how to respond using the on-screen button choices. Depending on the selection, the interaction may reinforce or reduce the user's implicit biases and escalate or de-escalate the situation.
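Below is a minimal Unity C# sketch of how such a response choice could be wired up. It is illustrative only: the class and member names (ResponseChoiceHandler, escalationLevel) and the sample dialogue strings are hypothetical and are not taken from our actual project scripts.

```csharp
using UnityEngine;
using UnityEngine.UI;

// Hypothetical sketch of a response-choice handler: each on-screen button
// passes an escalation delta, and the barista's next line depends on the total.
public class ResponseChoiceHandler : MonoBehaviour
{
    [SerializeField] private Text baristaDialogueText; // UI Text showing the barista's line
    private int escalationLevel = 0;                    // higher values = more escalated

    // Wired to each choice Button's OnClick; the delta is set per button in the Inspector.
    public void OnChoiceSelected(int deltaEscalation)
    {
        escalationLevel += deltaEscalation;

        // Placeholder dialogue branches; the real application uses its own dialogue and audio.
        baristaDialogueText.text = escalationLevel > 0
            ? "Oh... I'm sorry, I didn't mean anything by it."
            : "Thanks for letting me know! I'll get that right next time.";
    }
}
```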
My Role
Group Student Project for an Intro to Human-Computer Interaction course at University of Delaware
Creating an application in Unity (including a 3D virtual avatar and an interactive 3D virtual environment)
Researching, storyboarding, prototyping and presenting deliverables
Timeline
August - December 2021
Languages/Software Used
Unity, Adobe Mixamo, Visual Studio Code, C#
PROBLEM
There is a growing gap in understanding the perspectives of people of different races and ethnicities.
As we re-acclimate to in-person social interactions following the pandemic, and amid ongoing racial injustices, we must be aware of how our actions may affect those around us.
SOLUTION
We need practice mitigating implicit racial biases in different social environments.
The initial goal of the Virtual Avatars project was to create an application that lets users interact with a virtual avatar in a virtual environment simulating real-world situations. Users could then practice handling situations that, depending on their actions, may reinforce or lessen implicit biases.
MOTIVATIONS
Previous Research
One experiment demonstrated that embodying light-skinned users in a dark-skinned virtual avatar significantly decreased implicit racial bias against dark-skinned people, when compared to embodying light-skinned, purple-skinned, or no virtual avatars. [1]
Other research shows that virtual environments can influence users' feelings and how they perceive virtual avatars. [2]
NEED FINDING
Pre-Survey
This Google Forms survey (around 7 - 10 questions total) was intended to be taken by participants before they used our application. It asks about participants' backgrounds and experiences with implicit bias, and gauges their familiarity with AR and VR experiences. Some example questions are shown below:
Instructors & Mentors
We relied on our instructors and mentors for subject-matter expertise. We met with our mentors in the HCI Lab frequently throughout development, and our instructor gave us regular feedback on changes to implement in our project.
Post-Survey
This survey (3 questions total) was to be taken by participants after using our application. It asks about their experience and whether they would recommend the application.
PROTOTYPING
Storyboarding
We storyboarded our project using Miro. Our prototype mainly focused on showing the user interacting with the virtual avatar in a virtual environment.
IMPLEMENTATION
Hardware
Our initial goal was to use either Oculus VR or Windows Mixed Reality headsets so users could interact with the virtual environments. The final product did not implement VR and was instead delivered as a Windows 64-bit executable application.
Software
We built our final application in Unity, using packages recommended by our mentor. We used Adobe Mixamo to obtain a base virtual avatar, which we then customized in Unity, and we wrote C# scripts in Visual Studio Code to give the virtual avatar audio capabilities.
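As an illustration of the kind of audio scripting involved, here is a minimal sketch assuming a hypothetical setup: the avatar GameObject carries an AudioSource, and voice clips are assigned in the Inspector. The class and member names (AvatarSpeech, Speak, greetingClip) are placeholders, not our project's actual code.

```csharp
using UnityEngine;

// Hypothetical example of giving an avatar basic speech-audio playback.
[RequireComponent(typeof(AudioSource))]
public class AvatarSpeech : MonoBehaviour
{
    [SerializeField] private AudioClip greetingClip; // e.g. the barista's recorded greeting
    private AudioSource audioSource;

    private void Awake()
    {
        audioSource = GetComponent<AudioSource>();
    }

    // Called by the dialogue flow whenever the avatar should speak a line.
    public void Speak(AudioClip line)
    {
        if (line == null) return;
        audioSource.clip = line;
        audioSource.Play();
    }

    // Convenience method for the opening greeting.
    public void SpeakGreeting() => Speak(greetingClip);
}
```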
CHALLENGES
Setbacks
We originally planned to develop the application for VR, then scaled back to AR, and finally settled on a Windows executable due to time constraints, learning curves, and limited resources. Our team's individual schedules made it difficult to meet collectively at the HCI Lab, where only one machine was available for development and testing. At one point, a Unity update erased earlier work that enabled the virtual avatar's facial movements. To overcome these setbacks, we scheduled additional HCI Lab sessions outside our usual meeting times, with support from our instructors and mentors. We recovered most of the lost progress, but due to time constraints, the final application does not include facial movements synchronized with the audio of the virtual avatar's speech.
Conclusions & Future Work
Despite these challenges, we delivered a working product that lets users interact with a virtual avatar in a virtual environment, which is at the core of what we set out to do. Starting as complete Unity novices, by the end of the project we had all learned how to create and customize virtual avatars, build virtual environments, and add speech and facial movement capabilities to avatars within Unity.
In the future, we would like the project to further explore mitigating implicit biases by adding different virtual environments that simulate various social situations. Another goal is to implement the VR or AR capabilities we originally planned; in the project's current state, VR would be the easier of the two to add.
REFERENCES
[1] Tabitha C. Peck, Sofia Seinfeld, Salvatore M. Aglioti, and Mel Slater. 2013. Putting yourself in the skin of a black avatar reduces implicit racial bias. Consciousness and Cognition 22, 3 (2013), 779–787. https://doi.org/10.1016/j.concog.2013.04.016
[2] Katharina Legde and Douglas W. Cunningham. 2019. Evaluating the Effect of Clothing and Environment on the Perceived Personality of Virtual Avatars. In Proceedings of the 19th ACM International Conference on Intelligent Virtual Agents (IVA '19). Association for Computing Machinery, New York, NY, USA, 206–208. https://doi.org/10.1145/3308532.3329425