A research team at Cornell University has developed a device that can be worn as a necklace and tracks facial expressions. The NeckFace uses infrared cameras to capture images of the chin and face from beneath the neck. It’s part of a growing wave of innovations aimed at capturing and expressing emotions in VR.

“Current VR implementations have advantages and disadvantages versus other remote communication forms like webcams,” Devon Copley, the CEO of VR company Avatour, told Lifewire in an email interview. “Body language, for example, can be more expressively captured and communicated than by video. But the lack of real facial expressions is a massive loss of communication bandwidth, and these emotion-sensing technologies are really having to compensate for that.”
Tracking Your Face
VR is all about new ways to experience digital environments. But the NeckFace concept could be one way to get more feedback from users.

“The ultimate goal is having the user be able to track their own behaviors, through continuous tracking of facial movements,” Cheng Zhang, a Cornell University researcher who was one of the authors of the paper, said in a news release. “And this hopefully can tell us a lot of information about your physical activity and mental activities.”

Aside from emotion tracking, Zhang sees many applications for this technology: virtual conferencing when a front-facing camera is not an option, facial expression detection in virtual reality scenarios, and silent speech recognition.

NeckFace also has the potential to change video conferencing. “The user wouldn’t need to be careful to stay in the field of view of a camera,” François Guimbretière, another member of the Cornell research team, said in the news release. “Instead, NeckFace can recreate the perfect headshot as we move around in a classroom, or even walk outside to share a walk with a distant friend.”
Bringing Emotion to VR
Other companies are working to bridge the gap between the real and the virtual worlds. Facebook recently released a paper on “reverse passthrough VR” to make VR headsets less physically isolating. The researchers describe a method of translating your face onto the front of a headset, although it’s only in a testing phase.

VR is getting more realistic, but expressing the emotions of users is still a challenge, experts say. “Natural, in-person communication between people includes information channels well beyond the text of utterances,” Copley said. “Tone of voice and body language is crucial, but an often-overlooked and really important aspect of communication is gaze. The direction of an interlocutor’s gaze is massively important.”

Many companies are trying to discern human emotion in virtual reality. HP’s new Omnicept headset, for example, tracks pupil size, pulse, and muscle movements. The company MieronVR uses the Omnicept for healthcare applications. “VR has the ability to connect people and build empathy for the self and others,” Jessica Maslin, the president of Mieron, told Lifewire in an email interview. “Self-empathy is connected to much higher levels in self-care and care of future outcomes.”

Tracking emotion in VR could one day even help detect whether users will commit criminal acts in the future. “If we can detect emotion, we can create virtual scenarios in which we locate people, in order to understand their risk better,” forensic psychologist Naomi Murphy, who works with VR, told Lifewire in an email interview. “For instance, we could create scenes in which there is fire present to detect how emotionally aroused someone who has a history of arson is pre- and post-treatment.”

On the lighter side, emotion tracking could also make gaming more fun.
“We’re still learning how to interpret this data correctly, but one can imagine signaling physical states in creative ways such as changing color or even choosing a different avatar, based on the emotional state of the user,” Copley said. “Imagine turning into a vengeful dragon when the various sensors indicate anger.”