The Oculus Rift (now owned by Facebook) is very well known in the virtual reality world—the headsets let gamers play in virtual 3D worlds, and interact using avatars. Now, it appears the team is ready to take the idea to another level by allowing facial expressions of the wearer to be captured and then displayed on the face of an avatar for others in the virtual world to see—in real time.

Adding real facial expression to such applications would, in theory, make the experience more life-like while still preserving the privacy or fantasy aspects of using an avatar. It might even bolster the illusion of an altered reality, where participants actually forget that the being they are interacting with is not a literal representation of another human being.

The new technology has come about through a collaboration between researchers in the Rift division and a team at the University of Southern California led by Hao Li. They came up with a two-part design. The first part involved placing sensors in the foam padding that rests against the forehead of a user wearing the headset, allowing brow and some eye-muscle movement to be captured. The second part was a little less elegant: a short adjustable boom attached to the headset holds a camera poised over the mouth, capturing lower-face movement. The data from both sources is collected and analyzed by software running in the headset, which converts it into expression data sent to the avatar-rendering component.
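The article does not describe the software itself, but a minimal sketch of the data flow it outlines, sensor readings in, blendshape weights out to an avatar renderer, might look like the following. The sensor counts, calibration matrices, and blendshape names here are illustrative assumptions, not details from the paper.

```python
import numpy as np

# Hypothetical blendshape targets an avatar renderer might accept.
BLENDSHAPES = ["brow_raise", "brow_furrow", "eye_squint", "jaw_open", "smile"]

def fuse_expression(strain_readings, mouth_features, W_strain, W_mouth):
    """Map raw sensor data to blendshape weights for the avatar.

    strain_readings : (n_strain,) array from the foam-padding sensors
                      (brow / upper-face region).
    mouth_features  : (n_mouth,) array of features extracted from the
                      boom camera's view of the mouth.
    W_strain, W_mouth : per-user calibration matrices (a linear mapping is
                        assumed here purely for illustration).
    """
    upper_face = W_strain @ strain_readings   # brow / eye contribution
    lower_face = W_mouth @ mouth_features     # mouth contribution
    weights = np.clip(upper_face + lower_face, 0.0, 1.0)
    return dict(zip(BLENDSHAPES, weights))

if __name__ == "__main__":
    # One fake frame with made-up calibration, just to show the shapes involved.
    rng = np.random.default_rng(0)
    W_strain = rng.random((len(BLENDSHAPES), 8)) * 0.1
    W_mouth = rng.random((len(BLENDSHAPES), 6)) * 0.1
    frame = fuse_expression(rng.random(8), rng.random(6), W_strain, W_mouth)
    print(frame)  # weights that would be streamed to the avatar-rendering component
```

The key design point the article implies is simply that the upper-face and lower-face signals come from different sensors and are merged into a single expression estimate each frame.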

Li told the press recently that the goal of the project was strictly research-based, though he suggested the headset add-ons could be modified to make them more user friendly. He also noted that in its current form, the system requires an initial 10-minute calibration process in which the user performs a series of facial contortions without the face covering, followed by a much shorter calibration routine that runs after the face covering is attached, before the user is ready to go.
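In spirit, that two-stage calibration could be sketched as follows: a longer mask-off session that fits a per-user mapping from sensor readings to expression weights, and a short mask-on session that records a neutral baseline. The least-squares model and neutral-offset correction below are assumptions for illustration, not the published method.

```python
import numpy as np

def fit_mapping(sensor_samples, target_weights):
    """Long, mask-off calibration: the user performs a set of expressions and a
    reference system supplies the 'true' blendshape weights for each one.
    Fit a linear map from sensors to weights by least squares (illustrative).

    sensor_samples : (n_samples, n_sensors)
    target_weights : (n_samples, n_blendshapes)
    Returns W with shape (n_blendshapes, n_sensors).
    """
    W, *_ = np.linalg.lstsq(sensor_samples, target_weights, rcond=None)
    return W.T

def neutral_offset(neutral_samples):
    """Short, mask-on recalibration: record a brief neutral pose and use its
    mean as a per-session baseline, compensating for how the padding sits
    on this particular wearer."""
    return neutral_samples.mean(axis=0)

# Usage sketch with fake data: 50 calibration expressions, 8 sensors, 5 weights.
rng = np.random.default_rng(1)
X = rng.random((50, 8))
Y = rng.random((50, 5))
W = fit_mapping(X, Y)                            # long, mask-off step
baseline = neutral_offset(rng.random((30, 8)))   # short, mask-on step
live_reading = rng.random(8)
weights = W @ (live_reading - baseline)          # per-frame expression estimate
```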

Li and his team plan to continue working on the headset add-ons, refining the design and perhaps looking into ways to recreate a person's hairstyle.

More information: Facial Performance Sensing Head-Mounted Display: www.hao-li.com/Hao_Li/Hao_Li_-_publications_%5BFacial_Performance_Sensing_Head-Mounted_Display%5D.html

Research paper (PDF): www.hao-li.com/publications/pa … ggraph2015FPSHMD.pdf