ARKit to FACS: Blendshape Cheat Sheet

If you or your team are using open-source face tracking kits, figuring out which blendshape corresponds to which facial action can be challenging. Navigate the ambiguity with the FACS Translation Sheet!
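To make the idea concrete, here is a minimal sketch of what such a translation can look like in code. The pairings below are approximate, commonly cited correspondences between ARKit blendshape locations and FACS Action Units, not the contents of the cheat sheet itself; the approximateAUs helper, its threshold parameter, and the choice to list only left-side variants are illustrative assumptions.

```swift
import ARKit

// Illustrative, approximate ARKit blendshape -> FACS Action Unit pairings.
// Only left-side variants are shown; right-side locations exist for most of
// these shapes. This is a sketch, not an official or exhaustive mapping.
let blendshapeToAU: [ARFaceAnchor.BlendShapeLocation: String] = [
    .browInnerUp:     "AU1 (Inner Brow Raiser)",
    .browOuterUpLeft: "AU2 (Outer Brow Raiser)",
    .browDownLeft:    "AU4 (Brow Lowerer)",
    .eyeWideLeft:     "AU5 (Upper Lid Raiser)",
    .cheekSquintLeft: "AU6 (Cheek Raiser)",
    .noseSneerLeft:   "AU9 (Nose Wrinkler)",
    .mouthSmileLeft:  "AU12 (Lip Corner Puller)",
    .mouthFrownLeft:  "AU15 (Lip Corner Depressor)",
    .jawOpen:         "AU26 (Jaw Drop)",
    .eyeBlinkLeft:    "AU45 (Blink)"
]

/// Hypothetical helper: turn one frame of ARKit blendshape weights into
/// rough FACS AU intensity estimates (0-1), keeping only shapes above
/// a small activation threshold.
func approximateAUs(from anchor: ARFaceAnchor,
                    threshold: Float = 0.1) -> [String: Float] {
    var result: [String: Float] = [:]
    for (location, weight) in anchor.blendShapes {
        guard let au = blendshapeToAU[location] else { continue }
        let value = weight.floatValue
        if value >= threshold {
            // If both sides of a bilateral shape fire, keep the stronger one.
            result[au] = max(result[au] ?? 0, value)
        }
    }
    return result
}
```

Taking the maximum of a bilateral pair is one simple way to collapse left/right shapes into a single AU estimate; unilateral actions and intensity calibration need more care than this sketch gives them.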

Faces You Don’t Want to See During UX Research – Especially For VR

Faces of discomfort often followed headset adjustment – or predicted upcoming adjustments. Bored faces and faces on the contempt spectrum tended to be predictive of undesirable experiences later disclosed during the post-demo interviews. These expressions were not only useful for predicting events; they also served as starting points for further investigation.

Bias In Emotion Tracking

Many of us subscribe to the popular oversimplification that machines are less biased than humans. However, if you are familiar with how machines are trained to read data, and with which aspects of that data they learn to focus on, you know it's just not that simple.

Leveraging Facial Muscle Variation

Anatomical variation is a surprisingly overlooked consideration in face tracking and facial mocap for tech and entertainment. Simplified anatomy diagrams are often accepted as universally applicable to every face, and few further questions are asked.

The reality is: FACIAL MUSCLES ARE HIGHLY VARIABLE.

Designed for studios and teams

Let's talk.

facetheFACS@melindaozel.com