ARKit to FACS: Blendshape Cheat Sheet
If you or your team are using open-source face-tracking kits, figuring out which blendshape corresponds to which facial action can be challenging. Navigate the ambiguity with the FACS Translation Sheet!
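To give a feel for what such a translation involves, here is a minimal sketch in Python. The `ARKIT_TO_FACS` table and `to_action_units` helper are illustrative names, and the pairings shown are commonly cited approximate correspondences between ARKit's `ARFaceAnchor` blendshape names and FACS Action Units; the actual Translation Sheet may map these differently.

```python
# Approximate ARKit blendshape -> FACS Action Unit correspondences.
# These pairings are rough, commonly cited equivalents, shown for
# illustration only; consult the FACS Translation Sheet itself.
ARKIT_TO_FACS = {
    "browInnerUp":     "AU1",   # inner brow raiser
    "browOuterUpLeft": "AU2",   # outer brow raiser (left)
    "browDownLeft":    "AU4",   # brow lowerer (left)
    "cheekSquintLeft": "AU6",   # cheek raiser (left)
    "mouthSmileLeft":  "AU12",  # lip corner puller (left)
    "jawOpen":         "AU26",  # jaw drop
    "eyeBlinkLeft":    "AU45",  # blink (left)
}

def to_action_units(blendshapes: dict[str, float],
                    threshold: float = 0.1) -> dict[str, float]:
    """Relabel active ARKit blendshape weights with rough AU equivalents."""
    return {
        ARKIT_TO_FACS[name]: weight
        for name, weight in blendshapes.items()
        if name in ARKIT_TO_FACS and weight >= threshold
    }

# Example: one frame of tracker output (weights in [0, 1]).
frame = {"mouthSmileLeft": 0.8, "cheekSquintLeft": 0.5, "browDownLeft": 0.02}
print(to_action_units(frame))  # {'AU12': 0.8, 'AU6': 0.5}
```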
Can we really measure smile authenticity? An exploration of the common assumptions we make about expressions of emotion.
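As a concrete instance of the kind of assumption under examination: a widely used heuristic, the "Duchenne marker," treats co-activation of AU6 (cheek raiser) and AU12 (lip corner puller) as a sign of a "genuine" smile. The sketch below, with a hypothetical `looks_duchenne` helper and an illustrative threshold, shows how easily that assumption gets baked into code.

```python
def looks_duchenne(aus: dict[str, float], threshold: float = 0.3) -> bool:
    """Duchenne-marker heuristic: AU6 + AU12 co-activation ~ 'felt' smile.

    Both the threshold and the heuristic itself are exactly the sort of
    assumption the article questions: co-activation of these action
    units is not proof of felt emotion.
    """
    return (aus.get("AU6", 0.0) >= threshold
            and aus.get("AU12", 0.0) >= threshold)

print(looks_duchenne({"AU6": 0.5, "AU12": 0.8}))  # True -- but is it authentic?
```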
Expressions of discomfort often followed headset adjustments, or predicted adjustments to come. Bored faces, and faces on the contempt spectrum, tended to predict undesirable experiences later disclosed during the post-demo interviews. These expressions were not just useful for predicting events; they also served as starting points for further investigation.
We seem to subscribe to the popular oversimplification that machines are less biased than humans. But if you are familiar with how machines are trained to read and weight different aspects of data, you will know: it's just not that simple.
Anatomical variation is a surprisingly overlooked consideration for face tracking and facial mocap in tech and entertainment. Simplified anatomy diagrams are often accepted as universally applicable to all faces, and few further questions are asked.
The reality is: FACIAL MUSCLES ARE HIGHLY VARIABLE.
Masterclass excerpt and links from a recent webinar with CAVE Academy, via the Visual Effects Society.