I’m a pioneer in the self-coined* field of “expression science.” Using the Facial Action Coding System (FACS certified since 2012) as a descriptive tool, I teach artists, engineers, and researchers how to build and break down facial movements and emotions. My goal is to push expression work in art and technology toward greater accuracy and higher ethical standards. I’m passionate about sharing my knowledge, mitigating false face news, and giving people the tools they need to create or track salient expressions.
My blog, “Face the FACS,” has an international fanbase, and my custom-made FACS Cheat Sheets are used by artists, researchers, students, and engineers from all over the world.
I am a huge proponent of cross-disciplinary approaches and am determined to advance our understanding of facial expressions by synthesizing research findings from as many fields as possible – plastic surgery, psychology, dermatology, neuroscience, animal behavior, evolutionary biology, micro-anatomy, cosmetology, and beyond.
While most of my recent work has involved teaching character artists, riggers, and animators from various industries to achieve higher levels of expressivity, I also have an established background in data requirements and data quality for machine learning in computer vision from my four years at Facebook/Oculus. During that time, I . . .
- was one of the three founding members of Oculus’s Face Tracking Team
- spearheaded all data-related initiatives
- developed the labeling and capture techniques now used by multiple teams across Facebook
- developed procedures, instructions, and documentation for data collection
- led labeling initiatives from small to large scale
- trained and managed vendor labelers
- influenced the design for all Facebook smileys (FB emoji and FB reactions)
While I have retired from manually classifying data myself, I am available for consultation regarding data set quality and can provide labeling instruction documents for labelers (of any experience level) and management services for expression data quality maintenance.
If you would like to embed my FACS articles and cheat sheets into your company’s internal site, please email: facetheFACS@melindaozel.com
* “Expression science” term co-coined with career coach Gretchen Hellman.
* * *
FACE TRACKING & AVATARS
- Expressive Avatars
  - directed facial expression shapes for entire expression library
  - advised engineering on triggers for ambient facial expressions
- Oculus LipSync
  - compiled viseme (visual phoneme) reference library for Oculus and its developers
  - determined shape specifications for each viseme
  - acted as the face and model for the Oculus demo avatar
- Pixel Light Effects Avatars
  - worked with team to develop use-case-specific digital human library of expressions
  - advised on reverse engineering & reconstructing facial poses
EMOJI & DESIGN
- Facebook Care Reaction
  - worked with the Facebook Emoji Team to design the new Care Reaction
  - advised on how to portray emotion and key facial expressions through simple design
  - advised animators on ideal expression sequences
- Facebook Emoji
  - advised on emotion & expression design
  - championed an updated emoji set that encourages cross-platform standardization of key expressions (See Mindful Design: Everything You Did and Didn’t Want to Know About Emoji)
- Facebook Newsfeed Reactions
  - advised on emotion design & animation sequence
- Facebook Feeling/Activity
* * *
For lectures, workshops, or consulting, please reach out: facetheFACS@melindaozel.com