LYU0603 A Generic Real-Time Facial Expression Modelling System
Supervisor: Prof. Michael R. Lyu
Group Members: Cheung Ka Shun (05521661), Wong Chi Kin (05524554)
Outline
- Project Overview
- Motivation
- Objective
- System Architecture
- Face Coordinate Filter
- Implementation
  - Facial expression analysis
  - Face modelling
- Demonstration
- Future Work
- Conclusion
Project Overview
- Detect the user's facial expression
- Draw the corresponding model
Motivation
- Face recognition technology has become more common
- Webcams now offer sufficiently high resolution
- Computation power is now sufficient
Objective
- Enrich the functionality of the webcam
- Make net-meeting more interesting
- Users are not required to pay extra for specific hardware
- Recognize human faces generically
System Architecture
Face Coordinate Filter
- Our system is built on top of this filter
- Input: video source
- Output: the coordinates of the face-mesh vertices
Implementation – Face outline
Implementation – Calibration
- Where face mesh coordinates are mapped to pixel coordinates
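A minimal sketch of the calibration step described here, assuming a simple scale-and-offset mapping from the filter's face-mesh units to video-frame pixels is sufficient; the structure and field names are illustrative, not the project's actual code.

// Hypothetical calibration sketch: map a face-mesh coordinate (mesh units)
// to a pixel coordinate in the video frame using a scale and an offset.
// The affine form and the parameter names are assumptions for illustration.
struct Point2f { float x, y; };

struct Calibration {
    float scaleX, scaleY;    // mesh units -> pixels
    float offsetX, offsetY;  // mesh origin -> pixel origin

    Point2f ToPixel(const Point2f& mesh) const {
        return { mesh.x * scaleX + offsetX,
                 mesh.y * scaleY + offsetY };
    }
};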
Implementation – Facial Expression Analysis
- We assume that the face coordinate filter works properly
- Detect eye blinking and mouth opening from the coordinates
- With sample movies, record the coordinate changes
- Plot graphs
- Statistical analysis
Implementation – Facial Expression Analysis
- Part of the face mesh: eye
- Using vertex pairs (33, 41), (34, 40), (35, 39)
Implementation – Facial Expression Analysis
Part of the face mesh: mouth
- Using vertex pairs (69, 77), (70, 76), (71, 75) – outer bound of the lips
- Using vertex pairs (93, 99), (94, 98), (95, 97) – inner bound of the lips
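As an illustration of how the listed vertex pairs could be reduced to a single openness measure, the sketch below averages the distance of each pair; the vertex indices come from these slides, but the helper name and the plain Euclidean distance are assumptions.

#include <cmath>
#include <cstddef>

struct Point2f { float x, y; };

// Hypothetical helper: average distance over a set of vertex pairs,
// e.g. the eye pairs {33,41},{34,40},{35,39} or the inner-lip pairs
// {93,99},{94,98},{95,97}. 'vertices' is assumed to hold the face-mesh
// coordinates reported by the face coordinate filter.
float AveragePairDistance(const Point2f* vertices,
                          const int (*pairs)[2], std::size_t pairCount) {
    float sum = 0.0f;
    for (std::size_t i = 0; i < pairCount; ++i) {
        const Point2f& a = vertices[pairs[i][0]];
        const Point2f& b = vertices[pairs[i][1]];
        sum += std::sqrt((a.x - b.x) * (a.x - b.x) + (a.y - b.y) * (a.y - b.y));
    }
    return sum / static_cast<float>(pairCount);
}

// Pair tables taken from the slides.
const int kEyePairs[][2]      = { {33, 41}, {34, 40}, {35, 39} };
const int kInnerLipPairs[][2] = { {93, 99}, {94, 98}, {95, 97} };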
Implementation – Facial Expression Analysis
- The distance between the two vertices of a pair can fall below 1 unit
- Other factors also affect this difference:
  - the distance between the camera and the user
  - the user moving his or her head quickly
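One way to compensate for the camera-distance factor mentioned above is to normalise the measured gap by a reference length on the same face; this normalisation and the ratio threshold are assumptions, not necessarily the project's method.

// Hypothetical normalisation: divide the measured eye gap by a reference
// length on the same face (e.g. the distance between two stable eye-corner
// vertices), so that moving towards or away from the camera scales both
// the same way. The 0.05 ratio is a placeholder threshold, not a measured value.
bool IsEyeClosed(float eyeGap, float referenceLength, float closedRatio = 0.05f) {
    if (referenceLength <= 0.0f) return false;  // avoid division by zero
    return (eyeGap / referenceLength) < closedRatio;
}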
Implementation – Facial Expression Analysis
Three methods:
- Competitive Area Ratio
- Horizontal Eye-Balance
- Nearest-Colour Convergence
Competitive Area Ratio
- To detect whether the mouth is open or closed
Competitive Area Ratio
- We can get the area of the triangle from its vertex coordinates (see the sketch below)
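The formula itself appears only as an image in the deck; a standard way to compute the area of a triangle from three 2-D vertices is the cross-product (shoelace) form below, and one plausible reading of the "competitive" ratio is a comparison of two such areas against a threshold. Which triangles are compared and the threshold value are assumptions.

#include <cmath>

struct Point2f { float x, y; };

// Area of the triangle (a, b, c) via the 2-D cross product (shoelace formula).
float TriangleArea(const Point2f& a, const Point2f& b, const Point2f& c) {
    return 0.5f * std::fabs((b.x - a.x) * (c.y - a.y) -
                            (c.x - a.x) * (b.y - a.y));
}

// Hypothetical "competitive" ratio: compare the area spanned by the inner-lip
// vertices against the area spanned by the outer-lip vertices and declare the
// mouth open when the ratio grows past a threshold (0.25 is a guess).
bool IsMouthOpen(float innerLipArea, float outerLipArea, float openRatio = 0.25f) {
    if (outerLipArea <= 0.0f) return false;
    return (innerLipArea / outerLipArea) > openRatio;
}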
Horizontal Eye-Balance
- To detect whether the head is inclined
Horizontal Eye-Balance Approach I
Horizontal Eye-Balance Approach I However…
Horizontal Eye-Balance Approach II
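Both approaches are shown only as figures in the deck; purely as an illustration of checking whether the eyes stay level, the sketch below measures the inclination of the line joining one left-eye and one right-eye landmark. The chosen landmarks and the angle threshold are assumptions.

#include <cmath>

struct Point2f { float x, y; };

// Hypothetical inclination test: angle (in degrees) of the line joining one
// left-eye landmark and one right-eye landmark, relative to the horizontal.
float EyeLineAngleDegrees(const Point2f& leftEye, const Point2f& rightEye) {
    return std::atan2(rightEye.y - leftEye.y, rightEye.x - leftEye.x)
           * 180.0f / 3.14159265f;
}

bool IsHeadInclined(const Point2f& leftEye, const Point2f& rightEye,
                    float maxLevelDegrees = 10.0f) {  // placeholder threshold
    return std::fabs(EyeLineAngleDegrees(leftEye, rightEye)) > maxLevelDegrees;
}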
Nearest-Colour Convergence
- Retrieve the pixel colours in the specific area
- Split each pixel colour into three channels (RGB)
- Take the average value in each channel
Nearest-Colour Convergence
- Colour space difference:
- The eye is closed if:
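The difference formula and the closed-eye condition appear only as images on the slide; shown here purely as an assumption, a common choice is the Euclidean distance in RGB space, with the eye treated as closed when the sampled average converges to the eyelid/skin reference colour rather than the open-eye colour.

#include <cmath>

struct Rgb { float r, g, b; };  // per-channel averages over the sampled area

// Euclidean distance in RGB space (assumed form of the "colour space difference").
float ColourDistance(const Rgb& a, const Rgb& b) {
    return std::sqrt((a.r - b.r) * (a.r - b.r) +
                     (a.g - b.g) * (a.g - b.g) +
                     (a.b - b.b) * (a.b - b.b));
}

// Hypothetical closed-eye test: the averaged sample is nearer to the eyelid/skin
// reference colour than to the open-eye (iris/sclera) reference colour.
bool IsEyeClosedByColour(const Rgb& sample, const Rgb& skinRef, const Rgb& eyeRef) {
    return ColourDistance(sample, skinRef) < ColourDistance(sample, eyeRef);
}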
Direct3D
- The characters that will be used for modelling in the system
Texture Sampler
- Eye closed
- Mouth opened
Texture Sampler
- Pre-production of the images
Texture Sampler
- Loading the texture
- Mapping object coordinates to texel coordinates
Texture Sampler
- Prepare the index buffer (see the sketch below)
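A minimal Direct3D 9 sketch of the three steps on these Texture Sampler slides: loading the texture, declaring vertices that carry texel (u, v) coordinates, and creating the index buffer. The FVF layout, file name, formats and sizes are placeholders rather than the project's actual values.

#include <d3d9.h>
#include <d3dx9.h>

// Vertex carrying an object-space position and a texel (u, v) coordinate.
struct TexturedVertex {
    float x, y, z;   // object coordinates
    float u, v;      // texel coordinates
};
const DWORD kTexturedVertexFVF = D3DFVF_XYZ | D3DFVF_TEX1;

// Hypothetical setup; the texture file name and buffer parameters are placeholders.
HRESULT SetupTextureAndIndexBuffer(IDirect3DDevice9* device,
                                   IDirect3DTexture9** texture,
                                   IDirect3DIndexBuffer9** indexBuffer,
                                   UINT indexCount) {
    // Load the pre-produced face texture image.
    HRESULT hr = D3DXCreateTextureFromFile(device, TEXT("face_texture.bmp"), texture);
    if (FAILED(hr)) return hr;

    // Prepare the index buffer that stitches the face-mesh vertices into triangles.
    return device->CreateIndexBuffer(static_cast<UINT>(indexCount * sizeof(WORD)),
                                     D3DUSAGE_WRITEONLY, D3DFMT_INDEX16,
                                     D3DPOOL_MANAGED, indexBuffer, NULL);
}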
Facial Expression Modelling
- Update the object coordinates
- Normalize the coordinate geometry
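As a sketch of the per-frame update these bullets describe, the snippet below writes the detected face coordinates into a Direct3D 9 vertex buffer, normalising x and y into [-1, 1] from the frame size (and assuming identity world/view/projection transforms); the locking pattern and normalisation range are assumptions.

#include <d3d9.h>

struct TexturedVertex { float x, y, z; float u, v; };

// Hypothetical per-frame update: copy the detected face coordinates into the
// vertex buffer, normalising x/y into [-1, 1] using the known frame size.
// The texel coordinates (u, v) are assumed to be written once at initialisation
// and are left untouched here.
HRESULT UpdateFaceVertices(IDirect3DVertexBuffer9* vb,
                           const float* faceX, const float* faceY, int count,
                           float frameWidth, float frameHeight) {
    TexturedVertex* v = NULL;
    HRESULT hr = vb->Lock(0, static_cast<UINT>(count * sizeof(TexturedVertex)),
                          (void**)&v, 0);
    if (FAILED(hr)) return hr;
    for (int i = 0; i < count; ++i) {
        v[i].x = faceX[i] / frameWidth  * 2.0f - 1.0f;  // normalise to [-1, 1]
        v[i].y = 1.0f - faceY[i] / frameHeight * 2.0f;  // flip y (image y grows downwards)
        v[i].z = 0.0f;                                  // flat 2-D face texture
    }
    return vb->Unlock();
}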
Our System
Demonstration
- We are going to play a movie clip which demonstrates our system
Future Work
- Improve the precision of face detection
- Use a 3-D model instead of a 2-D texture
- Allow net-meeting software to use the system
Conclusion
- We have learnt to work with DirectShow and Direct3D
- We have developed several methods to detect facial expressions
End Thank you! Q&A