Project 4: Facial Image Analysis with Support Vector Machines


Project 4: Facial Image Analysis with Support Vector Machines Catherine Nansalo and Garrett Bingham

Outline
- Introduction to the Data: FG-NET database
- Support Vector Machines: overview, kernels and other parameters
- Results: classifying gender, predicting age
- Conclusions
- Questions
- References

Introduction to the Data: FG-NET
- 1002 images, ages 0-69
- 82 people (47 males, 35 females), 6-18 images per person
The FG-NET database is "...funded by the E.C.IST program. The objectives of FG-NET are to encourage development of a technology for face and gesture recognition." (See http://www-prima.inrialpes.fr/FGnet/)

Support Vector Machine (SVM)
- Finds the hyperplane that best divides the data into classes.
- Support vectors are the points nearest to the hyperplane.
- The distance between the hyperplane and the nearest data point from either class is known as the margin.
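The idea above can be sketched with scikit-learn on toy 2-D data (the actual FG-NET features are not reproduced here; the clusters and labels below are made up for illustration):

```python
import numpy as np
from sklearn.svm import SVC

# Toy 2-D data (not the FG-NET features): two separable clusters.
X = np.array([[1.0, 1.0], [2.0, 1.0], [1.0, 2.0],
              [5.0, 5.0], [6.0, 5.0], [5.0, 6.0]])
y = np.array([0, 0, 0, 1, 1, 1])

clf = SVC(kernel="linear", C=1.0)
clf.fit(X, y)

# The support vectors are the training points nearest the hyperplane;
# the margin is the distance from the hyperplane to these points.
print(clf.support_vectors_)
print(clf.predict([[2.0, 2.0], [6.0, 6.0]]))
```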

But what happens when there is no clear hyperplane? Map the data from 2D into a higher dimension where a separating hyperplane exists. This is the kernel trick!
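A minimal illustration of the kernel trick, again with toy data rather than the FG-NET features: XOR-style points cannot be split by any straight line in 2-D, but an RBF kernel separates them in its implicit higher-dimensional space.

```python
import numpy as np
from sklearn.svm import SVC

# XOR-style data: no single straight line in 2-D separates the classes.
X = np.array([[0.0, 0.0], [1.0, 1.0], [0.0, 1.0], [1.0, 0.0]])
y = np.array([0, 0, 1, 1])

# The RBF kernel implicitly maps the points into a higher-dimensional
# space where a separating hyperplane exists (gamma and C chosen by hand).
clf = SVC(kernel="rbf", gamma=2.0, C=100.0)
clf.fit(X, y)
print(clf.predict(X))  # recovers the training labels
```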

Kernels: Linear, Polynomial, Radial

Results: Linear Kernel
- Gender: cost = 1
- Age: cost = 100

Results: Polynomial Kernel
- Useful when the boundary separating the classes is not linear.
- Optimal parameter values are unknown, so a grid search tests different combinations and evaluates them using 5-fold cross validation.
- Degree = 3 offers the highest accuracy in gender prediction as well as the lowest error in age estimation.
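A grid search of this kind can be sketched with scikit-learn's GridSearchCV; the data and the parameter values below are stand-ins, not the grid actually used for FG-NET:

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

# Stand-in data; the real inputs would be features from the FG-NET images.
X, y = make_classification(n_samples=100, n_features=10, random_state=0)

# Exhaustive grid over the polynomial kernel's parameters,
# with each combination scored by 5-fold cross validation.
param_grid = {"C": [0.01, 0.1, 1, 10],
              "degree": [2, 3, 4],
              "gamma": [0.1, 1, 10]}
search = GridSearchCV(SVC(kernel="poly"), param_grid, cv=5)
search.fit(X, y)
print(search.best_params_, round(search.best_score_, 3))
```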

Support Vector Machines: Polynomial Kernel
The heatmaps show how different choices of C and Gamma affect model performance for gender classification and age estimation.
Best parameters (for both models): C = 0.01, Gamma = 10
- Age estimation: Mean Absolute Error = 7.63
- Gender classification: prediction accuracy = 0.70

Support Vector Machines: Radial Kernel
- C controls the bias-variance tradeoff.
- Gamma controls the radius of influence of the support vectors.
  - Too large: the influence region includes only the support vector itself, and no amount of regularization with C will prevent overfitting.
  - Too small: the model is too constrained and cannot capture the complexity or "shape" of the data.
  - Intermediate: the model performs equally well with a large C. The radial kernel acts as a structural regularizer, so it is not crucial to limit the number of support vectors.
- The best values are found on the diagonal: smooth models (lower gamma) are made more complex by selecting more support vectors (larger C).
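The effect of gamma can be seen in a small sketch on synthetic data (half-moons, not FG-NET); the gamma values are chosen only to span the underfit/overfit extremes:

```python
from sklearn.datasets import make_moons
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

# Illustrative data only: two interleaved half-moons.
X, y = make_moons(n_samples=200, noise=0.2, random_state=0)

# A very large gamma memorizes the training set (near-perfect train
# accuracy, weaker cross-validated accuracy); a very small gamma underfits.
for gamma in [0.001, 1.0, 1000.0]:
    clf = SVC(kernel="rbf", C=1.0, gamma=gamma)
    train_acc = clf.fit(X, y).score(X, y)
    cv_acc = cross_val_score(clf, X, y, cv=5).mean()
    print(f"gamma={gamma}: train={train_acc:.2f}, cv={cv_acc:.2f}")
```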

5-fold Cross Validation Comparison: Radial Kernel
Different methods of cross validation result in different recommended parameters. Not accounting for this could prevent our model from generalizing well to future data.
- By image: mean accuracy = 0.81 (C = 1000, Gamma = 0.01)
- By person: mean accuracy = 0.76 (Gamma = 0.001)
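The "by person" scheme corresponds to group-wise cross validation, which scikit-learn provides as GroupKFold; the features and group sizes below are stand-ins for the FG-NET setup:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import GroupKFold, KFold, cross_val_score
from sklearn.svm import SVC

# Stand-in features; `groups` plays the role of the person ID, so that
# all images of one person land in the same fold ("by person" CV).
X, y = make_classification(n_samples=120, n_features=8, random_state=0)
groups = np.repeat(np.arange(20), 6)  # 20 "people" x 6 "images" each

clf = SVC(kernel="rbf", C=1000, gamma=0.01)
by_image = cross_val_score(clf, X, y, cv=KFold(5, shuffle=True, random_state=0))
by_person = cross_val_score(clf, X, y, cv=GroupKFold(n_splits=5), groups=groups)
print(by_image.mean(), by_person.mean())
```

With real face data the by-person score is typically the more honest estimate, since no person appears in both the training and test folds.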

Results: Classifying Gender

Kernel      Accuracy (5-fold CV)  Accuracy (LOPO CV)
Linear      0.715                 0.730
Polynomial  0.682                 0.719
Radial      0.723                 0.742

SVM performs slightly better than the other classifiers we have tested in the past.

Results: Predicting Age
After tuning, the choice of kernel does not seem to have a great effect on model performance. The Cumulative Score (CS) is the proportion of images whose predicted ages fall within j years of their actual ages.
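The Cumulative Score is simple to compute directly; the ages below are made-up example values:

```python
import numpy as np

def cumulative_score(actual, predicted, j):
    """Proportion of images whose predicted age is within j years of the actual age."""
    errors = np.abs(np.asarray(predicted) - np.asarray(actual))
    return float(np.mean(errors <= j))

actual = [10, 25, 40, 55]
predicted = [12, 25, 48, 50]
print(cumulative_score(actual, predicted, j=5))  # 3 of 4 within 5 years -> 0.75
```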

Results: Predicting Age

Kernel      MAE (5-fold CV)  MAE (LOPO CV)
Linear      4.172666         3.415833
Polynomial  3.991509         4.845455
Radial      3.989279         3.578097

The Mean Absolute Error (MAE) is the average absolute difference between predicted age and actual age across all images.
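For reference, the MAE computation on the same made-up ages as above:

```python
import numpy as np

def mean_abs_error(actual, predicted):
    """Average absolute difference between predicted and actual ages."""
    return float(np.mean(np.abs(np.asarray(predicted) - np.asarray(actual))))

actual = [10, 25, 40, 55]
predicted = [12, 25, 48, 50]
print(mean_abs_error(actual, predicted))  # (2 + 0 + 8 + 5) / 4 = 3.75
```

scikit-learn's `sklearn.metrics.mean_absolute_error` computes the same quantity.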

Conclusions
- Grid search is a useful tool for selecting optimal SVM parameters.
- These parameters are sensitive to the method of cross validation.
- The method of cross validation can affect overall model performance.
- Different kernels are better suited to different problems.

Questions
- Does it matter whether the degree of the polynomial kernel is odd or even?
- What happens as the degree of the polynomial kernel grows large?
- Are there more efficient methods of parameter selection besides an exhaustive grid search?

References
- http://scikit-learn.org/stable/auto_examples/svm/plot_rbf_parameters.html
- Slide 3: https://www.researchgate.net/figure/220057621_fig1_Figure-1-Sample-images-from-the-FG-NET-Aging-database
- Slides 4-5: http://blog.aylien.com/support-vector-machines-for-dummies-a-simple/
- Slide 6: http://perclass.com/doc/guide/classifiers/svm.html, http://www.eric-kim.net/eric-kim-net/posts/1/kernel_trick.html
- James, G., Witten, D., Hastie, T., and Tibshirani, R. An Introduction to Statistical Learning with Applications in R.

Thank you!