
AGE ESTIMATION: A CLASSIFICATION PROBLEM HANDE ALEMDAR, BERNA ALTINEL, NEŞE ALYÜZ, SERHAN DANİŞ.


1 AGE ESTIMATION: A CLASSIFICATION PROBLEM HANDE ALEMDAR, BERNA ALTINEL, NEŞE ALYÜZ, SERHAN DANİŞ

2 Project Overview

3 Subset Overview Aging subset of the Bosphorus Database: – 1-4 neutral, frontal 2D images per subject – 105 subjects – 298 scans in total – Age range: [18-60] – Non-uniform age distribution: average = 29.9

4 Project Overview Aging image sequences of individuals are not present Aim: age estimation based on age classes 3 classes: Age <= 26 -> 96 samples, 26 < Age <= 36 -> 161 samples, Age > 36 -> 41 samples

5 Preprocessing – Registration – Cropping – Histogram Equalization – Resizing
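The last two preprocessing steps can be sketched in plain numpy. This is a minimal illustration, not the project's actual pipeline; registration and cropping are omitted, and `hist_equalize` / `resize_nn` are hypothetical helper names:

```python
import numpy as np

def hist_equalize(img):
    """Histogram equalization for an 8-bit grayscale image."""
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = hist.cumsum()
    cdf = cdf / cdf[-1]                       # normalize to [0, 1]
    lut = np.round(255 * cdf).astype(np.uint8)
    return lut[img]                           # remap every pixel via the LUT

def resize_nn(img, out_h, out_w):
    """Nearest-neighbour resize (stand-in for the slide's 'Resizing' step)."""
    h, w = img.shape
    rows = np.arange(out_h) * h // out_h
    cols = np.arange(out_w) * w // out_w
    return img[rows[:, None], cols]

# toy 4x4 "face" image with three gray levels
img = np.array([[0,   0,   100, 100],
                [0,   0,   100, 100],
                [100, 100, 255, 255],
                [100, 100, 255, 255]], dtype=np.uint8)
eq = hist_equalize(img)
small = resize_nn(eq, 2, 2)
```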

6 SUBSPACE ANALYSIS FOR AGE ESTIMATION Neşe Alyüz

7 Age Manifold Instead of learning a subject-specific aging pattern, a common aging trend can be learned. A manifold embedding technique learns the low-dimensional aging trend. Image space: x_i in R^D Labels: l_i Low-dim. representation: y_i in R^d, with d << D Mapping: y = W^T x

8 Orthogonal Locality Preserving Projections - OLPP Subspace learning technique Produces orthogonal basis functions, based on LPP LPP: the essential manifold structure is preserved by measuring local neighborhood distances OLPP vs. PCA for the age manifold: OLPP is supervised, PCA is unsupervised OLPP is better here, since age labels are used for learning Size of the training data for OLPP should be LARGE enough

9 Locality Preserving Projection - LPP aka: Laplacianface approach Linear dimensionality reduction algorithm Builds a graph based on neighborhood information Obtains a linear transformation that preserves the neighborhood information

10 LPP S: similarity matrix defined on data points (weights) L = D – S: graph Laplacian D: diagonal matrix of the column sums of S, D_ii = sum_j S_ij; measures the local density around a sample point Minimization problem: min sum_ij (y_i – y_j)^2 S_ij with the constraint y^T D y = 1 => Minimizing this function ensures that if x_i and x_j are close, then their projections y_i and y_j are also close
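The LPP construction on this and the next slide (similarity matrix, Laplacian, generalized eigenproblem) can be sketched on toy data. A minimal illustration assuming heat-kernel weights between same-label samples, not the project's code:

```python
import numpy as np

# Toy data: columns are samples x_i in R^D (D=3, n=6), two age classes
rng = np.random.default_rng(0)
X = rng.normal(size=(3, 6))
labels = np.array([0, 0, 0, 1, 1, 1])

# Similarity matrix S: heat-kernel weights between same-label samples
t = 1.0
S = np.zeros((6, 6))
for i in range(6):
    for j in range(6):
        if i != j and labels[i] == labels[j]:
            S[i, j] = np.exp(-np.sum((X[:, i] - X[:, j]) ** 2) / t)

D = np.diag(S.sum(axis=1))   # diagonal sum matrix (S is symmetric here)
L = D - S                    # graph Laplacian

# Generalized eigenproblem  X L X^T a = lam X D X^T a
A = X @ L @ X.T
B = X @ D @ X.T
eigvals, eigvecs = np.linalg.eig(np.linalg.solve(B, A))
order = np.argsort(eigvals.real)     # smallest eigenvalues give the LPP basis
W = eigvecs[:, order[:2]].real
Y = W.T @ X                          # 2-D embedding of the 6 samples
```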

11 LPP Generalized eigenvalue problem: X L X^T a = lambda X D X^T a Basis functions are the eigenvectors of: (X D X^T)^-1 X L X^T This matrix is not symmetric, therefore the basis functions are not orthogonal

12 OLPP In LPP, basis functions are nonorthogonal -> reconstruction is difficult OLPP produces orthogonal basis functions -> has more locality preserving power

13 OLPP – Algorithmic Outline (1) Preprocessing: PCA projection (2) Constructing the Adjacency Graph (3) Choosing the Locality Weights (4) Computing the Orthogonal Basis Functions (5) OLPP Embedding

14 (1) Preprocessing: PCA Projection X D X^T can be singular To overcome the singularity problem -> PCA Components whose corresponding eigenvalues are zero are discarded. Transformation matrix: W_PCA Extracted features become statistically uncorrelated
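A minimal numpy sketch of this PCA step, dropping zero-eigenvalue components so that X D X^T becomes nonsingular (`pca_full_rank` is a hypothetical helper name, not from the slides):

```python
import numpy as np

def pca_full_rank(X, tol=1e-10):
    """Project the columns of X onto principal components with nonzero
    eigenvalue, so the covariance of the projected data is full-rank."""
    Xc = X - X.mean(axis=1, keepdims=True)   # center the samples
    C = Xc @ Xc.T / Xc.shape[1]              # D x D covariance matrix
    vals, vecs = np.linalg.eigh(C)
    keep = vals > tol                        # discard zero-eigenvalue components
    W_pca = vecs[:, keep]
    return W_pca, W_pca.T @ Xc

# 5-dimensional data with only 3 samples -> rank-deficient covariance
X = np.random.default_rng(1).normal(size=(5, 3))
W_pca, Xp = pca_full_rank(X)   # centered rank is 2, so 2 components survive
```

Because `eigh` returns orthogonal eigenvectors of the covariance, the projected features are statistically uncorrelated, as the slide states.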

15 (2) Constructing the Adjacency Graph G: a graph with n nodes If face images x_i and x_j share the same label, an edge is placed between their nodes.

16 (3) Choosing the Locality Weights S: weight matrix If nodes i and j are connected: S_ij = exp(-||x_i – x_j||^2 / t) Weights: heat kernel function, models the local structure of the manifold

17 (4) Computing the Orthogonal Basis Functions D: diagonal matrix, column sums of S L: Laplacian matrix, L = D – S Orthogonal basis vectors: {a_1, ..., a_k} Two extra matrices defined: A^(k-1) = [a_1, ..., a_(k-1)] and B^(k-1) = [A^(k-1)]^T (X D X^T)^-1 A^(k-1) Computing the basis vectors: – Compute a_1: eigenvector of (X D X^T)^-1 X S X^T with the greatest eigenvalue – Compute a_k: eigenvector of M^(k) = {I – (X D X^T)^-1 A^(k-1) [B^(k-1)]^-1 [A^(k-1)]^T} (X D X^T)^-1 X S X^T with the greatest eigenvalue

18 (5) OLPP Embedding Let: W_OLPP = [a_1, ..., a_l], W = W_PCA W_OLPP Overall embedding: x -> y = W^T x

19 Subspace Methods: PCA vs. OLPP Face Recognition Results on ORL

20 Subspace Methods: PCA vs. OLPP Face Recognition Results on Aging Subset of the Bosphorus Database Age Estimation (Classification) Results on Aging Subset of the Bosphorus Database

21 FEATURE EXTRACTION: LOCAL BINARY PATTERNS Hande Alemdar

22 Feature Extraction LBP - Local Binary Patterns

23 Local Binary Patterns More formally: LBP(x_c, y_c) = sum_{p=0}^{7} s(g_p – g_c) 2^p, where s(z) = 1 if z >= 0 and s(z) = 0 otherwise For a 3x3 neighborhood we have 256 patterns Feature vector size = 256
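The formula above can be sketched directly for a 3x3 neighborhood; a minimal illustration (function names are ours, not the project's):

```python
import numpy as np

def lbp_code(patch):
    """8-bit LBP code of a 3x3 patch: threshold the 8 neighbours
    at the centre value g_c and weight bit p by 2^p."""
    center = patch[1, 1]
    # neighbours in a fixed circular order starting at the top-left
    neighbours = [patch[0, 0], patch[0, 1], patch[0, 2], patch[1, 2],
                  patch[2, 2], patch[2, 1], patch[2, 0], patch[1, 0]]
    bits = [1 if n >= center else 0 for n in neighbours]
    return sum(b << p for p, b in enumerate(bits))

def lbp_histogram(img):
    """256-bin LBP feature vector accumulated over all interior pixels."""
    h, w = img.shape
    hist = np.zeros(256)
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            hist[lbp_code(img[i - 1:i + 2, j - 1:j + 2])] += 1
    return hist

img = np.array([[5, 5, 5],
                [5, 9, 5],
                [5, 5, 5]])
code = lbp_code(img)      # every neighbour is below the centre -> code 0
hist = lbp_histogram(img)
```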

24 Uniform LBP Uniform patterns can be used to reduce the length of the feature vector and implement a simple rotation-invariant descriptor If the binary pattern contains at most two bitwise transitions from 0 to 1 or vice versa when the bit pattern is traversed circularly -> uniform – 01110000 is uniform (2 transitions) – 00111000 (2 transitions) – 00011100 (2 transitions) For a 3x3 neighborhood we have 58 uniform patterns Feature vector size = 59 (58 uniform bins + 1 bin for all non-uniform patterns)
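The circular-transition test can be written in a few lines, and it confirms the slide's count of 58 uniform 8-bit patterns (a minimal sketch, not the project's code):

```python
def is_uniform(code):
    """True if the 8-bit pattern has at most two 0/1 transitions
    when its bits are traversed circularly."""
    bits = [(code >> p) & 1 for p in range(8)]
    transitions = sum(bits[p] != bits[(p + 1) % 8] for p in range(8))
    return transitions <= 2

uniform_count = sum(is_uniform(c) for c in range(256))
```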

25 FEATURE EXTRACTION: GABOR FILTERING Serhan Daniş

26 Gabor Filter Band-pass filters used for feature extraction, texture analysis and stereo disparity estimation. Can be designed for a number of dilations and rotations. The filters with various dilations and rotations are convolved with the signal, resulting in a so-called Gabor space. This process is closely related to processes in the primary visual cortex.

27 Gabor Filter A set of Gabor filters with different frequencies and orientations may be helpful for extracting useful features from an image. We used 6 different rotations and 4 different scales on 16 overlapping patches of the images. We generate 768 features for each image.
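A toy version of such a filter bank can be sketched in numpy. This is a simplified stand-in for the slide's 6-rotation x 4-scale, 16-patch setup: it treats the whole image as a single patch and keeps one mean-response feature per filter, so it yields 24 features rather than 768. The kernel parameters and helper names are illustrative assumptions:

```python
import numpy as np

def gabor_kernel(size, theta, sigma, lam):
    """Real (cosine) Gabor kernel: a Gaussian-windowed sinusoid
    oriented at angle theta with wavelength lam."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    return (np.exp(-(xr**2 + yr**2) / (2 * sigma**2))
            * np.cos(2 * np.pi * xr / lam))

def gabor_features(img, rotations=6, scales=4):
    """Mean filter response per (rotation, scale) pair."""
    feats = []
    h, w = img.shape
    for r in range(rotations):
        for s in range(scales):
            k = gabor_kernel(7, theta=r * np.pi / rotations,
                             sigma=1.5 * (s + 1), lam=2.0 * (s + 1))
            # 'valid' convolution by direct summation over interior windows
            resp = [np.sum(img[i:i + 7, j:j + 7] * k)
                    for i in range(h - 6) for j in range(w - 6)]
            feats.append(np.mean(resp))
    return np.array(feats)

img = np.random.default_rng(2).normal(size=(12, 12))
f = gabor_features(img)   # 6 rotations x 4 scales = 24 features
```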

28 CLASSIFICATION Berna Altınel

29 EXPERIMENTAL DATASETS 1. FEATURES_50_45(LBP) 2. FEATURES_100_90(LBP) 3. FEATURES_ORIG(LBP) 4. FEATURES_50_45(GABOR) 5. FEATURES_100_90(GABOR)

30 Experiment #1 Estimate the age based only on the average age of the training set
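This baseline and the MRE metric reported on the next slide can be sketched as follows (a minimal illustration with made-up toy ages; `mean_baseline_mre` is a hypothetical helper name):

```python
import numpy as np

def mean_baseline_mre(train_ages, test_ages):
    """Predict the training-set mean age for every test sample and
    report the mean relative error (MRE) in percent."""
    pred = np.mean(train_ages)
    rel_err = np.abs(test_ages - pred) / test_ages
    return 100 * np.mean(rel_err)

train = np.array([20, 25, 30, 35, 40])   # mean = 30
test = np.array([30, 60])
mre = mean_baseline_mre(train, test)     # errors 0% and 50% -> MRE 25%
```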

31 EXPERIMENTAL RESULTS:

INPUT                   METHOD                       Mean MRE (%)   Correct     Miss
Features_50_45 (LBP)    Estimating the Average Age   17.31          162 / 298   135 / 298
Features_100_90 (LBP)
Features_orig (LBP)
Features_50_45 (GABOR)
Features_100_90 (GABOR)

32 Experiment #2 K-NEAREST-NEIGHBOR ALGORITHM The k-nearest-neighbor (kNN) algorithm measures the distance between a query sample and every sample in the training set, and predicts from the k closest ones.
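A minimal numpy sketch of such a kNN classifier with Euclidean distance (toy 2-D data; not the project's feature vectors):

```python
import numpy as np

def knn_predict(X_train, y_train, x, k):
    """Predict the label of x by majority vote among its k nearest
    training samples under Euclidean distance."""
    dists = np.linalg.norm(X_train - x, axis=1)
    nearest = np.argsort(dists)[:k]
    labels, counts = np.unique(y_train[nearest], return_counts=True)
    return labels[np.argmax(counts)]

# toy 2-class data: class 0 near the origin, class 1 near (5, 5)
X = np.array([[0., 0.], [0., 1.], [1., 0.], [5., 5.], [5., 6.], [6., 5.]])
y = np.array([0, 0, 0, 1, 1, 1])
pred = knn_predict(X, y, np.array([0.5, 0.5]), k=3)
```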

33 EXPERIMENTAL RESULTS: MRE (%) per input and method:

METHOD              F_50_45 (LBP)   F_100_90 (LBP)   F_orig (LBP)   F_50_45 (GABOR)   F_100_90 (GABOR)
kNN-1 (Euc dist)    5.05            4.14             11.11          3.88              3.75
kNN-2 (Euc dist)    6.77            5.17             11.97          4.92              5.08
kNN-3 (Euc dist)    7.50            6.06             12.50          5.79              5.96
kNN-5 (Euc dist)    10.40           9.36             13.15          11.02             10.93
kNN-10 (Euc dist)   12.34           11.57            14.16          13.86             14.13
kNN-15 (Euc dist)   12.85           12.30            14.35          14.57             14.98

34
METHOD: Average      F_50_45 (LBP)   F_100_90 (LBP)   F_orig (LBP)   F_50_45 (GABOR)   F_100_90 (GABOR)
MRE (%)              15.72           15.01            15.58          16.31             16.55
MissClass            35 / 298        32 / 298         31 / 298       32 / 298          27 / 298
CorrectClass         262 / 298       265 / 298        266 / 298      265 / 298         270 / 298

35 IN PROGRESS: 1. Parametric classification 2. Mahalanobis distance can be used as the distance measure in kNN.
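The Mahalanobis distance mentioned above can be sketched in a few lines (a minimal illustration with a toy covariance, not the project's implementation):

```python
import numpy as np

def mahalanobis(x, y, cov):
    """Mahalanobis distance between x and y: sqrt((x-y)^T cov^-1 (x-y)).
    Unlike Euclidean distance, it rescales each direction by the data's
    variance, so high-variance features count for less."""
    d = x - y
    return float(np.sqrt(d @ np.linalg.inv(cov) @ d))

cov = np.array([[4.0, 0.0],    # first feature has variance 4
                [0.0, 1.0]])   # second feature has variance 1
x = np.array([2.0, 0.0])
y = np.array([0.0, 0.0])
dist = mahalanobis(x, y, cov)  # sqrt(2^2 / 4) = 1.0
```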

36 POSSIBLE FUTURE WORK ITEMS: 1. Other distance functions can be analyzed for kNN 2. Normalization can be applied

