Slide 1: Feature extraction techniques to use in cereal classification
Ole Mathis Kruse, IMT, Department of Mathematical Sciences and Technology, Norwegian University of Life Sciences

Slide 2: Problem
 Is it possible to discriminate between different species or varieties of cereal grains using image analysis?
[Images: Barley, Oat, Wheat; wheat varieties Mjølner and Bjørke]

Slide 3: The images
 Images are taken with a high-resolution (1280 x 1024) digital camera.
 Three images (rotated 120°) of each sample.
 Each image is split into four quadrants (a sketch follows below):
 – A larger data material.
 – Easy implementation of cross-validation.
 204 images for the species analysis.
 84 images for the variety analysis.
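The quadrant split can be done with plain array slicing. Below is a minimal sketch in Python/NumPy; the function name and the random placeholder image are illustrative, not from the original work:

```python
import numpy as np

def split_into_quadrants(image):
    """Split a 2-D grayscale image into its four quadrants.

    Each 1280 x 1024 capture becomes four sub-images, enlarging the
    data material and making cross-validation easier to set up.
    """
    h, w = image.shape[0] // 2, image.shape[1] // 2
    return [image[:h, :w], image[:h, w:], image[h:, :w], image[h:, w:]]

# Example: 3 rotations x 4 quadrants = 12 sub-images per sample.
capture = np.random.randint(0, 256, size=(1024, 1280), dtype=np.uint8)
quadrants = split_into_quadrants(capture)
assert len(quadrants) == 4 and quadrants[0].shape == (512, 640)
```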

Slide 4: Texture analysis
 How do we discriminate the samples?
 Single-grain analysis:
 – Measure size, roundness, colour etc. for each grain.
 – Extracting single grains is difficult.
 Texture analysis:
 – Finds features for the texture of the image surface.
 – Easy to implement.
 – Several different features available.

Slide 5: Feature detectors
 Some examples of feature detectors:
 – Angle Measure Technique (AMT): the image is unfolded into a one-dimensional intensity vector. A circle is placed at random points on the vector, and the angles between its intersections with the vector are calculated; parameters are then derived from these angles. (From: Esbensen, Hjelmen & Kvaal, 1996)
 – Histogram statistics: several (~10) statistical parameters are calculated for each image, some of them based on the intensity distribution of the image (sketched below).
[Images: Barley, Oat]
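The slides do not list which ~10 histogram parameters were used, so the following sketch picks a common set (moments, entropy, percentiles) purely for illustration:

```python
import numpy as np
from scipy import stats

def histogram_features(image):
    """Ten intensity-distribution features for one grayscale image.

    The particular parameters chosen here are an assumption; the
    original slides only say '~10 statistical parameters'.
    """
    pixels = image.ravel().astype(float)
    hist, _ = np.histogram(pixels, bins=256, range=(0, 255))
    p = hist[hist > 0] / hist.sum()            # nonzero bin probabilities
    return np.array([
        pixels.mean(),
        pixels.std(),
        stats.skew(pixels),
        stats.kurtosis(pixels),
        -np.sum(p * np.log2(p)),               # Shannon entropy of histogram
        np.median(pixels),
        np.percentile(pixels, 10),
        np.percentile(pixels, 90),
        pixels.min(),
        pixels.max(),
    ])
```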

Slide 6: Feature detectors, continued
 Gray Level Co-occurrence Matrix (GLCM): builds a new matrix that counts the occurrences of the different neighbouring-pixel relations. Four statistical parameters are calculated from the GLCM.
 Singular Value Decomposition (SVD): from linear algebra, a matrix M (e.g. an image) can be factorized as M = USV*. U, S and V* are matrices which capture different characteristics of the image. We use only S, which has nonzero values only on its main diagonal; these nonzero values are the singular values of M and can be thought of as scalar "gain controls."
 There are many other methods, as well as variants of the methods shown here. (Both detectors on this slide are sketched below.)
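Both detectors are available off the shelf: scikit-image provides the GLCM and NumPy the SVD. The sketch below assumes uint8 grayscale input; the choice of the four GLCM statistics (contrast, correlation, energy, homogeneity) and the number of singular values kept are assumptions, as the slides do not specify them:

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops

def glcm_features(image):
    """Four GLCM statistics for a uint8 grayscale image.

    Horizontal neighbours at distance 1; the four properties are a
    common choice, not necessarily the ones used in the original work.
    """
    glcm = graycomatrix(image, distances=[1], angles=[0],
                        levels=256, symmetric=True, normed=True)
    props = ("contrast", "correlation", "energy", "homogeneity")
    return np.array([graycoprops(glcm, p)[0, 0] for p in props])

def svd_features(image, k=50):
    """Leading k singular values of the image matrix M = USV*.

    Only the diagonal of S is kept, as described on the slide.
    """
    s = np.linalg.svd(image.astype(float), compute_uv=False)
    return s[:k]
```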

Slide 7: Output from feature detectors
 Every method yields a matrix with one row per image (Image 1 ... Image n) and from 4 up to thousands of columns (features V1 ... Vk), as sketched below.
 A complicated dataset with many variables.
 Difficult to analyse with univariate statistics (e.g. ANOVA).
 Must use multivariate techniques.
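Stacking one feature vector per image gives exactly the n x k matrix described above; a one-liner, assuming feature_fn is one of the detectors sketched earlier:

```python
import numpy as np

def feature_matrix(images, feature_fn):
    """One row per image, one column per feature (V1 ... Vk)."""
    return np.vstack([feature_fn(img) for img in images])
```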

Slide 8: Multivariate statistics
 Multivariate statistics analyses all variables at the same time and finds patterns that describe the variability.
 The covariance between variables is taken into account.
 Several different multivariate statistical methods exist, for example:
 – Principal Component Analysis (PCA)
 – Partial Least Squares Discriminant Analysis (PLSDA)

Slide 9: Principal Component Analysis (PCA)
 Typical result from a PCA analysis (a minimal sketch follows below).
[Figure: PCA result; not preserved in the transcript]
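A minimal PCA sketch using scikit-learn, with random placeholder data standing in for a real feature matrix; the scaling step and component count are illustrative choices, not taken from the original work:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

X = np.random.rand(204, 50)                    # placeholder feature matrix

X_scaled = StandardScaler().fit_transform(X)   # centre and scale each feature
pca = PCA(n_components=2)
scores = pca.fit_transform(X_scaled)           # coordinates for a score plot
print(pca.explained_variance_ratio_)           # variance captured by PC1, PC2
```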

Slide 10: Partial Least Squares Discriminant Analysis (PLSDA)
 PLS is a multivariate regression technique.
 PLSDA allows discrimination/classification of the data (sketched below).
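scikit-learn has no dedicated PLS-DA class, but the standard construction is PLS regression onto one-hot class indicators, predicting the class with the largest predicted indicator. A sketch under that assumption (the component count is arbitrary here):

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.preprocessing import LabelBinarizer

def plsda_fit_predict(X_train, y_train, X_test, n_components=5):
    """PLS-DA as PLS regression on 0/1 class-indicator columns.

    Assumes three or more classes (with two classes LabelBinarizer
    returns a single column and the argmax below would need changing).
    """
    lb = LabelBinarizer()
    Y_train = lb.fit_transform(y_train)        # classes -> indicator matrix
    pls = PLSRegression(n_components=n_components)
    pls.fit(X_train, Y_train)
    Y_pred = pls.predict(X_test)               # continuous class scores
    return lb.classes_[np.argmax(Y_pred, axis=1)]
```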

Slide 11: Classification results from different feature detectors
 Compare the PLSDA classification with the known species/variety.
 Calculate the % correct classification for each species/variety for each feature detector (see the sketch below).
[Table 1: % correct classification per species (Barley, Oat, Wheat) for AMT, Histogram statistics, GLCM and SVD; of the numeric values, only an SVD entry of 100 survives in the transcript]
[Table 2: % correct classification per wheat variety (Bjørke, Magnifik, Mjølner, Olivin, Polka) for the same detectors; values not preserved in the transcript]
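The per-class percentages in the tables follow directly from comparing predicted and known labels; a small helper, with names chosen purely for illustration:

```python
import numpy as np

def percent_correct_per_class(y_true, y_pred, classes):
    """Percent of correctly classified images for each class label."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    return {c: 100.0 * np.mean(y_pred[y_true == c] == c) for c in classes}

# e.g. percent_correct_per_class(y_test, predictions, ["Barley", "Oat", "Wheat"])
```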

Slide 12: Summary
 Using 12 images of each sample gives better data quality.
 Feature detectors are used to extract texture information from the images.
 Multivariate statistics are used to analyse the feature data.
 The cereal grain species are classified quite well.
 SVD is the best detector.
 Classification of the wheat varieties depends strongly on both the variety and the feature detector.