High resolution product by SVM. L'Aquila experience and prospects for the validation site. R. Anniballe, DIET - Sapienza University of Rome.



Outline
- SVM theory: main concepts
- Data fusion by SVMs
- Data fusion by SVMs: the L'Aquila test case (methodologies and results)
- Prospects for the validation site

SVM for data classification
A Support Vector Machine (SVM) is a supervised machine learning algorithm developed for solving binary classification problems. During the training phase, an SVM maps the input patterns into a higher-dimensional feature space and finds an optimal hyperplane separating the patterns belonging to different classes in that space. In the test phase, unknown samples are mapped into the same higher-dimensional feature space and classified according to their position with respect to the optimal hyperplane.
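A minimal sketch of this train/test workflow, assuming scikit-learn; the toy data, the RBF kernel and the parameter C are illustrative assumptions, not the processing chain used in the study:

import numpy as np
from sklearn.svm import SVC

# Toy binary problem: 20 training samples with 5 features each.
rng = np.random.default_rng(0)
X_train = rng.normal(size=(20, 5))
y_train = (X_train[:, 0] + X_train[:, 1] > 0).astype(int)

# The RBF kernel implicitly maps the inputs into a higher-dimensional
# feature space, where the optimal separating hyperplane is determined.
clf = SVC(kernel="rbf", C=1.0)
clf.fit(X_train, y_train)

# Test phase: unknown samples are classified according to their position
# with respect to the hyperplane (the sign of the decision function).
X_test = rng.normal(size=(3, 5))
print(clf.decision_function(X_test))  # signed decision values (proportional to the distance from the hyperplane)
print(clf.predict(X_test))            # predicted class labels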

The Optimal Separating Hyperplane
A hyperplane in the feature space is described by the equation w ∙ Φ(x) + b = 0, where Φ: ℝ^d → ℝ^H is a vector function mapping the d-dimensional input vector x into an H-dimensional (H > d) feature space, w ∈ ℝ^H is a vector perpendicular to the hyperplane, and b is a bias term.
Given a set of N labeled training samples, the SVM algorithm determines the optimal hyperplane as the solution of a constrained optimization problem.
Training samples linearly separable in the feature space: among all the hyperplanes separating the training samples without error, the SVM algorithm determines the one that maximizes the margin, i.e. the distance between the hyperplane and the closest training vectors of each class.
Training samples not linearly separable in the feature space: the SVM algorithm determines the hyperplane that minimizes the training error (measured through the sum of the slack variables ξ) and separates the remaining samples with the maximum margin.
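A standard soft-margin formulation consistent with the description above (slack variables ξ_i and margin maximization; the regularization constant C, which trades off margin width against training error, is an assumption not stated on the slide) is:

\begin{aligned}
\min_{\mathbf{w},\, b,\, \boldsymbol{\xi}} \quad & \tfrac{1}{2}\,\lVert \mathbf{w} \rVert^{2} + C \sum_{i=1}^{N} \xi_i \\
\text{subject to} \quad & y_i\bigl(\mathbf{w} \cdot \Phi(\mathbf{x}_i) + b\bigr) \ge 1 - \xi_i, \qquad \xi_i \ge 0, \qquad i = 1, \dots, N
\end{aligned}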

Data fusion by SVMs
In order to integrate information coming from different data sources by SVMs, two main approaches can be used:
1. Feature-level data fusion
2. Decision-level data fusion

1. Feature-level data fusion
[Diagram: features extracted from Data Source 1 ... Data Source N are concatenated into a single SVM input space; one SVM outputs the final class.]
A single SVM classifies the data based on an input space generated by combining the features from the different data sources into a unique feature vector.
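A minimal sketch of feature-level fusion, assuming scikit-learn; the array names, feature counts and kernel choice are illustrative assumptions, not the actual L'Aquila data:

import numpy as np
from sklearn.svm import SVC

n_samples = 100
optical = np.random.rand(n_samples, 13)       # e.g. 13 optical features
geotechnical = np.random.rand(n_samples, 1)   # e.g. 1 geotechnical feature
structural = np.random.rand(n_samples, 3)     # e.g. 3 structural features
labels = np.random.randint(0, 2, n_samples)   # e.g. collapsed / uncollapsed

# Feature-level fusion: concatenate the per-source features into a unique
# feature vector per sample and train a single SVM on the combined input space.
X = np.hstack([optical, geotechnical, structural])
clf = SVC(kernel="rbf").fit(X, labels)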

Data fusion by SVMs
2. Decision-level data fusion
[Diagram: each data source (Data Source 1 ... Data Source N) has its own feature extraction step and its own SVM, producing the rules f_1(x) ... f_N(x); a decision fusion step outputs the final class.]
Distinct SVMs are used to classify each dataset independently. The resulting rule images, which represent the distance of each sample from the corresponding optimal hyperplane, are then combined according to a decision fusion strategy, either:
- using an additional SVM, or
- keeping the final decision of the SVM that provides the rule image with the maximum absolute value (see the sketch below).
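A minimal sketch of the second decision-fusion strategy (maximum absolute rule value), assuming scikit-learn; the function name and data layout are illustrative assumptions:

import numpy as np
from sklearn.svm import SVC

def decision_fusion_max(sources_train, sources_test, y_train):
    # Train one SVM per data source and collect its rule image f_k(x),
    # i.e. the signed decision value for each test sample.
    rules = []
    for X_tr, X_te in zip(sources_train, sources_test):
        svm_k = SVC(kernel="rbf").fit(X_tr, y_train)
        rules.append(svm_k.decision_function(X_te))
    rules = np.vstack(rules)                      # shape: (n_sources, n_test)
    # Decision fusion: for each sample, keep the output of the SVM whose
    # rule has the largest absolute value (the most confident classifier).
    winner = np.argmax(np.abs(rules), axis=0)
    fused = rules[winner, np.arange(rules.shape[1])]
    return (fused > 0).astype(int)                # final class labels

The first strategy (an additional SVM) would instead stack the per-source rule values into a new feature vector and train a further SVM on it.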

Data fusion by SVMs: L'Aquila test case
Input data: features from the Optical, Geotechnical and Structural modules.
Optical features (13): MIpan, KLDpan, MIpsh, KLDpsh, ΔContrast, ΔCorrelation, ΔEnergy, ΔHomogeneity, ΔEntropy, ΔHue, ΔSaturation, ΔIntensity, Difference
Geotechnical features (1): soil resonant period at the building site (Tsoil)
Structural features (3): Building Height, Earthquake Resistant Design (ERD), EMS98 Vulnerability Class

Data fusion by SVMs: L'Aquila test case
Data integration approaches:
1. Feature level: features from the Optical (13), Geotechnical (1) and Structural (3) modules are combined in a unique feature vector used as input of a single SVM.
2. Decision level: two SVMs independently classify the optical data and the information from the Structural and Geotechnical modules; an additional SVM is used to integrate the resulting rule images.
For both integration strategies, buildings with missing Structural and/or Geotechnical features are classified using an SVM trained with optical features only.

Data fusion by SVMs: L'Aquila test case - Results
Classification performance was assessed by a K-fold cross-validation approach with K = 10.
[Confusion matrices for the three classifiers, with entries a = detected collapsed, b = false alarms, c = misdetections, d = detected uncollapsed.]

                           Optical features SVM   Multisource feature vector SVM   Rule fusion SVM
Cohen's kappa              33.86%                 38.05%                           37.20%
Normalized Cohen's kappa   36.44%                 35.15%                           40.50%
Overall accuracy           93.92%                 95.11%                           94.2%

The multisource feature vector SVM implements the feature-level fusion and the rule fusion SVM the decision-level fusion.
Feature-level data integration: a decrease in the false alarm rate is achieved at the expense of a slight reduction in sensitivity to damage.
Decision-level data integration: the results are not very different from those obtained using optical features only; however, both the kappa coefficient and its normalized version increase, thanks to a better sensitivity to damage (more detections) and a slight reduction of the false alarm rate.
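A minimal sketch of the evaluation protocol, assuming scikit-learn; only overall accuracy and Cohen's kappa are computed here, and the exact normalized-kappa variant used on the slide is not reproduced:

import numpy as np
from sklearn.model_selection import StratifiedKFold
from sklearn.metrics import accuracy_score, cohen_kappa_score
from sklearn.svm import SVC

def evaluate_kfold(X, y, k=10):
    # K-fold cross validation (stratified to preserve the class balance in each fold).
    skf = StratifiedKFold(n_splits=k, shuffle=True, random_state=0)
    kappas, accuracies = [], []
    for train_idx, test_idx in skf.split(X, y):
        clf = SVC(kernel="rbf").fit(X[train_idx], y[train_idx])
        y_pred = clf.predict(X[test_idx])
        kappas.append(cohen_kappa_score(y[test_idx], y_pred))
        accuracies.append(accuracy_score(y[test_idx], y_pred))
    return float(np.mean(kappas)), float(np.mean(accuracies))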

Prospects for the validation site
- SVMs trained on the L'Aquila data set could be tested on the validation site.
- From an operational point of view, it could be interesting to investigate the use of unsupervised approaches, such as support-vector-based clustering algorithms, and of semi-supervised Support Vector Machines.
