An Overview of Machine Learning

Speaker: Yi-Fan Chang
Adviser: Prof. J. J. Ding
Date: 2011/10/21

Outline & Content
- What is machine learning?
- Learning system model
- Training and testing
- Performance
- Algorithms
- Machine learning structure
- What are we seeking?
- Learning techniques
- Applications
- Conclusion

What is machine learning?
A branch of artificial intelligence concerned with the design and development of algorithms that allow computers to adapt their behavior based on empirical data. Since intelligence requires knowledge, the computer must be able to acquire that knowledge from data.

Learning system model
[Diagram: input samples → learning method → system, used first in a training phase and then in a testing phase]

Training and testing
- Universal set (unobserved)
- Training set (observed): data acquisition
- Testing set (unobserved): practical usage

Training and testing
Training is the process of making the system able to learn.
No free lunch rule:
- The training set and the testing set are assumed to come from the same distribution.
- Some assumptions or bias must be introduced.
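
The training/testing idea can be made concrete with a small sketch. The data, the nearest-centroid rule, and all names below are illustrative assumptions, not part of the original slides; the point is only that the model is fit on the training set and its error is then estimated on a held-out testing set assumed to come from the same distribution.

```python
# Minimal sketch: fit on the training set, estimate error on a held-out testing set.
import numpy as np

rng = np.random.default_rng(0)

# Two Gaussian classes in 2-D (a stand-in for samples from the universal set).
X = np.vstack([rng.normal(0.0, 1.0, (100, 2)), rng.normal(2.0, 1.0, (100, 2))])
y = np.array([0] * 100 + [1] * 100)

# Shuffle, then split into a training set (observed) and a testing set (held out).
idx = rng.permutation(len(y))
train, test = idx[:150], idx[150:]

# "Training": estimate one centroid per class from the training set only.
centroids = np.array([X[train][y[train] == c].mean(axis=0) for c in (0, 1)])

# "Testing": classify held-out points by their nearest centroid and measure error.
dists = ((X[test][:, None, :] - centroids[None, :, :]) ** 2).sum(axis=-1)
pred = np.argmin(dists, axis=1)
print("test error rate:", np.mean(pred != y[test]))
```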

Performance
Several factors affect performance:
- The type of training provided
- The form and extent of any initial background knowledge
- The type of feedback provided
- The learning algorithms used
Two important factors:
- Modeling
- Optimization

Algorithms
The success of a machine learning system also depends on the algorithms. The algorithms control the search used to find and build the knowledge structures, and they should extract useful information from the training examples.

Algorithms
- Supervised learning: prediction, i.e. classification (discrete labels) and regression (real values)
- Unsupervised learning: clustering, probability distribution estimation, finding associations (in features), dimension reduction
- Semi-supervised learning
- Reinforcement learning: decision making (robot, chess machine)

Algorithms
- Supervised learning
- Unsupervised learning
- Semi-supervised learning

Machine learning structure: supervised learning [diagram]

Machine learning structure: unsupervised learning [diagram]

What are we seeking?
- Supervised: low E_out, or maximization of probabilistic terms
- Unsupervised: minimum quantization error, minimum distance, MAP, MLE (maximum likelihood estimation)
- E_in: error measured on the training set
- E_out: error on the testing set (unseen data)
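
As a concrete instance of one of the criteria above, here is a minimal MLE sketch; the 1-D Gaussian model and the data are assumptions for illustration. For a Gaussian, maximizing the log-likelihood has a closed-form solution: the sample mean and the (biased) sample standard deviation.

```python
# Minimal MLE sketch: fit a 1-D Gaussian by maximizing the log-likelihood.
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(loc=3.0, scale=2.0, size=500)   # assumed training samples

def log_likelihood(mu, sigma, x):
    # Sum over samples of log N(x | mu, sigma^2).
    return np.sum(-0.5 * np.log(2 * np.pi * sigma ** 2)
                  - (x - mu) ** 2 / (2 * sigma ** 2))

# Closed-form maximum likelihood estimates for the Gaussian.
mu_hat, sigma_hat = x.mean(), x.std()

# Sanity check: nearby parameter values do not score a higher likelihood.
assert log_likelihood(mu_hat, sigma_hat, x) >= log_likelihood(mu_hat + 0.5, sigma_hat, x)
assert log_likelihood(mu_hat, sigma_hat, x) >= log_likelihood(mu_hat, sigma_hat * 1.5, x)
print("MLE estimates: mu =", round(mu_hat, 3), ", sigma =", round(sigma_hat, 3))
```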

What are we seeking?
[Plot: under-fitting vs. over-fitting behaviour of the error for a fixed sample size N; model = hypothesis + loss function]
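
The trade-off can be illustrated with a small polynomial-fitting sketch; the target function, noise level, and degrees used here are assumptions. As the model becomes more flexible at a fixed sample size N, the training error E_in keeps shrinking while the testing error E_out eventually grows again.

```python
# Minimal under-fitting vs. over-fitting sketch with polynomial regression.
import numpy as np

rng = np.random.default_rng(2)
N = 30
f = lambda t: np.sin(3 * t)                      # assumed target function
x_train = np.sort(rng.uniform(-1, 1, N))
y_train = f(x_train) + 0.2 * rng.normal(size=N)
x_test = np.sort(rng.uniform(-1, 1, 200))
y_test = f(x_test) + 0.2 * rng.normal(size=200)

for degree in (1, 4, 12):                        # too simple, reasonable, too flexible
    coeffs = np.polyfit(x_train, y_train, degree)       # fit on the training set only
    e_in = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    e_out = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    print(f"degree {degree:2d}: E_in = {e_in:.3f}, E_out = {e_out:.3f}")
```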

Learning techniques
Supervised learning categories and techniques (a comparison sketch follows this list):
- Linear classifiers (numerical functions)
- Parametric (probabilistic functions): Naïve Bayes, Gaussian discriminant analysis (GDA), hidden Markov models (HMM), probabilistic graphical models
- Non-parametric (instance-based functions): k-nearest neighbors, kernel regression, kernel density estimation, local regression
- Non-metric (symbolic functions): classification and regression trees (CART), decision trees
- Aggregation: bagging (bootstrap + aggregation), AdaBoost, random forests
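
A small comparison of one technique from several of these families is sketched below using scikit-learn; the synthetic dataset and default hyper-parameters are assumptions made only for illustration.

```python
# Minimal comparison of techniques from several supervised-learning families.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB                 # parametric
from sklearn.neighbors import KNeighborsClassifier         # non-parametric
from sklearn.tree import DecisionTreeClassifier            # non-metric
from sklearn.ensemble import RandomForestClassifier        # aggregation

X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

models = {
    "Naive Bayes": GaussianNB(),
    "k-nearest neighbors": KNeighborsClassifier(n_neighbors=5),
    "Decision tree": DecisionTreeClassifier(random_state=0),
    "Random forest": RandomForestClassifier(n_estimators=100, random_state=0),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    print(f"{name}: test accuracy = {model.score(X_te, y_te):.3f}")
```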

Learning techniques
Linear classifiers: perceptron, logistic regression, support vector machine (SVM), Adaline, multi-layer perceptron (MLP)
A linear classifier makes its decision from a linear combination of the input features, y = sign(w^T x), where w is a d-dimensional weight vector learned from the training data.
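
A from-scratch sketch of the perceptron learning algorithm (PLA) for such a classifier is given below; the synthetic data, the bias handling, and the iteration cap are assumptions for illustration.

```python
# Minimal perceptron learning algorithm (PLA) for a linear classifier sign(w.x).
import numpy as np

def pla(X, y, max_iters=1000):
    """X includes a bias column; labels y must be +1 / -1."""
    w = np.zeros(X.shape[1])
    for _ in range(max_iters):
        mistakes = np.flatnonzero(np.sign(X @ w) != y)
        if mistakes.size == 0:            # every point is correctly classified
            break
        i = mistakes[0]                   # pick one misclassified point
        w += y[i] * X[i]                  # PLA update: move w toward the correct side
    return w

rng = np.random.default_rng(3)
X = rng.normal(size=(200, 2))
y = np.where(X[:, 0] + 0.5 * X[:, 1] > 0, 1, -1)    # linearly separable labels
Xb = np.hstack([np.ones((200, 1)), X])               # prepend a bias feature
w = pla(Xb, y)
print("training error rate:", np.mean(np.sign(Xb @ w) != y))
```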

Learning techniques
Using the perceptron learning algorithm (PLA): error rate 0.10 on the training set, 0.156 on the testing set [plots of the training and testing results]

Learning techniques
Using logistic regression: error rate 0.11 on the training set, 0.145 on the testing set [plots of the training and testing results]

Learning techniques
Non-linear case: support vector machine (SVM). To go from a linear to a nonlinear classifier, apply a feature transform and a kernel function (see the sketch below).
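
A minimal sketch of the non-linear case with scikit-learn is shown below; the concentric-circles dataset is an assumption chosen because it is not linearly separable. The RBF kernel corresponds to an implicit non-linear feature transform, so the kernel SVM can separate the classes where the linear kernel cannot.

```python
# Minimal sketch: linear vs. kernel (RBF) SVM on data that is not linearly separable.
from sklearn.datasets import make_circles
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = make_circles(n_samples=400, noise=0.1, factor=0.4, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

for kernel in ("linear", "rbf"):
    clf = SVC(kernel=kernel).fit(X_tr, y_tr)
    print(f"SVM, {kernel} kernel: test accuracy = {clf.score(X_te, y_te):.3f}")
```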

Learning techniques
Unsupervised learning categories and techniques (a short sketch follows this list):
- Clustering: k-means clustering, spectral clustering
- Density estimation: Gaussian mixture model (GMM), graphical models
- Dimensionality reduction: principal component analysis (PCA), factor analysis
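
The sketch below exercises two of these techniques, k-means clustering and PCA, with scikit-learn; the synthetic blob data and the choices of k = 3 clusters and 2 principal components are assumptions for illustration.

```python
# Minimal unsupervised-learning sketch: k-means clustering and PCA.
from sklearn.datasets import make_blobs
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA

X, _ = make_blobs(n_samples=300, centers=3, n_features=5, random_state=0)

# Clustering: group the unlabeled points into k = 3 clusters.
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
print("cluster sizes:", [int((kmeans.labels_ == c).sum()) for c in range(3)])

# Dimensionality reduction: project the 5-D data onto its 2 principal components.
pca = PCA(n_components=2).fit(X)
X_2d = pca.transform(X)
print("reduced shape:", X_2d.shape,
      "| variance explained:", round(float(pca.explained_variance_ratio_.sum()), 3))
```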

Applications
- Face detection
- Object detection and recognition
- Image segmentation
- Multimedia event detection
- Economic and commercial uses

Conclusion
We have given a brief overview of some techniques and algorithms in machine learning. More and more applications are adopting machine learning as a solution, and in the future machine learning will play an increasingly important role in our daily lives.

Reference [1] W. L. Chao, J. J. Ding, “Integrated Machine Learning Algorithms for Human Age Estimation”, NTU, 2011.