[slides taken from the cs294-10 course at UC Berkeley (2006 / 2009)]

Classification (reminder)

A classifier learns a mapping X → Y.
- X can be anything: continuous (ℝ, ℝ^d, …), discrete ({0,1}, {1,…,k}, …), structured (tree, string, …), …
- Y is discrete: {0,1} (binary), {1,…,k} (multi-class), tree, etc. (structured)

Classifiers: Perceptron, Logistic Regression, Support Vector Machine, Decision Tree, Random Forest; Kernel trick.
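As a minimal illustration (not from the course slides), the classifiers listed above are all available in scikit-learn, which this deck points to later; the dataset here is synthetic and purely for demonstration:

# Minimal sketch (illustration only): the classifiers named above, as
# implemented in scikit-learn, on a synthetic binary classification task.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.linear_model import Perceptron, LogisticRegression
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier

# X is continuous (R^d), y is discrete ({0,1}): a binary classification task.
X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

models = {
    "Perceptron": Perceptron(),
    "Logistic Regression": LogisticRegression(max_iter=1000),
    "SVM (kernel trick, RBF)": SVC(kernel="rbf"),
    "Decision Tree": DecisionTreeClassifier(random_state=0),
    "Random Forest": RandomForestClassifier(random_state=0),
}
for name, model in models.items():
    model.fit(X_train, y_train)
    print(f"{name}: test accuracy = {model.score(X_test, y_test):.3f}")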

Regression

A regressor learns a mapping X → Y.
- X can be anything: continuous (ℝ, ℝ^d, …), discrete ({0,1}, {1,…,k}, …), structured (tree, string, …), …
- Y is continuous: ℝ, ℝ^d
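A minimal sketch of this setting (synthetic data, illustration only): X is continuous and so is y, and a linear model is one way to learn the mapping:

# Minimal sketch (illustration only): X in R, y in R, fit with a linear model.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(100, 1))               # continuous inputs
y = 2.0 * X[:, 0] + 0.1 * rng.normal(size=100)     # continuous, noisy targets

model = LinearRegression().fit(X, y)
print(model.coef_, model.intercept_)               # roughly 2.0 and 0.0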

[Figure: a polynomial of degree 15 fit to the training data. Overfitting!]
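The original data behind the figure is not in the transcript, so the sketch below reproduces the phenomenon on a noisy sine curve: a degree-15 polynomial can nearly interpolate 20 training points, but its test error explodes.

# Sketch of the overfitting effect (synthetic stand-in for the slide's data).
import numpy as np

rng = np.random.default_rng(0)
x_train = np.sort(rng.uniform(0, 1, 20))
y_train = np.sin(2 * np.pi * x_train) + 0.2 * rng.normal(size=20)
x_test = np.sort(rng.uniform(0, 1, 200))
y_test = np.sin(2 * np.pi * x_test) + 0.2 * rng.normal(size=200)

for degree in (1, 3, 15):
    coeffs = np.polyfit(x_train, y_train, degree)        # least-squares fit
    train_mse = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_mse = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    print(f"degree {degree:2d}: train MSE {train_mse:.4f}, test MSE {test_mse:.4f}")
# The degree-15 fit drives the training error toward zero while the
# test error grows: exactly the overfitting shown in the figure.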

- Between two models / hypotheses that explain the data equally well, choose the simpler one.
- In Machine Learning:
  - we usually need to trade off between training error and model complexity;
  - this can be formalized precisely in statistics (bias-variance tradeoff, etc.).

[Figure: the tradeoff between training error and model complexity]
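One standard way to make this tradeoff concrete (a sketch, not from the slides) is regularized least squares: ridge regression minimizes training error + alpha * ||w||^2, so increasing alpha buys a simpler model (smaller weights) at the cost of a higher training error:

# Sketch (illustration only): ridge regression trades training error
# against model complexity, measured here by the weight norm ||w||.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 10))
y = X @ rng.normal(size=10) + 0.5 * rng.normal(size=50)

for alpha in (0.01, 1.0, 100.0):
    model = Ridge(alpha=alpha).fit(X, y)
    train_mse = np.mean((model.predict(X) - y) ** 2)
    print(f"alpha={alpha:6.2f}: train MSE {train_mse:.3f}, "
          f"||w|| = {np.linalg.norm(model.coef_):.3f}")
# As alpha grows, ||w|| shrinks (simpler model) and the training error rises.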

- Software:
  - Weka (Java):
  - RapidMiner (nicer GUI?):
  - SciKit Learn (Python):
- Books:
  - Pattern Classification (Duda, Hart & Stork)
  - Pattern Recognition and Machine Learning (Bishop)
  - Data Mining (Witten, Frank & Hall)
  - The Elements of Statistical Learning (Hastie, Tibshirani, Friedman)
- Programming in Python:
  - Dan Klein's cs188 course at Berkeley:

Kernel Regression

[Figure: kernel regression fit (sigma = 1)]
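A sketch of the method behind the figure, assuming the usual Nadaraya-Watson estimator with a Gaussian kernel (sigma = 1, as in the caption): the prediction at a point is a weighted average of the training targets, with weights decaying with distance.

# Sketch (assumed Nadaraya-Watson form): Gaussian-kernel regression, sigma = 1.
import numpy as np

def kernel_regression(x_query, x_train, y_train, sigma=1.0):
    # Gaussian weights between every query point and every training point.
    d2 = (x_query[:, None] - x_train[None, :]) ** 2
    w = np.exp(-d2 / (2.0 * sigma ** 2))
    # Prediction = weighted average of training targets.
    return (w @ y_train) / w.sum(axis=1)

rng = np.random.default_rng(0)
x_train = rng.uniform(-5, 5, 40)
y_train = np.sin(x_train) + 0.2 * rng.normal(size=40)
x_query = np.linspace(-5, 5, 9)
print(kernel_regression(x_query, x_train, y_train, sigma=1.0))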