1.5. Gaussian Processes
  1.5.1. Examples
    1.5.1.1. An introductory regression example
    1.5.1.2. Fitting Noisy Data
XIAO LIYING

1.5. Gaussian Processes

Gaussian Processes for Machine Learning (GPML) is a generic supervised learning method primarily designed to solve regression problems.

The advantages of Gaussian Processes for Machine Learning are:
1. The prediction interpolates the observations (at least for regular correlation models).
2. The prediction is probabilistic (Gaussian), so that one can compute empirical confidence intervals and exceedance probabilities.
3. Versatile: different linear regression models and correlation models can be specified.
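As a minimal sketch of points 2 and 3, assuming scikit-learn's GaussianProcessRegressor as the GPML implementation (the kernels and data below are illustrative choices, not from the slides): each kernel plays the role of a correlation model, and every prediction comes with a standard deviation.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, Matern, RationalQuadratic

# Noise-free observations of a smooth target.
X = np.linspace(0.0, 10.0, 25).reshape(-1, 1)
y = np.sin(X).ravel()

# Each kernel is a different correlation model; swapping it changes the
# smoothness assumptions without touching the rest of the code (point 3).
for kernel in [RBF(), Matern(nu=1.5), RationalQuadratic()]:
    gp = GaussianProcessRegressor(kernel=kernel).fit(X, y)
    # The prediction is Gaussian: a mean plus a pointwise standard
    # deviation, from which confidence intervals follow (point 2).
    mean, std = gp.predict(X, return_std=True)
```

With noise-free data, each fitted model reproduces the observations at the training points, regardless of which correlation model was chosen.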

The disadvantages of Gaussian Processes for Machine Learning include:
1. It is not sparse: it uses the whole sample/feature information to perform the prediction.
2. It loses efficiency in high-dimensional spaces, namely when the number of features exceeds a few dozen.
3. Classification is only available as a post-processing step: one first solves a regression problem and then maps its output to class labels.

An introductory regression example

Consider the function g(x) = x·sin(x). The function is evaluated on a design of experiments, i.e. a fixed set of input points, and the resulting observations are used to fit a Gaussian process.
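A minimal version of this example, assuming scikit-learn's GaussianProcessRegressor with an RBF kernel (the specific kernel and design points are illustrative assumptions, not prescribed by the slide):

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def g(x):
    """The target function g(x) = x * sin(x)."""
    return x * np.sin(x)

# Design of experiments: the points where g is evaluated.
X = np.array([1.0, 3.0, 5.0, 6.0, 7.0, 8.0]).reshape(-1, 1)
y = g(X).ravel()

# Fit a noise-free GP; kernel hyperparameters are optimized on the data.
gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0),
                              n_restarts_optimizer=5)
gp.fit(X, y)

# Predict on a fine grid; sigma gives the pointwise uncertainty.
x_grid = np.linspace(0.0, 10.0, 100).reshape(-1, 1)
y_pred, sigma = gp.predict(x_grid, return_std=True)
```

Because no observation noise is assumed, the posterior mean passes through every design point, which is exactly the interpolation property of regular correlation models.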

Fitting Noisy Data
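The transcript ends at this heading. As a sketch of what fitting noisy data could look like in the same setup (the noise level and the use of WhiteKernel are assumptions, not from the slides), observation noise can be estimated jointly with the correlation model:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.RandomState(0)

# Noisy observations of g(x) = x * sin(x).
X = np.linspace(0.1, 9.9, 20).reshape(-1, 1)
y = (X * np.sin(X)).ravel() + rng.normal(0.0, 0.5, X.shape[0])

# WhiteKernel lets the GP estimate the noise level from the data, so the
# posterior mean smooths the observations instead of interpolating them.
kernel = RBF(length_scale=1.0) + WhiteKernel(noise_level=0.25)
gp = GaussianProcessRegressor(kernel=kernel).fit(X, y)

y_mean, y_std = gp.predict(X, return_std=True)
```

In contrast to the noise-free case, the predictive standard deviation no longer collapses to zero at the design points: the estimated noise keeps the uncertainty strictly positive everywhere.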