Fitting a normal distribution: ML
Computer vision: models, learning and inference. ©2011 Simon J.D. Prince

Presentation transcript:


Fitting a normal distribution: ML
The plotted surface shows the likelihood as a function of the possible parameter values; the ML solution is at the peak.
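The idea on this slide can be sketched numerically: compute the closed-form ML estimates for a univariate normal, then evaluate the log-likelihood on a grid of parameter values and confirm the surface peaks there. The synthetic data and grid ranges below are illustrative assumptions, not taken from the slides.

```python
import numpy as np

# Maximum-likelihood fit of a univariate normal, plus a check that the
# likelihood surface peaks at the closed-form estimates.
rng = np.random.default_rng(0)
x = rng.normal(loc=2.0, scale=1.5, size=500)  # synthetic data

mu_ml = x.mean()                    # ML mean
var_ml = ((x - mu_ml) ** 2).mean()  # ML variance (note 1/N, not 1/(N-1))

def log_likelihood(mu, var, data):
    """Log-likelihood of the data under N(mu, var)."""
    n = data.size
    return -0.5 * n * np.log(2 * np.pi * var) - ((data - mu) ** 2).sum() / (2 * var)

# Evaluate the surface on a grid of (mu, var) values; its peak coincides,
# up to grid resolution, with the closed-form ML solution.
mus = np.linspace(mu_ml - 1.0, mu_ml + 1.0, 101)
variances = np.linspace(0.5 * var_ml, 1.5 * var_ml, 101)
surface = np.array([[log_likelihood(m, v, x) for v in variances] for m in mus])
i, j = np.unravel_index(surface.argmax(), surface.shape)
```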

Predicting the world
We need a model that mathematically relates the input data x (e.g. visual data) and the output data t (e.g. the world state). The model specifies a family of possible relationships between x and t; the particular relationship is determined by the model parameters w.
We need a learning algorithm that allows us to fit the parameters w using paired training examples {x_i, t_i}, where we know both the measurements and the underlying state.
We need an inference algorithm that takes a new observation x and uses the model to return the posterior P(t|x,w) over the state t. Alternatively, it might return the MAP solution or draw samples from the posterior.
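The three ingredients above (model, learning, inference) can be sketched for the simplest case, linear regression with Gaussian noise. The model, data, and noise level here are illustrative assumptions, not taken from the slides.

```python
import numpy as np

# Model: t = w0 + w1 * x + Gaussian noise with variance sigma2.
rng = np.random.default_rng(1)
x_train = rng.uniform(0.0, 10.0, size=50)
t_train = 1.0 + 0.5 * x_train + rng.normal(0.0, 0.3, size=50)

# Learning: least squares is the ML solution for w under Gaussian noise.
X = np.column_stack([np.ones_like(x_train), x_train])
w, *_ = np.linalg.lstsq(X, t_train, rcond=None)
sigma2 = ((t_train - X @ w) ** 2).mean()  # ML noise variance

# Inference: for a new x the model returns a Gaussian P(t | x, w);
# its mean is also the MAP solution under this model.
def predict(x_new):
    return w[0] + w[1] * x_new, sigma2

mean, var = predict(4.0)
```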

Regression Models

Arc Tan Functions
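One way arc tan functions can be used in regression is as fixed sigmoidal basis functions whose linear weights are fit by least squares. This is an illustrative sketch, not necessarily the slides' exact parameterization: the centers, slope, and toy data are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.uniform(-3.0, 3.0, size=100)
t = np.tanh(x) + rng.normal(0.0, 0.05, size=100)  # toy sigmoidal target

centers = np.linspace(-2.0, 2.0, 5)  # assumed basis centers
lam = 2.0                            # assumed slope

# Design matrix: a constant column plus one arc tan bump per center.
Phi = np.column_stack([np.ones_like(x),
                       np.arctan(lam * (x[:, None] - centers[None, :]))])
w, *_ = np.linalg.lstsq(Phi, t, rcond=None)  # ML weights under Gaussian noise
rmse = np.sqrt(((Phi @ w - t) ** 2).mean())
```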

Radial basis functions
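A minimal sketch of regression with radial basis functions: Gaussian bumps form the design matrix and the linear weights are the ML solution under Gaussian noise. The centers, width, and toy data below are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)
x = np.linspace(0.0, 1.0, 80)
t = np.sin(2 * np.pi * x) + rng.normal(0.0, 0.1, size=x.size)

centers = np.linspace(0.0, 1.0, 9)  # assumed centers
width = 0.1                         # assumed bandwidth

def rbf_design(x_in):
    """Each column is a Gaussian bump centred on one of the centers."""
    return np.exp(-(x_in[:, None] - centers[None, :]) ** 2 / (2 * width ** 2))

Phi = rbf_design(x)
w, *_ = np.linalg.lstsq(Phi, t, rcond=None)  # linear weights by least squares
rmse = np.sqrt(((Phi @ w - t) ** 2).mean())
```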


RBF Kernel Fits
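RBF kernel fits can be sketched in the dual (kernelized) form, where a kernel evaluated between training points replaces the explicit basis expansion. The bandwidth sigma, the small ridge term lam, and the toy data are assumed values for illustration.

```python
import numpy as np

rng = np.random.default_rng(4)
x = np.linspace(0.0, 1.0, 60)
t = np.sin(2 * np.pi * x) + rng.normal(0.0, 0.1, size=x.size)

sigma = 0.1  # assumed kernel bandwidth
lam = 1e-3   # small ridge term for numerical stability

def rbf_kernel(a, b):
    """Gram matrix of the RBF (squared-exponential) kernel."""
    return np.exp(-(a[:, None] - b[None, :]) ** 2 / (2 * sigma ** 2))

K = rbf_kernel(x, x)
alpha = np.linalg.solve(K + lam * np.eye(x.size), t)  # dual weights

def predict(x_new):
    return rbf_kernel(x_new, x) @ alpha

rmse = np.sqrt(((predict(x) - t) ** 2).mean())
```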

Curve Fitting Re-visited