Lecture 24 Radial Basis Network (I)
(C) 2001 by Yu Hen Hu

Outline
- Interpolation Problem Formulation
- Radial Basis Network Type 1

What is a Radial Basis Function?
An RBF is a kernel function that is symmetric with respect to the origin; its argument is r, the distance (norm) of the input from the origin or from a designated center. Two examples of RBFs are given below.
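Two common examples, both of which reappear later in this lecture, are the Gaussian and the inverse multiquadric:

$$\varphi(r) = \exp\!\left(-\frac{r^2}{2\sigma^2}\right) \quad \text{(Gaussian)}, \qquad \varphi(r) = \left(r^2 + c^2\right)^{-1/2} \quad \text{(inverse multiquadric)}.$$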

Interpolation Problem Formulation
Radial basis functions for interpolation: given inputs $\{x_i;\ 1 \le i \le K\}$ and targets $\{d_i;\ 1 \le i \le K\}$, find a function $F(x)$ that satisfies the interpolation condition
$$F(x_i) = d_i, \quad 1 \le i \le K. \tag{1}$$
One possible choice of $F(x)$ is a weighted sum of radial basis functions:
$$F(x) = \sum_{i=1}^{K} w_i\, \varphi(\|x - x_i\|), \tag{2}$$
where $\{x_i;\ 1 \le i \le K\}$ are the centers of the radial basis functions.

Solving Radial Basis Coefficients
Substituting (2) into (1), we obtain a linear system of equations
$$M\,w = d, \tag{3}$$
where $M = [M(i,j);\ 1 \le i, j \le K]$ is the interpolation matrix with $M(i,j) = \varphi(\|x_i - x_j\|)$, $w = [w_1, w_2, \ldots, w_K]^T$, and $d = [d_1, d_2, \ldots, d_K]^T$. Given $M$ and $d$, and assuming the $K$ centers are distinct, $w$ can be solved as $w = M^{-1} d$ provided $M$ is non-singular. If $\varphi(r) = (r^2 + c^2)^{-1/2}$ (inverse multiquadric) or $\varphi(r) = \exp(-r^2/(2\sigma^2))$ (Gaussian), it can further be shown that $M$ is positive definite.
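A minimal sketch of this solve in Python/NumPy (the function names, the 1-D inputs, and the Gaussian width sigma are illustrative assumptions, not part of the lecture):

```python
import numpy as np

def gaussian_rbf(r, sigma=1.0):
    """Gaussian RBF, phi(r) = exp(-r^2 / (2 sigma^2))."""
    return np.exp(-r**2 / (2.0 * sigma**2))

def rbf_interpolate(x, d, phi=gaussian_rbf):
    """Build M(i,j) = phi(||x_i - x_j||) and solve M w = d, eq. (3)."""
    r = np.abs(x[:, None] - x[None, :])   # pairwise distances (1-D inputs)
    M = phi(r)                            # K x K interpolation matrix
    w = np.linalg.solve(M, d)             # w = M^{-1} d, assuming M non-singular
    return w, M

def rbf_eval(xq, x, w, phi=gaussian_rbf):
    """Evaluate F(xq) = sum_i w_i phi(||xq - x_i||), eq. (2), at query points xq."""
    return phi(np.abs(xq[:, None] - x[None, :])) @ w
```

For the Gaussian and inverse multiquadric kernels, M is positive definite when the centers are distinct, so the solve is well posed (a Cholesky factorization could be used in place of np.linalg.solve).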

An Example
Let $F(-1) = 0.2$, $F(-0.5) = 0.5$, and $F(1) = -0.5$. Use a triangular radial basis function
$$\varphi(r) = (1 - r)\,[u(r) - u(r - 1)], \qquad u(r) = \begin{cases} 1 & r \ge 0, \\ 0 & r < 0, \end{cases}$$
(see rbfexample1.m).
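The original rbfexample1.m is not reproduced in the transcript; the following is a sketch of the same computation in NumPy, reusing rbf_interpolate from the sketch above:

```python
import numpy as np

def tri_rbf(r):
    """Triangular RBF: phi(r) = 1 - r for 0 <= r < 1, and 0 otherwise."""
    return np.where((r >= 0) & (r < 1), 1.0 - r, 0.0)

x = np.array([-1.0, -0.5, 1.0])   # centers x_i
d = np.array([0.2, 0.5, -0.5])    # targets d_i = F(x_i)

w, M = rbf_interpolate(x, d, phi=tri_rbf)
# M = [[1.0, 0.5, 0.0],
#      [0.5, 1.0, 0.0],
#      [0.0, 0.0, 1.0]]
# w = [-1/15, 8/15, -1/2] ~= [-0.0667, 0.5333, -0.5]
```

Substituting w back into (2) reproduces the three target values exactly, as required by the interpolation condition (1).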

Example (continued)
Alternatively, use Gaussian RBFs, $\varphi(r) = \exp(-r^2/(2\sigma^2))$, in (2). A related construction is the Parzen window estimate,
$$\hat{F}(x) = \frac{1}{K} \sum_{i=1}^{K} \varphi(\|x - x_i\|),$$
which uses no weighting and needs no target values of $F(x)$.

Example (Comparison)
[Figure: comparison of the fits produced by the triangular RBF, the Gaussian RBF, and the Parzen window estimate on the three data points above.]
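The comparison figure itself did not survive extraction; a sketch that generates a comparison of this kind, reusing the illustrative helpers defined above (gaussian_rbf, tri_rbf, rbf_interpolate, rbf_eval), might look like this:

```python
import numpy as np
import matplotlib.pyplot as plt

x = np.array([-1.0, -0.5, 1.0])
d = np.array([0.2, 0.5, -0.5])
xq = np.linspace(-1.5, 1.5, 301)   # query points for plotting

# Triangular-RBF and Gaussian-RBF interpolants (both fit the data exactly).
w_tri, _ = rbf_interpolate(x, d, phi=tri_rbf)
w_gau, _ = rbf_interpolate(x, d, phi=gaussian_rbf)

# Parzen window estimate: equal weights 1/K, targets d unused.
F_parzen = gaussian_rbf(np.abs(xq[:, None] - x[None, :])).mean(axis=1)

plt.plot(xq, rbf_eval(xq, x, w_tri, phi=tri_rbf), label="triangular RBF")
plt.plot(xq, rbf_eval(xq, x, w_gau, phi=gaussian_rbf), label="Gaussian RBF")
plt.plot(xq, F_parzen, label="Parzen window")
plt.plot(x, d, "ko", label="data")
plt.legend()
plt.show()
```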

Regularization Problem Formulation
When there are too many data points, the matrix M may become singular (or severely ill-conditioned): by imposing an RBF at each data point, we obtain an over-determined system. Regularization is the mathematical tool that addresses this problem. With regularization, we add to the cost function an additional term that represents additional constraints on the solution, e.g.
$$E(F) = \frac{1}{2} \sum_{i=1}^{K} \bigl[d_i - F(x_i)\bigr]^2 + \frac{\lambda}{2} \|P F\|^2,$$
where the regularization term $\|P F\|^2$ involves a differential operator $P$ that penalizes non-smooth solutions.

Solution to the Regularization Problem
The solution to this regularization problem is
$$F(x) = \sum_{i=1}^{K} w_i\, G(x; x_i),$$
where $G(x; x_i)$ is the Green's function corresponding to the self-adjoint differential operator $P^{*}P$, such that
$$P^{*}P\, G(x; x_i) = \delta(x - x_i).$$
A solution for the Green's function that is of special interest to us is the multivariate Gaussian,
$$G(x; x_i) = \exp\!\left(-\frac{\|x - x_i\|^2}{2\sigma_i^2}\right).$$
Substituting the individual training data into $G(x; x_i)$, we obtain a matrix equation $(G + \lambda I)\, w = d$ that can be solved for $w$.
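A minimal sketch of the regularized solve in NumPy (the width sigma, the value of lam, and the function name are illustrative assumptions):

```python
import numpy as np

def regularized_rbf_fit(x, d, sigma=0.5, lam=0.1):
    """Solve (G + lambda I) w = d with a Gaussian Green's function."""
    r = np.abs(x[:, None] - x[None, :])
    G = np.exp(-r**2 / (2.0 * sigma**2))   # G(i,j) = G(x_i; x_j)
    w = np.linalg.solve(G + lam * np.eye(len(x)), d)
    return w, G

# With lam > 0 the system stays well conditioned even when G is nearly
# singular, at the price of no longer matching the data points exactly.
w_reg, G = regularized_rbf_fit(np.array([-1.0, -0.5, 1.0]),
                               np.array([0.2, 0.5, -0.5]))
```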

Implementation Considerations
Radial basis functions other than the multivariate Gaussian can also be used. The regularized F(x) may no longer match the data points exactly, but it will be smoother. The value of $\lambda$ is usually determined empirically, although generalized cross-validation (GCV) may also be applied.
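One way to pick $\lambda$ by GCV, sketched under the assumption that the fit is the linear smoother $A(\lambda) = G\,(G + \lambda I)^{-1}$ (the grid and helper names are illustrative):

```python
import numpy as np

def gcv_score(G, d, lam):
    """GCV(lam) = (1/K)||(I - A) d||^2 / [(1/K) tr(I - A)]^2, A = G (G + lam I)^{-1}."""
    K = len(d)
    A = G @ np.linalg.inv(G + lam * np.eye(K))
    resid = d - A @ d
    return (resid @ resid / K) / (np.trace(np.eye(K) - A) / K) ** 2

def pick_lambda(G, d, grid=np.logspace(-4, 1, 30)):
    """Return the grid value of lambda with the smallest GCV score."""
    return min(grid, key=lambda lam: gcv_score(G, d, lam))
```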