1 Implementation of Nonlinear Conjugate Gradient Method for MLP
Matt Peterson, ECE 539, December 10, 2001

2 Introduction
• Steepest-descent (gradient) training method
  - Can oscillate
  - Can get trapped in local minima
• Nonlinear conjugate gradient method
  - An "optimization" approach to training
  - Converges faster

3 The Algorithm
• Initialization
  - Select an initial weight vector w(0)
  - Use back-propagation (BP) to compute the gradient vector g(0)
  - Set s(0) = r(0) = -g(0)
• For each iteration n
  - Use a line search to find the η(n) that minimizes the error E(w(n) + ηs(n))
  - Test for convergence: ||r(n)|| < ε||r(0)||
  - Update the weights: w(n+1) = w(n) + η(n)s(n)

4 The Algorithm (Continued)
• Use BP to compute the new gradient vector g(n+1)
• Set r(n+1) = -g(n+1)
• Compute β(n+1) with the Polak-Ribière method:
  β(n+1) = max( r^T(n+1)(r(n+1) - r(n)) / (r^T(n)r(n)), 0 )
• Update the direction vector: s(n+1) = r(n+1) + β(n+1)s(n)
• Repeat from the line search until the convergence test is met
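
A minimal MATLAB sketch of the loop described on slides 3 and 4, assuming a hypothetical function handle lossfun that returns [E, g] = lossfun(w), i.e. the training error and the gradient that back-propagation would supply for the MLP weights flattened into a single vector w. The function names, the crude backtracking line search, and the iteration budget are illustrative stand-ins, not the project's actual code.

% ncg_sketch.m -- illustrative sketch only, not the project's actual MATLAB code.
% lossfun is an assumed handle returning [E, g] = lossfun(w): the training error
% and its gradient (what BP would compute for an MLP with weights flattened into w).
function w = ncg_sketch(lossfun, w0, epsilon, max_iter)
    w = w0;
    [~, g] = lossfun(w);                        % g(0) from back-propagation
    r = -g;                                     % r(0) = -g(0)
    s = r;                                      % s(0) = r(0)
    r0norm = norm(r);
    for n = 1:max_iter
        eta = backtracking_line_search(lossfun, w, s);   % eta(n) roughly minimizing E(w + eta*s)
        if norm(r) < epsilon * r0norm           % convergence test: ||r(n)|| < eps*||r(0)||
            break;
        end
        w = w + eta * s;                        % w(n+1) = w(n) + eta(n)*s(n)
        [~, g] = lossfun(w);                    % g(n+1) from back-propagation
        r_new = -g;                             % r(n+1) = -g(n+1)
        beta = max((r_new' * (r_new - r)) / (r' * r), 0);  % Polak-Ribiere with clamp at 0
        s = r_new + beta * s;                   % s(n+1) = r(n+1) + beta(n+1)*s(n)
        r = r_new;
    end
end

% Crude backtracking search, standing in for an exact line minimization of E(w + eta*s).
function eta = backtracking_line_search(lossfun, w, s)
    [E0, ~] = lossfun(w);
    eta = 1;
    for k = 1:40
        [Etry, ~] = lossfun(w + eta * s);
        if Etry < E0
            return;
        end
        eta = eta / 2;
    end
end

Clamping β at zero, as in the Polak-Ribière formula on slide 4, effectively restarts the search along the steepest-descent direction whenever β would otherwise go negative.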

5 Software Implementation
• Written in MATLAB
• Similar structure and user interface to bp.m
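
The actual user interface mirrors bp.m, which is course-specific and not reproduced here. Purely as an illustration, the ncg_sketch function sketched after slide 4 could be exercised on a small quadratic problem like the one below; the matrix A, vector b, and tolerance values are made up for the example.

% Illustrative driver for the ncg_sketch function above; a 2-D quadratic stands in
% for the MLP error surface so the example stays self-contained.
A = [3 0.5; 0.5 1];                                          % symmetric positive definite
b = [1; -2];
lossfun = @(w) deal(0.5 * w' * A * w - b' * w, A * w - b);   % returns [E, g]
w = ncg_sketch(lossfun, zeros(2, 1), 1e-4, 500);
disp([w, A \ b]);                                            % w should approach the minimizer A\b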

