Computational Intelligence Winter Term 2015/16 Prof. Dr. Günter Rudolph Lehrstuhl für Algorithm Engineering (LS 11) Fakultät für Informatik TU Dortmund.

Presentation transcript:

Computational Intelligence Winter Term 2015/16 Prof. Dr. Günter Rudolph Lehrstuhl für Algorithm Engineering (LS 11) Fakultät für Informatik TU Dortmund

Lecture 04: Plan for Today

● Bidirectional Associative Memory (BAM)
  - Fixed Points
  - Concept of Energy Function
  - Stable States = Minimizers of the Energy Function
● Hopfield Network
  - Convergence
  - Application to Combinatorial Optimization

Bidirectional Associative Memory (BAM): Network Model

[Figure: two fully connected layers x_1, x_2, x_3, ..., x_n and y_1, y_2, y_3, ..., y_k with bidirectional weighted edges w_11, ..., w_nk]

synchronized operation:
  step t:     data flow from x to y
  step t + 1: data flow from y to x

x, y: row vectors
W:    weight matrix
W':   transpose of W
bipolar inputs ∈ {-1, +1}

start: y^{(0)} = sgn(x^{(0)} W)
       x^{(1)} = sgn(y^{(0)} W')
       y^{(1)} = sgn(x^{(1)} W)
       x^{(2)} = sgn(y^{(1)} W')
       ...
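
To make the recall dynamics concrete, here is a minimal sketch in Python/NumPy (the function and variable names are illustrative, not part of the lecture):

```python
import numpy as np

def sgn(v):
    """Bipolar sign: maps values >= 0 to +1 and values < 0 to -1."""
    return np.where(v >= 0, 1, -1)

def bam_recall(W, x0, max_steps=100):
    """Synchronous BAM recall: alternate y = sgn(x W) and x = sgn(y W')."""
    x = sgn(np.asarray(x0))
    y = sgn(x @ W)
    for _ in range(max_steps):
        x_new = sgn(y @ W.T)
        y_new = sgn(x_new @ W)
        if np.array_equal(x_new, x) and np.array_equal(y_new, y):
            break                       # fixed point (x, y) reached
        x, y = x_new, y_new
    return x, y

# store one pattern pair via W = x'y (outer product), then recall from a perturbed x
x_stored = np.array([ 1, -1,  1, -1])
y_stored = np.array([ 1,  1, -1])
W = np.outer(x_stored, y_stored)
print(bam_recall(W, [1, -1, 1, 1]))     # recovers (x_stored, y_stored)
```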

Bidirectional Associative Memory (BAM): Fixed Points

Definition: (x, y) is a fixed point of the BAM iff y = sgn(x W) and x' = sgn(W y').  □

Theorem: If W = x'y, then (x, y) is a fixed point of the BAM.  □

Proof: Set W = x'y (note: x is a row vector). Then
  y  = sgn(x W)  = sgn(x (x'y))  = sgn((x x') y)  = sgn(||x||^2 y)  = y,   since ||x||^2 > 0 does not alter the sign;
  x' = sgn(W y') = sgn((x'y) y') = sgn(x' (y y')) = sgn(x' ||y||^2) = x',  since ||y||^2 > 0 does not alter the sign.  □
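
As a quick numerical check of the theorem (a sketch in Python/NumPy; the pattern pair is arbitrary):

```python
import numpy as np

sgn = lambda v: np.where(v >= 0, 1, -1)

x = np.array([ 1, -1, -1,  1,  1])   # bipolar row vector
y = np.array([-1,  1,  1])
W = np.outer(x, y)                   # W = x'y

# fixed-point conditions: y = sgn(x W) and x' = sgn(W y')
assert np.array_equal(sgn(x @ W), y)
assert np.array_equal(sgn(W @ y), x)
print("(x, y) is a fixed point of the BAM with W = x'y")
```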

Bidirectional Associative Memory (BAM): Concept of Energy Function

given: BAM with W = x'y  ⇒  (x, y) is a stable state of the BAM

starting point x^{(0)}
⇒ y^{(0)} = sgn(x^{(0)} W)
⇒ excitation e' = W (y^{(0)})'
⇒ if sgn(e') = x^{(0)}, then (x^{(0)}, y^{(0)}) is a stable state

this is true if e' is close to x^{(0)}, i.e. if the angle between e' and x^{(0)} is small

[Figure: vector x^{(0)} = (1, 1) in the plane; recall: a small angle ∠ corresponds to a large cos(∠)]

Bidirectional Associative Memory (BAM): Concept of Energy Function

required: small angle between e' = W (y^{(0)})' and x^{(0)}
⇒ a larger cosine of the angle indicates greater similarity of the vectors
⇒ for e' of equal size: try to maximize x^{(0)} e' = ||x^{(0)}|| · ||e|| · cos ∠(x^{(0)}, e); with the norms fixed, this maximizes the cosine → max!
⇒ maximize x^{(0)} e' = x^{(0)} W (y^{(0)})'
⇒ identical to minimizing -x^{(0)} W (y^{(0)})'

Definition: The energy function of the BAM at iteration t is
  E(x^{(t)}, y^{(t)}) = -\frac{1}{2} x^{(t)} W (y^{(t)})'   □
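
As a sketch (Python/NumPy; it assumes the 1/2 factor from the definition above, and the helper name is illustrative), the BAM energy can be computed directly:

```python
import numpy as np

def bam_energy(W, x, y):
    """Energy of BAM state (x, y): E = -1/2 * x W y'."""
    return -0.5 * float(x @ W @ y)

x = np.array([1, -1, 1, -1]); y = np.array([1, 1, -1])
W = np.outer(x, y)                        # stored pattern pair

# the stored pair has lower energy than a perturbed state
x_noisy = np.array([1, -1, 1, 1])
print(bam_energy(W, x, y), bam_energy(W, x_noisy, y))   # -6.0  -3.0
```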

Bidirectional Associative Memory (BAM): Stable States

Theorem: An asynchronous BAM with an arbitrary weight matrix W reaches a steady state in a finite number of updates.

Proof: The BAM is asynchronous ⇒ a neuron is selected at random from the left or right layer, its excitation is computed, and its state is changed if necessary (the states of the other neurons are not affected).

excitations: b_i = \sum_{k} w_{ik} y_k = (W y')_i for neuron i of the left layer; analogously for the right layer

Bidirectional Associative Memory (BAM)

Suppose neuron i of the left layer has changed ⇒ sgn(x_i) ≠ sgn(b_i) ⇒ x_i was updated to x̃_i = -x_i:

  x_i | b_i | x̃_i - x_i
  -1  | > 0 | > 0
  +1  | < 0 | < 0

⇒ in both cases (x̃_i - x_i) b_i > 0, hence E(x̃, y) - E(x, y) = -\frac{1}{2} (x̃_i - x_i) b_i < 0;
  use the analogous argumentation if a neuron of the right layer has changed
⇒ every update (change of state) decreases the energy function
⇒ since the number of different bipolar vectors is finite, the updates stop after a finite number of steps   q.e.d.

remark: the dynamics of the BAM get stable in a local minimum of the energy function!
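
A minimal sketch of these asynchronous dynamics (Python/NumPy; the function names are illustrative): one randomly selected neuron is updated per step, and the energy from the definition above never increases:

```python
import numpy as np

def bam_energy(W, x, y):
    """E(x, y) = -1/2 * x W y'."""
    return -0.5 * float(x @ W @ y)

def async_bam_step(W, x, y, rng):
    """Update one randomly selected neuron (left or right layer) in place."""
    n, k = W.shape
    i = rng.integers(n + k)
    if i < n:                               # neuron of the left layer
        b = W[i, :] @ y                     # its excitation
        if b != 0:
            x[i] = 1 if b > 0 else -1
    else:                                   # neuron of the right layer
        j = i - n
        a = x @ W[:, j]
        if a != 0:
            y[j] = 1 if a > 0 else -1
    return x, y

rng = np.random.default_rng(0)
W = rng.integers(-2, 3, size=(6, 4)).astype(float)   # arbitrary weight matrix
x = rng.choice([-1, 1], size=6)
y = rng.choice([-1, 1], size=4)

E = bam_energy(W, x, y)
for _ in range(200):
    x, y = async_bam_step(W, x, y, rng)
    E_new = bam_energy(W, x, y)
    assert E_new <= E + 1e-12               # every state change decreases the energy
    E = E_new
print("final energy:", E)
```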

Hopfield Network

special case of BAM, but proposed earlier (1982)

characterization:
● neurons preserve their state until selected at random for update
● n neurons, fully connected
● symmetric weight matrix
● no self-loops (→ zero main diagonal entries)
● thresholds θ_i; neuron i fires if its excitation exceeds the threshold θ_i

energy of state x is
  E(x) = -\frac{1}{2} \sum_{i} \sum_{j} w_{ij} x_i x_j + \sum_{i} \theta_i x_i

transition: select index k at random; the new state x̃ has x̃_i = x_i for i ≠ k and
  x̃_k = sgn(e_k),  where  e_k = \sum_{i} w_{ik} x_i - \theta_k  is the excitation of neuron k
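
The energy and the transition rule translate directly into code; this is a sketch under the definitions above (Python/NumPy, symmetric W with zero diagonal assumed, names illustrative):

```python
import numpy as np

def hopfield_energy(W, theta, x):
    """E(x) = -1/2 * sum_ij w_ij x_i x_j + sum_i theta_i x_i."""
    return -0.5 * float(x @ W @ x) + float(theta @ x)

def hopfield_step(W, theta, x, rng):
    """Select one neuron k at random and set x_k = sgn(e_k)."""
    k = rng.integers(len(x))
    e_k = W[:, k] @ x - theta[k]        # excitation of neuron k
    if e_k != 0:
        x[k] = 1 if e_k > 0 else -1
    return x

def run_hopfield(W, theta, x, steps=1000, seed=0):
    """Iterate asynchronous updates for a fixed number of steps."""
    rng = np.random.default_rng(seed)
    for _ in range(steps):
        x = hopfield_step(W, theta, x, rng)
    return x
```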

Hopfield Network

Theorem: The Hopfield network converges to a local minimum of the energy function after a finite number of updates.  □

Proof: Assume that x_k has been updated, i.e. x̃_k = -x_k and x̃_i = x_i for i ≠ k. Then

  E(x̃) - E(x) = -\frac{1}{2} \sum_{i} \sum_{j} w_{ij} (x̃_i x̃_j - x_i x_j) + \sum_{i} \theta_i (x̃_i - x_i)

and in the last sum all terms with i ≠ k vanish, since x̃_i - x_i = 0 if i ≠ k; it reduces to θ_k (x̃_k - x_k).

Hopfield Network

Likewise, in the double sum all terms with i ≠ k and j ≠ k vanish; collecting the remaining terms with i = k or j = k (rename j to i, recall W = W', w_kk = 0) yields

  E(x̃) - E(x) = -(x̃_k - x_k) \left( \sum_{i} w_{ik} x_i - \theta_k \right) = -(x̃_k - x_k) e_k < 0

with the excitation e_k of neuron k, since (x̃_k - x_k) e_k > 0: e_k > 0 if x_k < 0 (the update flips x_k to +1) and vice versa.

Every state change therefore decreases the energy; as there are only finitely many bipolar states, the network reaches a stable state, a local minimum of the energy function, after finitely many updates.   q.e.d.

Hopfield Network: Application to Combinatorial Optimization

Idea:
● transform the combinatorial optimization problem into an objective function with x ∈ {-1,+1}^n
● rearrange the objective function to look like a Hopfield energy function
● extract weights W and thresholds θ from this energy function
● initialize a Hopfield net with these parameters W and θ
● run the Hopfield net until it reaches a stable state (= local minimizer of the energy function)
● the stable state is a local minimizer of the combinatorial optimization problem

Hopfield Network, Example I: Linear Functions

f(x) = \sum_{i=1}^{n} a_i x_i → min!   with x ∈ {-1,+1}^n

This is already a Hopfield energy function with W = 0 and θ = a.

Evidently: the unique stable state is x_i = -sgn(a_i) for all i
⇒ fixed point reached after Θ(n log n) iterations on average   [proof: → blackboard]
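
As an illustration of the Θ(n log n) behaviour, here is a small simulation sketch (Python/NumPy; it assumes the reconstruction above, i.e. W = 0 and θ = a, and the helper name is made up). The fixed point is reached once every initially wrong neuron has been selected, which is the coupon-collector effect:

```python
import numpy as np

def updates_until_fixed_point(a, rng):
    """Asynchronous Hopfield updates for f(x) = a·x, i.e. W = 0 and theta = a."""
    n = len(a)
    x = rng.choice([-1, 1], size=n)
    target = -np.sign(a)                  # the unique stable state x_i = -sgn(a_i)
    steps = 0
    while not np.array_equal(x, target):  # fixed point not yet reached
        k = rng.integers(n)               # select a neuron at random
        x[k] = target[k]                  # update rule: x_k = sgn(e_k) = -sgn(a_k)
        steps += 1
    return steps

rng = np.random.default_rng(1)
n = 50
a = rng.standard_normal(n)
avg = np.mean([updates_until_fixed_point(a, rng) for _ in range(200)])
print(avg, n * np.log(n))                 # both are on the order of n*log(n)
```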

Hopfield Network, Example II: MAXCUT

given: graph with n nodes and symmetric weights ω_ij = ω_ji, ω_ii = 0, on the edges

task: find a partition V = (V_0, V_1) of the nodes such that the weighted sum of edges with one endpoint in V_0 and one endpoint in V_1 becomes maximal

encoding: for i = 1, ..., n:  y_i = 0 if node i is in set V_0,  y_i = 1 if node i is in set V_1

objective function:
  f(y) = \sum_{i<j} \omega_{ij} ( y_i (1 - y_j) + (1 - y_i) y_j ) → max!

preparations for applying the Hopfield network:
step 1: conversion to a minimization problem
step 2: transformation of variables
step 3: transformation to "Hopfield normal form"
step 4: extract coefficients as weights and thresholds of the Hopfield net

Hopfield Network, Example II: MAXCUT (continued)

step 1: conversion to a minimization problem ⇒ multiply the function by -1 ⇒ E(y) = -f(y) → min!

step 2: transformation of variables ⇒ y_i = (x_i + 1)/2 with x_i ∈ {-1,+1} ⇒

  E(x) = -\frac{1}{2} \sum_{i<j} \omega_{ij} (1 - x_i x_j) = \frac{1}{2} \sum_{i<j} \omega_{ij} x_i x_j - \frac{1}{2} \sum_{i<j} \omega_{ij} → min!

where the last sum is a constant value (does not affect the location of the optimal solution)

Hopfield Network, Example II: MAXCUT (continued)

step 3: transformation to "Hopfield normal form" (the constant term is dropped):

  E(x) = -\frac{1}{2} \sum_{i} \sum_{j} \left( -\frac{\omega_{ij}}{2} \right) x_i x_j + \sum_{i} 0 \cdot x_i

step 4: extract coefficients as weights and thresholds of the Hopfield net:

  w_ij = -\frac{\omega_{ij}}{2} for i ≠ j,  w_ii = 0,  and  θ_i = 0
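
To tie the four steps together, here is a sketch (Python/NumPy; the helper name and the toy graph are made up for illustration) that extracts W and θ as derived above and runs asynchronous Hopfield updates to obtain a locally optimal cut:

```python
import numpy as np

def maxcut_hopfield(omega, steps=2000, seed=0):
    """Local MAXCUT search: w_ij = -omega_ij / 2, theta = 0, asynchronous updates."""
    rng = np.random.default_rng(seed)
    n = omega.shape[0]
    W = -omega / 2.0                        # step 4: weights (diagonal of omega is 0)
    theta = np.zeros(n)                     # step 4: thresholds
    x = rng.choice([-1, 1], size=n)         # random initial bipolar state
    for _ in range(steps):
        k = rng.integers(n)
        e_k = W[:, k] @ x - theta[k]        # excitation of neuron k
        if e_k != 0:
            x[k] = 1 if e_k > 0 else -1
    y = (x + 1) // 2                        # back-transform: y_i = 0 -> V_0, y_i = 1 -> V_1
    cut = sum(omega[i, j] for i in range(n) for j in range(i + 1, n) if y[i] != y[j])
    return y, cut

# toy graph: 4 nodes forming a cycle with unit edge weights
omega = np.array([[0., 1., 0., 1.],
                  [1., 0., 1., 0.],
                  [0., 1., 0., 1.],
                  [1., 0., 1., 0.]])
print(maxcut_hopfield(omega))               # a locally maximal cut and its weight
```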