# Neural Networks: Basic Concepts, Architecture, Operation

## Presentation transcript


2 What is a NN

- A large number of very simple neuron-like processing elements
- A large number of weighted connections between the elements; the weights encode the knowledge of the network
- Highly parallel, distributed control
- An emphasis on learning internal representations automatically

3 Architecture

- Simple processing elements linked together
- Numeric inputs with weights
- A function that computes the output
- The links:
  - Excitatory - positive weight
  - Inhibitory - negative weight

4 Basic Computational Neuron
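The basic computational neuron on this slide can be sketched as a weighted sum of the inputs followed by an activation function. The particular weights, bias, and the choice of a sigmoid activation below are illustrative assumptions, not values from the slides:

```python
import math

def neuron(inputs, weights, bias):
    """One neuron: weighted sum of inputs, then a sigmoid activation."""
    total = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-total))  # sigmoid squashes the sum into (0, 1)

# Example: two inputs, one excitatory (positive) and one inhibitory (negative) weight.
output = neuron([1.0, 0.5], [0.8, -0.4], bias=0.1)
```

A positive weight here plays the excitatory role and a negative weight the inhibitory role described on the previous slide.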

5 Network Layers

- Input layer - nodes that accept input
- Output layer - nodes that deliver output, the result of the network's processing
- Hidden layers - intermediate layers, optional, used for better learning
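The layering above can be sketched by treating each layer as a function that maps the previous layer's outputs to the next; the weights and the sigmoid activation here are illustrative assumptions:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def layer(inputs, weight_rows, biases):
    # One output per node; each row holds that node's incoming weights.
    return [sigmoid(sum(x * w for x, w in zip(inputs, row)) + b)
            for row, b in zip(weight_rows, biases)]

# Input layer -> one hidden layer (2 nodes) -> output layer (1 node).
x = [0.5, -1.0]
hidden = layer(x, [[0.4, 0.3], [-0.2, 0.1]], [0.0, 0.1])
output = layer(hidden, [[0.6, -0.5]], [0.2])
```

The hidden layer is optional, as the slide notes: dropping the `hidden` step and feeding `x` straight into the output layer gives a network with no hidden layer.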


7 Feed-Forward and Recurrent NN

- Feed-forward:
  - Signals travel one way only, from input to output
  - The output of any layer does not affect that same layer
- Recurrent:
  - Signals travel in both directions by introducing loops in the network
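The feed-forward examples above compute outputs in one pass; the loop of a recurrent network can be sketched as a hidden state that feeds back into the next step. The single-unit structure, tanh activation, and weights below are illustrative assumptions:

```python
import math

def recurrent_step(x, h, w_in, w_rec):
    # The previous hidden state h loops back in - this is the "loop"
    # that distinguishes a recurrent net from a feed-forward one.
    return math.tanh(w_in * x + w_rec * h)

h = 0.0  # initial hidden state
for x in [1.0, 0.5, -0.2]:  # a sequence of inputs over time
    h = recurrent_step(x, h, w_in=0.7, w_rec=0.3)
```

In a feed-forward network the same input always yields the same output; here the output at each step also depends on everything seen before, via `h`.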

8 Operation

Task: Find a mapping function between a given set of inputs and a predefined set of outputs

Algorithm:
1. Feed the input
2. Compare the obtained output with the desired output
3. If different, adjust the weights
4. Repeat until there is no difference

How to adjust the weights?
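The four steps above can be sketched as a training loop. The slides do not name a weight-adjustment rule, so this sketch assumes the classic perceptron learning rule as one concrete answer to "how to adjust the weights":

```python
def predict(x, w, b):
    # Step activation: output 1 if the weighted sum exceeds the threshold.
    return 1 if sum(xi * wi for xi, wi in zip(x, w)) + b > 0 else 0

def train(samples, lr=0.1, epochs=100):
    w, b = [0.0] * len(samples[0][0]), 0.0
    for _ in range(epochs):                      # 4) repeat ...
        mistakes = 0
        for x, target in samples:
            error = target - predict(x, w, b)    # 1) feed input, 2) compare outputs
            if error:                            # 3) if different, adjust the weights
                mistakes += 1
                w = [wi + lr * error * xi for wi, xi in zip(w, x)]
                b += lr * error
        if mistakes == 0:                        # ... until there is no difference
            break
    return w, b

# Learn the AND function from input/output examples.
samples = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
w, b = train(samples)
```

After training, `predict` reproduces the AND table; this only works because AND is linearly separable, which is exactly the limitation that motivates hidden layers and back-propagation on the next slide.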

9 Supervised and Unsupervised Learning

- Supervised learning:
  - Feed-forward - by an algorithm implemented in the NN driver
  - Recurrent - using the back-propagation algorithm (quite complicated, not explained here)
- Unsupervised learning:
  - The desired outcome is not known explicitly in advance
  - Feedback from the environment
  - An evaluation function

10 Advantages and Problems

Advantages:
- NNs are robust - even if some inputs are missing, the results will be correct in most cases

Problems:
- How to specify the inputs:
  - Local NN - each input corresponds to an item in our model
  - Distributed NN - a set of inputs corresponds to an item
- How to implement reasoning:
  - Hybrid networks: symbolic networks and neural networks