
1 Data representation techniques for adaptation Alexandra I. Cristea USI intensive course “Adaptive Systems” April-May 2003

2 Overview: Data representation
1. Data or knowledge?
2. Subsymbolic vs. symbolic techniques
3. Symbolic representation
4. Example
5. Subsymbolic representation
6. Example

3 Data or knowledge?
Data for AS often becomes knowledge:
– data < information < knowledge
We divide knowledge representation into:
– Symbolic
– Sub-symbolic

4 Data representation techniques for adaptation
Symbolic AI and knowledge representation, such as:
– Concept Maps
– Probabilistic AI (belief networks); see UM course
Sub-symbolic: machine learning, such as:
– Neural Networks

5 Symbolic Knowledge Representation

6 Symbolic AI and knowledge representation
Static knowledge
– concept mapping
– terminological knowledge
– concept subsumption (inclusion) inference (sketched below)
Dynamic knowledge
– ontological engineering, e.g., temporal representation and reasoning
– planning
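To make subsumption inference concrete, here is a minimal sketch in Python; the tiny is-a taxonomy and the `subsumes` helper are illustrative assumptions, not part of the original slides.

```python
# Minimal sketch of concept subsumption over an is-a hierarchy.
# The taxonomy and function name are illustrative assumptions.

IS_A = {                      # child -> parent ("is-a" links)
    "legume": "plant",
    "grain": "plant",
    "plant": "organism",
    "animal": "organism",
}

def subsumes(general, specific):
    """True if `general` subsumes (includes) `specific` in the hierarchy."""
    while specific is not None:
        if specific == general:
            return True
        specific = IS_A.get(specific)   # climb one is-a link
    return False

print(subsumes("organism", "legume"))   # True: legume is-a plant is-a organism
print(subsumes("animal", "grain"))      # False
```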

7 Concept Maps Example

8 Proposition: Without the industrial chemical reduction of atmospheric nitrogen, starvation would be rampant in third world countries.
[Concept map: Food contains Protein, which includes Essential Amino Acids, made by Plants (such as grains and legumes) and Animals, and is required for human health and survival. Plants require "fixed" nitrogen for growth, produced naturally by symbiotic bacteria and significantly supplemented industrially via the Haber process (atmospheric N2 to NH3, used as fertilizer). Food supply can be increased by agricultural practices (fertilizer, irrigation, pesticides, herbicides, genetics & breeding) and is limited by population growth (predicted by Malthus, 1819), politics, economics, distribution, and climate; deprivation leads to starvation and famine, such as in Eastern Europe, India, and Africa.]

9 Constructing a CM
Brainstorming Phase
Organizing Phase: create groups and sub-groups of related items
Layout Phase
Linking Phase: lines with arrows (see the sketch below)
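As an illustration of what the linking phase produces, a concept map can be stored as labeled (concept, link, concept) triples; the structure and the proposition set below are assumptions drawn loosely from the example map.

```python
# Illustrative sketch: a concept map as (source, link_label, target) triples.
concept_map = [
    ("Food", "contains", "Protein"),
    ("Protein", "includes", "Essential Amino Acids"),
    ("Symbiotic Bacteria", "produce", "'Fixed' Nitrogen"),
    ("'Fixed' Nitrogen", "required for growth of", "Plants"),
]

# List all propositions mentioning a given concept.
for src, link, dst in concept_map:
    if "Nitrogen" in src or "Nitrogen" in dst:
        print(f"{src} --{link}--> {dst}")
```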

10 Reviewing the CM
Accuracy and Thoroughness
– Are the concepts and relationships correct? Are important concepts missing? Are any misconceptions apparent?
Organization
– Was the concept map laid out so that higher-order relationships are apparent and easy to follow? Does it have a representative title?
Appearance
– Spelling, etc.?
Creativity

11 Sub-symbolic knowledge representation

12 Subsymbolic systems aim at human-like information processing: learning from examples, context sensitivity, generalization, robustness of behaviour, and intuitive reasoning

13 Some notes on NNs: Example

14 Why NN?
To learn how our brain works (!!)
High computation rate technology
Intelligence
User-friendliness

15 Applications vs. Why NNs?

16 Applications / Why NNs?

17 Man-machine hardware comparison

18 Man-machine information processing

19 What are humans good at, and machines not?
Humans:
– pattern recognition
– reasoning with incomplete knowledge
Computers:
– precise computing
– number crunching


21 The Biological Neuron

22 (very small) Biological NN

23 Purkinje cell

24 Spike (width 0.2–5 ms)

25 Firing
Resulting signal:
– Excitatory: encourages firing of the next neuron
– Inhibitory: discourages firing of the next neuron

26 What does a neuron do?
Sums its inputs
Decides whether to fire with respect to a threshold
But: limited capacity:
– A neuron cannot fire all the time
– Refractory period: 10 ms – minimum time before it can fire again
– So: max. firing frequency = 1 / 10 ms = 100 spikes/sec

27 Hebbian learning rule (1949)
If neuron A repeatedly and persistently contributes to the firing of neuron B, then the connection between A and B gets stronger.
If neuron A does not contribute to the firing of neuron B for a long period of time, then the connection between A and B becomes weaker.
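A minimal sketch of the Hebbian rule as a weight update, assuming a simple rate-based model; the learning rate `eta` and the decay term used for the weakening case are illustrative choices, not from the slides.

```python
# Hebbian update sketch: strengthen w when pre- and post-synaptic
# activity coincide; slowly decay w when they do not (illustrative model).
def hebbian_update(w, pre, post, eta=0.1, decay=0.01):
    if pre > 0 and post > 0:         # A contributes to B's firing
        return w + eta * pre * post  # connection gets stronger
    return w * (1 - decay)           # connection slowly weakens

w = 0.5
w = hebbian_update(w, pre=1.0, post=1.0)   # -> 0.6
w = hebbian_update(w, pre=1.0, post=0.0)   # -> 0.594
print(w)
```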

28 Different size synapses

29 Summarizing
A neuron doesn't fire if the accumulated activity is below the threshold
If the activity is above the threshold, the neuron fires (produces a spike)
Firing frequency increases with accumulated activity until the max. firing frequency is reached

30 The ANN

31 The Artificial Neuron
[Diagram: Input → Output. Functions: inside: summation over the synapses; outside: output function f, with θ = threshold.]

32 An ANN
[Diagram: Input → Layer 1 → Layer 2 → Layer 3 → Output; the network as a whole is a black box.]

33 Let’s look in the Black Box!

34 NEURON LINK
w: weight
neuron 1 (value v1) → neuron 2 (value v2 = w · v1)

35 ANN
Pulse train – average firing frequency ≥ 0
Model of synapse (connecting element):
– real number w > 0: excitatory
– real number w < 0: inhibitory
N(i) – set of neurons that have a connection to neuron i
– j ∈ N(i)
– w_ij – weight of the connection from j to i

36 Neuron computation
Inputs v1…vn with weights w1…wn; output O.
S = Σ_{i=1..n} v_i · w_i − b (internal activation function)
O = f(S) (external activation function)
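A minimal sketch of this computation in Python; the particular inputs, weights, and the step threshold chosen for f are illustrative assumptions (the step function itself is option 2 on the next slide).

```python
# Single-neuron computation: S = sum(v_i * w_i) - b, then O = f(S).
def neuron(values, weights, b, f):
    s = sum(v * w for v, w in zip(values, weights)) - b  # internal activation
    return f(s)                                          # external activation

step = lambda z: 1 if z > 0 else 0        # discrete neuron (slide 37)
print(neuron([1.0, 0.5], [0.8, -0.4], b=0.3, f=step))   # S = 0.3, O = 1
```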

37 Typical input-output relation f
1. Standard sigmoid fct.: f(z) = 1 / (1 + e^(−z))
2. Discrete neuron: fires at max. speed, or does not fire: x_i ∈ {0, 1}; f(z) = 1 if z > 0, else 0

38 Other I-O functions f
3. Linear neuron: f(z) = z; output x_i = z_i − θ = …
4. Stochastic neuron: x_i ∈ {0, 1}; output 0 or 1; input z_i = Σ_j w_ij v_j − θ_i; probability that the neuron fires: f(z_i); probability that it doesn't fire: 1 − f(z_i)
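The four input-output functions from slides 37 and 38, sketched in Python; taking the sigmoid as the firing probability of the stochastic neuron is an illustrative assumption.

```python
import math
import random

def sigmoid(z):                      # 1. standard sigmoid
    return 1.0 / (1.0 + math.exp(-z))

def discrete(z):                     # 2. fires at max. speed or not at all
    return 1 if z > 0 else 0

def linear(z):                       # 3. linear neuron
    return z

def stochastic(z):                   # 4. fires with probability f(z)
    return 1 if random.random() < sigmoid(z) else 0

for f in (sigmoid, discrete, linear, stochastic):
    print(f.__name__, f(0.5))
```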

39 Feedforward NNs

40 Recurrent NNs

41 Summarizing ANNs
Feedforward network, layered:
– no connections from outputs back toward inputs, neither between layers nor at the neuron level (a minimal forward pass is sketched below)
Recurrent network:
– anything is allowed – cycles, etc.
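A minimal sketch of a layered feedforward pass, where activity only flows from input to output; all sizes, weights, and biases are arbitrary illustrations.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def layer(inputs, weights, biases):
    """One feedforward layer: each neuron computes f(sum(w*v) - b)."""
    return [sigmoid(sum(w * v for w, v in zip(ws, inputs)) - b)
            for ws, b in zip(weights, biases)]

x = [1.0, 0.0]                                       # input layer
h = layer(x, [[0.5, -0.3], [0.8, 0.2]], [0.1, 0.0])  # hidden layer, 2 neurons
y = layer(h, [[1.0, -1.0]], [0.2])                   # output layer, 1 neuron
print(y)
```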

