1 CS344: Introduction to Artificial Intelligence Pushpak Bhattacharyya CSE Dept., IIT Bombay Lecture 31 and 32– Brain and Perceptron

2 The human brain Seat of consciousness and cognition Perhaps the most complex information processing machine in nature Historically, considered as a monolithic information processing machine

3 Beginner’s Brain Map Forebrain (Cerebral Cortex): Language, maths, sensation, movement, cognition, emotion Cerebellum: Motor Control Midbrain: Information Routing; involuntary controls Hindbrain: Control of breathing, heartbeat, blood circulation Spinal cord: Reflexes, information highways between body & brain

4 Brain: a computational machine? Information processing: brains vs. computers  brains better at perception / cognition  slower at numerical calculations  parallel and distributed processing  associative memory

5 Brain: a computational machine? (contd.) Evolutionarily, the brain has developed algorithms most suitable for survival Algorithms unknown: the search is on The brain is astonishing in the amount of information it processes –Typical computers: 10^9 operations/sec –Housefly brain: 10^11 operations/sec

6 Brain facts & figures Basic building block of nervous system: nerve cell (neuron) ~ 10^12 neurons in the brain ~ 10^15 connections between them Connections made at “synapses” The speed: events on the millisecond scale in neurons, nanosecond scale in silicon chips

7 Neuron - “classical” Dendrites –Receiving stations of neurons –Don't generate action potentials Cell body –Site at which received information is integrated Axon –Generates and relays the action potential –Terminal relays information to the next neuron in the pathway

8 Computation in Biological Neuron Incoming signals from synapses are summed up at the soma, the biological “inner product” On crossing a threshold, the cell “fires”, generating an action potential in the axon hillock region (Figure: synaptic inputs, artist’s conception)

9 Symbolic AI Connectionist AI is contrasted with Symbolic AI Symbolic AI rests on the Physical Symbol System Hypothesis: every intelligent system can be constructed by storing and processing symbols, and nothing more is necessary. Symbolic AI has a bearing on models of computation such as the Turing Machine, the Von Neumann Machine, and the lambda calculus

10 Turing Machine & Von Neumann Machine

11 Challenges to Symbolic AI Motivation for challenging Symbolic AI: a large number of computations and information-processing tasks that living beings are comfortable with are not performed well by computers! The differences:

Brain computation in living beings    | TM computation in computers
Pattern recognition                   | Numerical processing
Learning oriented                     | Programming oriented
Distributed & parallel processing     | Centralized & serial processing
Content addressable                   | Location addressable

12 Neural Computation

13 Some observations on the brain (Ray Kurzweil, The Singularity is Near, 2005) Machines will be able to out-think people within a few decades But the brain arose through natural selection It contains layers of systems that arose for one function and were then adopted for another, even if they do not work perfectly

14 Difference between brain and computers Highly efficient use of energy in the brain High adaptability Tremendous amount of compression: space is at a premium in the cranium One cubic centimeter of human brain tissue contains –50 million neurons –Several hundred miles of axons, the “wires” for transmitting signals –Close to a trillion synapses, the connections between neurons

15 Immense memory capacity 1 cc contains 1 terabyte of information About 1000 cc makes up the whole brain So about 1 million gigabytes, or 1 petabyte, of information The entire archived content of the internet is 3 petabytes

16 Moore’s law Storage capacity doubles every year A single computer the size of the brain will contain a petabyte of information by 2030 Question mark: power consumption?

17 Power issues By 2025, the memory of an artificial brain will use nearly a gigawatt of power: the amount currently consumed by the whole of Washington, DC By contrast, the brain uses only 12 watts of power, less than the energy used by a typical refrigerator light

18 Brain vs. computer’s processing Associative memory vs. addressable memory Parallel Distributed Processing (PDP) vs. serial computation Fast responses to complex situations vs. precisely repeatable steps Preference for approximations and “good enough” solutions vs. exact solutions Mistakes and biases vs. cold logic

19 Brain vs. computers (contd.) Excellent pattern recognition vs. excellent number crunching Emotion, the brain’s steersman, assigning values to experiences and future possibilities vs. the computer being insensitive to emotions Evaluating potential outcomes efficiently and rapidly when information is uncertain vs. the “garbage in, garbage out” situation

20 Perceptron

21 The Perceptron Model A perceptron is a computing element with input lines having associated weights and the cell having a threshold value. The perceptron model is motivated by the biological neuron. (Figure: inputs x1, …, xn-1, xn with weights w1, …, wn-1, wn feeding a cell with threshold θ; output y)

22 Step function / threshold function: y = 1 for Σ wi xi >= θ; y = 0 otherwise. (Figure: plot of y against Σ wi xi, a step from 0 to 1 at θ)
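The step rule above can be sketched in a few lines of Python (a minimal illustration; the function and variable names are mine, not from the lecture):

```python
def perceptron(weights, theta, inputs):
    """Fire (return 1) iff the weighted input sum reaches the threshold."""
    net = sum(w * x for w, x in zip(weights, inputs))
    return 1 if net >= theta else 0

# The output jumps discontinuously from 0 to 1 as net crosses theta.
print(perceptron([1.0, 1.0], 1.5, (1, 0)))  # net = 1.0 < 1.5  -> 0
print(perceptron([1.0, 1.0], 1.5, (1, 1)))  # net = 2.0 >= 1.5 -> 1
```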

23 Features of Perceptron Input-output behavior is discontinuous and the derivative does not exist at Σ wi xi = θ Σ wi xi − θ is the net input, denoted net Referred to as a linear threshold element - linearity because x appears with power 1 y = f(net): the relation between y and net is non-linear

24 Computation of Boolean functions AND of 2 inputs:

x1  x2 | y
0   0  | 0
0   1  | 0
1   0  | 0
1   1  | 1

The parameter values (weights & threshold) need to be found. (Figure: a perceptron with inputs x1, x2, weights w1, w2, and threshold θ producing output y)

25 Computing parameter values w1*0 + w2*0 = 0 < θ; since y=0 w1*0 + w2*1 = w2 < θ; since y=0 w1*1 + w2*0 = w1 < θ; since y=0 w1*1 + w2*1 = w1 + w2 >= θ; since y=1 For example, w1 = w2 = 1 and θ = 1.5 satisfy these inequalities, giving parameters that compute the AND function.
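A candidate solution can be checked by evaluating the unit on all four input pairs. The sketch below uses w1 = w2 = 1 and θ = 1.5, one illustrative choice satisfying the inequalities; other choices work too:

```python
def perceptron(weights, theta, inputs):
    """Fire (return 1) iff the weighted input sum reaches the threshold."""
    net = sum(w * x for w, x in zip(weights, inputs))
    return 1 if net >= theta else 0

# w1 = w2 = 1 and theta = 1.5 satisfy all four inequalities,
# so the unit reproduces the AND truth table exactly.
for x1 in (0, 1):
    for x2 in (0, 1):
        assert perceptron([1, 1], 1.5, (x1, x2)) == (x1 and x2)
print("AND verified")
```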

26 Other Boolean functions OR can be computed using the values w1 = w2 = 1 and θ = 0.5 The XOR function gives rise to the following inequalities: w1*0 + w2*0 = 0 < θ w1*0 + w2*1 = w2 >= θ w1*1 + w2*0 = w1 >= θ w1*1 + w2*1 = w1 + w2 < θ No set of parameter values satisfies these inequalities: the second and third give w1 + w2 >= 2θ > θ (since θ > 0), contradicting the fourth.
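The contradiction can also be seen empirically: a brute-force search over a grid of candidate parameters (a sketch with an arbitrarily chosen grid, not the lecture's method) finds no single perceptron that reproduces XOR:

```python
import itertools

def fires(w1, w2, theta, x1, x2):
    """Single perceptron on two inputs: 1 iff w1*x1 + w2*x2 >= theta."""
    return 1 if w1 * x1 + w2 * x2 >= theta else 0

XOR = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0}

grid = [i / 2 for i in range(-8, 9)]   # candidate values -4.0 ... 4.0, step 0.5
solutions = [
    (w1, w2, t)
    for w1, w2, t in itertools.product(grid, repeat=3)
    if all(fires(w1, w2, t, *x) == y for x, y in XOR.items())
]
print(solutions)   # -> [] : no parameter setting computes XOR
```

XOR is not linearly separable, so the search comes up empty no matter how the grid is refined.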

27 Threshold functions

n | #Boolean functions (2^(2^n)) | #Threshold functions (<= 2^(n^2))
1 |    4 |    4
2 |   16 |   14
3 |  256 |  128
4 |  64K | 1008

Functions computable by perceptrons are the threshold functions. #TF becomes negligibly small compared to #BF for larger values of n. For n=2, all functions except XOR and XNOR are computable.
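The n=2 row of the table can be reproduced by enumerating every truth table a single threshold unit can realize over a small parameter grid (an illustrative sketch; the grid is my assumption, chosen fine enough to reach all 14 separable functions):

```python
import itertools

def fires(w1, w2, theta, x1, x2):
    """Single perceptron on two inputs: 1 iff w1*x1 + w2*x2 >= theta."""
    return 1 if w1 * x1 + w2 * x2 >= theta else 0

inputs = [(0, 0), (0, 1), (1, 0), (1, 1)]
grid = [i / 2 for i in range(-8, 9)]   # candidate w1, w2, theta values

# Collect every distinct truth table achievable by some (w1, w2, theta).
achievable = set()
for w1, w2, t in itertools.product(grid, repeat=3):
    achievable.add(tuple(fires(w1, w2, t, *x) for x in inputs))

print(len(achievable))   # -> 14 of the 16 Boolean functions of 2 inputs
```

The two missing truth tables are exactly XOR (0,1,1,0) and XNOR (1,0,0,1), matching the table above.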

