A probabilistic approach to cognition


1 The Bayesian Brain: A probabilistic approach to cognition

3 Background
- Computational/theoretical neuroscience: describe the brain in the language of mathematics (as we do with the universe)
- Beyond giving a better explanation, could a (shared) mathematical description also bridge the brain/mind dichotomy?
- Could such a description also relate brains to machines? (Neuroscience in engineering departments?)
- The engineering approach to intelligence: the brain as a machine
- The brain as a computational device: what are the computations?
- Marr and his three levels of studying the mind/brain
- Lots of data available out there: try to find structure and a causal model of the world (making sense of data)
- AI: give instructions vs. give data (program vs. train)
- The Renaissance of Artificial Intelligence (three reasons)

4 The Bayesian approach
- The Bayesian brain: a possible mathematical framework for how the brain works as a probabilistic inference machine (1990s)
- Helmholtz (born 1821): the brain as a device making unconscious inferences about the world, inferring the state of the world from noisy and incomplete data
- Bayes (born circa 1701) and his theorem
- Probability theory: a system with no memory; conditional probabilities (e.g. married with kids); the probability of A and B both being true; Venn diagrams (e.g. dice rolling)
- Diagnosed with a rare disease: the presence of the disease can be checked with a medical test that is correct 90% of the time. If you take the test and it comes back positive, what is the probability that you have the disease?
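The rare-disease question on this slide can be worked through directly with Bayes' theorem. A minimal sketch (the function name is mine, not from the slides), using the 90%-accurate test from this slide and the 1/100 prevalence given later in the deck:

```python
def posterior_disease(prevalence, accuracy):
    """P(disease | positive test) via Bayes' theorem."""
    p_pos_given_d = accuracy          # true positive rate
    p_pos_given_not_d = 1 - accuracy  # false positive rate
    evidence = (p_pos_given_d * prevalence
                + p_pos_given_not_d * (1 - prevalence))
    return p_pos_given_d * prevalence / evidence

p = posterior_disease(prevalence=1/100, accuracy=0.9)
print(p)  # 0.009 / 0.108 = 1/12, i.e. about 0.083
```

Despite the positive result, the low prior (1/100) dominates the strong likelihood, which is the point the deck's "9/108" answer makes.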

5 Venn diagram

8 Bayes’ Law/Theorem

14 Bayes' law makes sense and we use it every day to make (probabilistic) inferences:
- Is this person holding Protopapas' book a PMS student?
- How old am I? (sensory data is enough)
- How old is my mom? (need a prior)
- Familiar vs. unfamiliar song in the shower
- Coin flipping (HHHHH vs. HTTHT)
- Gestalt psychology
- The problem of forgetting priors (Tom the plumber/mathematician)
- The problem of strong priors: Bayes & Freud
- Examples from visual perception
- Ball-in-a-box video
- The rare disease problem (test is 90% correct, disease prevalence is 1/100) (9/108)
- The Monty Hall example (2/3 win if you switch)
- The cookie problem (box 1: 30 vanilla/10 chocolate; box 2: 20/20; you take one and it is vanilla; P it came from box 1?) (3/5)
- The m&m problem ('95 mix change: green 10% -> 20%, yellow 20% -> 14%; one '94 bag and one '96 bag; P that the yellow came from the '94 bag?) (20/27)
- Elvis Presley (P his dead twin brother was identical? 1/125 fraternal, 1/300 identical) (5/11)
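The numerical answers given in parentheses on this slide can all be checked with one discrete Bayes update (prior times likelihood, renormalised). A sketch with my own helper name, using the problem set-ups as stated on the slide:

```python
def bayes_update(priors, likelihoods):
    """Posterior over hypotheses: prior * likelihood, renormalised."""
    unnorm = [p * l for p, l in zip(priors, likelihoods)]
    total = sum(unnorm)
    return [u / total for u in unnorm]

# Cookie problem: box 1 has 30 vanilla/10 chocolate, box 2 has 20/20; drew vanilla.
cookie = bayes_update([0.5, 0.5], [30/40, 20/40])  # P(box 1) = 3/5

# m&m problem: one '94 bag, one '96 bag; one yellow and one green drawn.
# H1: yellow from '94 (yellow 20%) and green from '96 (green 20%).
# H2: yellow from '96 (yellow 14%) and green from '94 (green 10%).
mm = bayes_update([0.5, 0.5], [0.20 * 0.20, 0.14 * 0.10])  # P(H1) = 20/27

# Elvis: identical twins 1/300, fraternal 1/125; an identical twin is
# certainly a brother, a fraternal twin is a brother half the time.
elvis = bayes_update([1/300, 1/125], [1.0, 0.5])  # P(identical) = 5/11
```

Each answer matches the fraction quoted on the slide.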

18 A very strong prior compensates for a weak likelihood (the boyfriend/girlfriend example)

32 Probability vs. Likelihood
- The number that is the probability of some observed outcomes given a set of parameter values is regarded as the likelihood of that set of parameter values given the observed outcomes
- The likelihood is about the stimulus (parameter/hypothesis): what are the likely stimuli that could give rise to this activation (data)?
- Likelihood is also referred to as inverse probability, as well as unnormalised probability (likelihoods need not add to 1)
- Probability attaches to possible results, whereas likelihood attaches to hypotheses about how these data came about
- Sometimes people write P(D|H) = L(H|D)
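The distinction can be made concrete with a binomial model: the same formula read two ways. A sketch assuming ten coin flips (the grid of parameter values is my own illustration):

```python
from math import comb

def binom_pmf(k, n, theta):
    """P(k successes in n trials) when the success probability is theta."""
    return comb(n, k) * theta**k * (1 - theta)**(n - k)

# Read as a probability: theta fixed at 0.5, outcomes vary -- sums to 1.
probs = [binom_pmf(k, 10, 0.5) for k in range(11)]

# Read as a likelihood: data fixed (7 heads in 10 flips), theta varies --
# these values need not sum to 1 over the parameter grid.
thetas = [i / 10 for i in range(11)]
likes = [binom_pmf(7, 10, t) for t in thetas]
```

On this grid the likelihood peaks at theta = 0.7, the maximum-likelihood value, while its sum over theta is not 1, illustrating "unnormalised probability".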

39 In perception, we can view the stimulus as the (inverse) function of the (noisy) sensory input and make probabilistic inferences about the former with the aid of priors acquired from experiencing the statistical regularities of the world.

40 Bayesian approach to perception: make inferences from noisy sensory input
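For Gaussian sensory noise and a Gaussian prior, this inference has a closed form: the posterior mean is a precision-weighted average of the prior mean and the measurement. A sketch (the helper name and the example numbers are mine):

```python
def gaussian_posterior(prior_mu, prior_var, obs, obs_var):
    """Combine a Gaussian prior with a Gaussian likelihood.

    The posterior mean weights each source by its precision
    (inverse variance); the posterior variance shrinks below both.
    """
    w_prior = 1 / prior_var
    w_obs = 1 / obs_var
    mu = (w_prior * prior_mu + w_obs * obs) / (w_prior + w_obs)
    var = 1 / (w_prior + w_obs)
    return mu, var

# A broad prior (variance 4) and a reliable measurement (variance 1):
# the estimate lands nearer the measurement, at 8.0.
mu, var = gaussian_posterior(prior_mu=0.0, prior_var=4.0, obs=10.0, obs_var=1.0)
```

If the sensory input were noisier (larger `obs_var`), the same formula would pull the estimate back towards the prior, which is how the deck's "strong prior vs. weak likelihood" examples cash out numerically.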

46 Bayesian estimation steps
- Maximum Likelihood Estimation (MLE): used when we do not have any priors about our hypothesis
- Bayesian estimation takes the priors into account and gives us a posterior distribution
- The Maximum A Posteriori (MAP) estimate is the mode of the posterior distribution
- The final estimate might also take into account a cost function that biases the estimate towards avoiding cost (similar to SDT)
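These steps can be illustrated with the coin-flipping example from earlier in the deck, using a Beta prior, whose posterior mode has a closed form. A sketch under those assumptions (function names mine):

```python
def mle_coin(heads, flips):
    """Maximum-likelihood estimate of the heads probability."""
    return heads / flips

def map_coin(heads, flips, a, b):
    """MAP estimate with a Beta(a, b) prior: the mode of the
    Beta(heads + a, flips - heads + b) posterior."""
    return (heads + a - 1) / (flips + a + b - 2)

print(mle_coin(7, 10))        # 0.7 -- likelihood alone
print(map_coin(7, 10, 5, 5))  # 11/18, about 0.611 -- prior pulls towards 0.5
print(map_coin(7, 10, 1, 1))  # 0.7 -- uniform prior: MAP equals the MLE
```

The last line previews the point made a few slides later: with a uniform prior, all that is left is the likelihood.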

51 If our prior distribution is uniform, then all we have in hand is the likelihood…

52 Predictive coding
Our common model of the visual system consists of:
- Several stages hierarchically organised, from the retina onwards
- In this feedforward sweep, the signal is sensory information that travels bottom-up
- This information gets transformed into increasing levels of complexity and abstraction at higher processing stages
- There is also top-down (modulatory?) feedback from higher to lower processing stages (and there is also functional specialisation)

Predictive coding is a Bayesian-inspired alternative:
- The hierarchical structure remains
- Higher stages make predictions about the incoming sensory information; these predictions travel top-down
- They are compared with the actual sensory information at each stage
- What travels bottom-up is the error between the predicted and the actual signal (which also sounds economical)
- The brain thus only codes changes/surprise = information (in the information-theoretic sense)
- What we see is the result of what we expect to see
- (Veridical) perception can be seen as "controlled hallucination"! (A. Clark, BBS 2013 review)
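The error-forwarding idea can be caricatured in a few lines: a single stage keeps revising its top-down prediction using only the bottom-up prediction error. A toy sketch of the scheme, not a model of any real circuit (names and the learning rate are mine):

```python
def predictive_coding_step(prediction, sensory_input, learning_rate=0.1):
    """One update: only the prediction error travels 'bottom-up',
    and the top-down prediction is revised to reduce it."""
    error = sensory_input - prediction
    return prediction + learning_rate * error, error

prediction = 0.0
for _ in range(50):
    prediction, error = predictive_coding_step(prediction, sensory_input=1.0)
# The prediction converges towards the input, and the error -- the only
# signal sent upwards -- shrinks towards zero: an unsurprising world is cheap.
```

Once the input matches the prediction, almost nothing travels bottom-up, which is the economy (and the "coding only surprise") the slide describes.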

