
Slide 1: CIS 530 / 730: Artificial Intelligence — Lecture 41 of 42, Wednesday, 10 December 2008. William H. Hsu, Department of Computing and Information Sciences, Kansas State University. KSOL course page: http://snipurl.com/v9v3. Course web site: http://www.kddresearch.org/Courses/Fall-2008/CIS730. Instructor home page: http://www.cis.ksu.edu/~bhsu. Reading for next class: Chapters 1-14, 22-23, 26, Russell & Norvig, 2nd edition. Topic: Philosophy of Mind. Discussion: Final Exam Review.

Slide 2: PHILOSOPHY OF MIND. © 2006 Hilary Greaves, http://www.rci.rutgers.edu/~hgreaves/teaching/phil103/lectures.htm

Slide 3: The Two Central Problems in the Philosophy of Mind. The Problem of Other Minds: How can I know that other minds exist — minds with thoughts, feelings, sensory experiences, etc.?

Slide 4: The Problem of Other Minds (cont'd). How can I know what is going on in other minds — for example, whether someone is afraid, or looking forward to next summer's holiday?

Slide 5: The Mind-Body Problem. How are minds and their contents related to the physical, chemical, and biological world?

Slide 6: Two (apparently unique) aspects of mental phenomena. Consciousness: your mind is conscious, but as far as we know, ordinary bits of physical matter are not. "What is consciousness?" — You know!

Slide 7: Intentionality (or "aboutness"). Mental states can be about other things in the world. Your thought that Bush is a jerk is about George Bush.

Slide 8: Intentionality/aboutness. Ordinary physical objects aren't about anything: the desk is not about the chair.

Slide 9: Intentionality/aboutness. (A painting can be about something...

Slide 10: Intentionality/aboutness. ...but that's only because it was painted by someone who was thinking about the thing he was painting.)

Slide 11: Terminological note. "Intentionality" has no special connection to intentions. Mental states that have "aboutness" (= "intentionality"): Bush's knowledge that Saddam Hussein is alive is about Saddam; Tom's fear of the dentist is about dentists; my desire that Australia will be fun is about Australia; your intention to eat dinner this evening is about your dinner.

Slide 12: Intentionality/aboutness. The challenge: an adequate explanation of what minds are should explain how mental goings-on can be about things, since ordinary physical goings-on do not have this feature.

Slide 13: Why explaining "aboutness" is difficult. We can have thoughts about things that exist, yet there is no such thing as an "intentional string" connecting the thought to its object. The asymmetry problem: aboutness can't be resemblance.

Slide 14: An attempt to explain what "aboutness" is. My thought is about Bush = my thought was "caused in the right sort of way" by Bush (a causal chain running from Bush to my thought).

Slide 15: Why the causal account doesn't work. We can also think about things that don't exist: "It would be fun to ride a unicorn" — what causal chain could there be?

Slide 16: Cartesian Dualism. Minds and matter are two fundamentally different kinds of "substances": mental stuff (minds) — thoughts, feelings, sensory experiences, consciousness, intentionality — and physical stuff (matter).

Slide 17: Problems with Cartesian Dualism (I). It's mysterious what the "mind substance" is; we only know what it's not: not made of matter, not located in space, not "extended"...

Slide 18: Problems with Cartesian Dualism (II). It's a mystery how minds can interact causally with things made of matter. Mental causing physical: my decision "I'm going to raise my arm" (mental substance) causes my arm (matter) to rise.

Slide 19: Problems with Cartesian Dualism (II), cont'd. Physical causing mental: sound waves cause a sound experience.

Slide 20: Problems with Cartesian Dualism (III). Are dualism and gradual evolution consistent?

Slide 21: Problems with Cartesian Dualism (IV). It makes the Other Minds problem very hard to solve.

Slide 22: "Materialist" solutions to the Mind-Body Problem. Materialism: the claim that everything in the universe is made up of matter.

Slide 23: The advance of materialism. The advance of science has made materialism look very plausible to many philosophers and scientists. Astronomy: the heavenly bodies are made of matter and obey the laws of physics — no special "heavens" distinct from the earth.

Slide 24: The advance of materialism. Evolutionary biology: no non-material processes or forces (e.g., God) are needed to explain the design in the biological world.

Slide 25: The advance of materialism. Advances in physiology and in understanding the genetic code: there is no need for a "life force" ("élan vital") to explain the difference between a live tiger and a dead one.

Slide 26: The advance of materialism. Advances in understanding how the brain works — helped by imaging technology — make it increasingly plausible that mental phenomena like thought and consciousness might be explained in terms of brain processes.

Slide 27: The "Reductive Materialism" hypothesis: the social sciences can be reduced to (i.e., explained by appeal to) psychology; psychology can be reduced to biology; biology can be reduced to chemistry; chemistry can be reduced to physics.

Slide 28: Behaviorism. The way we tell that (e.g.) you're in pain is by observing your behavior. The inspiration for behaviorism: maybe what it means to say you're in pain also involves your behavior.

Slide 29: "Behaviorism" in psychology. Methodological (or psychological) behaviorists included B. F. Skinner. Methodological behaviorism: if there is anything more to mental states (e.g., "I am feeling angry") than dispositions to behave in certain ways, that "extra bit" has nothing to do with science.

Slide 30: "Behaviorism" in philosophy. Analytical (or philosophical) behaviorists included Ludwig Wittgenstein and Gilbert Ryle: there is nothing more — being angry just equals the behavioral dispositions.

Slide 31: Analytical behaviorism. Claims about mental states and processes can be "translated" into claims about patterns of behavior. Example: "Tom has a toothache" = "Tom moans; Tom says his tooth hurts; if someone touches Tom's tooth, Tom screams; etc."

Slide 32: Analytical behaviorism (exercises). "Jenny is hungry" = ?? "Jason wishes he could quit school" = ??

Slide 33: Behaviorism and the Problem of Other Minds. If "I am feeling angry" simply equals a pattern of observable behavior, the Other Minds problem dissolves: we can see the behavior.

Slide 34: Problems for behaviorism (I): Undefinable mental states. Some mental states don't seem to be definable in this way, e.g., listening to Bob Dylan, or thinking about how big the universe is.

Slide 35: Problems for behaviorism (II): Circular definitions. Behaviorist "definitions" of mental state terms all turned out to be circular (or just plain wrong). "Tom believes it will rain today" = "Tom will either stay at home or drive to school today" IF Tom wants to stay dry, and Tom doesn't believe there's a shelter at the bus stop, and... (the definition ends up appealing to further mental states).

Slide 36: Problems for behaviorism (II): Circular definitions (exercise). "James believes the exam will be hard" = ...??

Slide 37: Problems for behaviorism (III): The "inverted spectrum" problem. It seems possible for one person to have their spectrum of color experiences "inverted" relative to another's, even though both say "the flower looks yellow." But according to behaviorism, this is not possible.

Slide 38: Problems for behaviorism (IV): Some wrong predictions. For behaviorists, two people who behaved in just the same way would have the same mental states; but there are cases in which this is clearly crazy. Dennett's thought experiment: curare plus "amnestic." Case 1: administer a general anaesthetic, perform the operation, then ask "How did it feel?"

Slide 39: Problems for behaviorism (IV), cont'd. "Curare" paralyses all voluntary muscles. "Amnestic" has no effect until two hours after ingestion, whereupon it wipes out all memory of those two hours. Case 2: administer curare + amnestic, perform the operation, then ask "How did it feel?"

Slide 40: The problem (an argument against behaviorism). (P1) Behaviorism predicts that the patient who is given general anaesthetic has the same experiences as the patient who is given curare + amnestic. (P2) But that's wrong! One is unconscious, and the other is in excruciating pain. Therefore, (C) behaviorism is false.

Slide 41: The (Type-Type) Identity Theory. The type-token distinction: Greaves's belief that snow is white and your belief that snow is white are two tokens of the same type.

Slide 42: The (Type-Type) Identity Theory. The type-type identity theory: mental state types are identical with brain state types. Examples of mental state types: the belief that snow is white; a burning pain in the index finger; the thought that 17 is a prime number. Example of a type-type identification: "Pain is C-fibers firing" (pain and C-fiber firing are the same event type).

Slide 43: How the theory deals with the Other Minds Problem...??

Slide 44: Problem for the (Type) Identity Theory: "Chauvinism" about the mental. According to the type-type identity theory: animals with brains significantly different from ours can't feel pain or have other mental states; if there are organisms in the universe whose chemical composition is different from ours, they can't feel pain or pleasure, and they can't think; extra-terrestrials can't even think about math (e.g., that 7 x 5 = 35).

Slide 45: Functionalism (aka "machine functionalism"). The emergence of computers and the computer model of the mind. Computers are symbol manipulators; programs specify how the symbols are to be manipulated (e.g., an adding program takes inputs 2 and 3 and outputs 5).

Slide 46: Programs and physical devices. Many different sorts of physical devices can run the same program...

Slide 47: Minds and programs. Minds are to brains as programs are to computers: "the mind is the brain's program." A program maps inputs (mouse clicks, keyboard strokes) to outputs (screen displays, printouts); the mind maps inputs (light rays, hammers hitting a thumb) to outputs (body movements, screams, sentences spoken).

Slide 48: "Functional states." Mental state concepts are functional concepts; mental states are functional states (not physical states). Functional states for the adding program: after input 2, the machine is in the state "remember that a 2 has been input, and get ready to add a 2nd number"; after input 3, it is in the state "ready to do an add calculation"; it then produces output 5. (A hedged code sketch of this state machine follows.)
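A minimal sketch, not from the original slides, of the adding program as a tiny state machine. The class name `AddingMachine` and its state labels are assumptions made for illustration; the point is that each state is individuated purely by its functional role (what it does with inputs and which state it moves to), not by what physically realizes it.

```python
# Hypothetical sketch: the slide's "adding program" as a two-state machine.
# Each state is defined by its role in the input/state/output pattern,
# which is exactly what the functionalist means by a "functional state".

class AddingMachine:
    def __init__(self):
        self.state = "ready"       # 'ready to do an add calculation'
        self.remembered = None     # filled in when the first number arrives

    def receive(self, number):
        if self.state == "ready":
            # Functional state: 'remember that a number has been input,
            # and get ready to add a 2nd number'
            self.remembered = number
            self.state = "awaiting_second"
            return None
        elif self.state == "awaiting_second":
            total = self.remembered + number
            self.state = "ready"
            self.remembered = None
            return total           # the output behavior

machine = AddingMachine()
machine.receive(2)                 # enters 'awaiting_second'
print(machine.receive(3))          # prints 5

# Functionalism's claim: anything realizing this same pattern -- silicon,
# vacuum tubes, neurons -- is thereby in the same functional states.
```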

Slide 49: "Functional states," cont'd. The same idea applies to mental states: the input "see that it's raining" produces the belief that it's raining; that belief, together with the desire to stay dry and the belief that there is no shelter at the bus stop, produces the output "look for car keys." Each mental state is defined by its place in this network of inputs, other states, and outputs.

Slide 50: What is... the functionalist solution to the Other Minds Problem? The functionalist solution to the chauvinism problem?

Slide 51: Strong AI. A radical (?) implication of functionalism: "Strong AI" — it is possible to build artificial minds with real mental states. A computer running the same program that your brain is running would have the same mental states that you have. It would be conscious, and thus feel pains and pleasures, have emotions, etc. It would have thoughts with real intentionality.

Slide 52: Can machines think? If they can, then the fact that functionalism predicts that they can counts in favor of the functionalist theory; if not, it counts as an objection to functionalism. The Turing test: if a machine passes the Turing test, we cannot tell that it isn't really thinking. It is a further step to say that if a machine passes the Turing test, then it is thinking — but perhaps (?) this extra step is very plausible. (Example exchange: "What do you think about Saddam's trial?" — "I don't normally approve of the death penalty, but this guy deserves everything he gets.")

Slide 53: Searle's Critique of Functionalism and Strong AI. Roger Schank's project: getting a computer to understand stories and answer questions about them the way people do. Example story: "A man went into a restaurant and ordered a hamburger. When the hamburger arrived it was burnt to a crisp, and the man stormed out of the restaurant angrily, without paying for the burger or leaving a tip." Question: Did the man eat the hamburger? Answer: No.

Slide 54: What Schank's computer is doing. It's manipulating symbols. Does this mean that it understands what the symbols mean (i.e., understands the story, and understands its replies to the questions), the way a person reading the story has a thought about a restaurant scene? (A toy illustration of script-based question answering follows.)
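An illustrative sketch only, not Schank's actual system: a toy "restaurant script" in the spirit of script-based story understanding. The function `answer_did_eat`, the event tokens, and the script itself are assumptions made for this example; the point is that the "No" answer comes from matching symbols against a canned event sequence, which is exactly the kind of processing whose status Searle questions.

```python
# Toy restaurant script: a stereotyped sequence of event symbols.
RESTAURANT_SCRIPT = ["enter", "order", "food_arrives", "eat", "pay", "leave"]

def answer_did_eat(story_events):
    """Answer 'Did the customer eat?' by comparing the story to the script.

    A normal run through the script includes 'eat'; leaving (or storming out)
    before that step breaks the default, so the inferred answer is 'No'.
    """
    if "eat" in story_events:
        return "Yes"
    if "storm_out" in story_events or "leave" in story_events:
        return "No"        # script deviation: departed before the 'eat' step
    return "Unknown"

# The burnt-hamburger story, reduced to event tokens:
story = ["enter", "order", "food_arrives", "storm_out"]
print(answer_did_eat(story))   # -> "No"
```

Whether producing the right answer this way amounts to understanding the story is precisely the question the next slides take up.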

Slide 55: Searle's argument. Understanding is an "intentional state": when you understand a story, you understand what it is about. Searle is going to argue that the computer could be manipulating symbols in all the right ways without understanding the story (i.e., without having "intentionality"). The "Chinese room argument": an argument that passing the Turing test is not sufficient for thinking.

Slide 56: The setup of the "Chinese room." [Diagram: Chinese characters are passed in as input; a person inside, following a rulebook, passes Chinese characters back out as output.]

Slide 57: Searle's argument. (P1) Neither Searle nor any other part of the Chinese Room really understands Chinese. Therefore, (C1) the Chinese Room [i.e., the system] does not understand Chinese (from P1). (P2) But the Chinese Room perfectly simulates someone who does understand Chinese. Therefore, (C2) simulating understanding is not sufficient for having understanding (from P2, C1). Therefore, (C3) even if Schank's computer perfectly simulates human understanding of stories, it does not follow that Schank's computer really understands stories.
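A hedged illustration, not part of the original lecture: the Chinese Room's rulebook rendered as a bare lookup table. The `RULEBOOK` entries and the `chinese_room` function are invented for this sketch; nothing in the procedure refers to what the symbols mean, which is the intuition Searle's (P1) trades on.

```python
# The operator matches input shapes to output shapes; the English comments
# (translations) are for the reader -- the lookup itself never uses them.
RULEBOOK = {
    "你吃过午饭了吗？": "吃过了，谢谢。",   # "Had lunch yet?" -> "Yes, thanks."
    "今天天气怎么样？": "今天很晴朗。",     # "How's the weather?" -> "Sunny today."
}

def chinese_room(input_symbols: str) -> str:
    # Pure shape-matching: look up the squiggles, copy out the reply squiggles.
    return RULEBOOK.get(input_symbols, "对不起，我不明白。")  # default: "Sorry, I don't understand."

print(chinese_room("今天天气怎么样？"))
```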

Slide 58: Searle's Account of Intentionality. Intentionality is a "causal product" of the right kind of biological system. When a computer runs the adding program (inputs 2 and 3, output 5), the adding is not a "causal product" of the symbol manipulation.

Slide 59: Searle's Account of Intentionality (analogy). Pick up a brick and throw it at a window: the window breaks. The breaking is a causal product of the brick being thrown.

Slide 60: Searle's Account of Intentionality. Intentionality cannot be created simply by symbol manipulation. Searle makes the same claims for consciousness.

Slide 61: A Problem for Searle's View. Either intentionality and consciousness are restricted to brains like ours, in which case he is committed to chauvinism; or brains quite different from ours can also produce intentionality and consciousness, in which case the Other Minds Problem looks to be unsolvable, since we can't tell which brains merely simulate consciousness and intentionality and which really have them.

Slide 62: Some questions for YOU to ponder. Can a suitably sophisticated computer which is NOT made out of "meat" like the human brain have real intentionality (and thus real thoughts)? Have real consciousness (feel real pain and pleasure, and know what it is like to experience colors and tastes)? Does it matter? If so, why?

Slide 63: Some questions for YOU to ponder. Can an extra-terrestrial with a suitably sophisticated brain that is very different from our brain have real intentionality (and thus real thoughts)? Have real consciousness (feel real pain and pleasure, and know what it is like to experience colors and tastes)? How can we KNOW whether the computer or the extra-terrestrial has REAL consciousness and REAL intentionality?

