
1 FUNCTIONALISM HTTPS://WWW.YOUTUBE.COM/WATCH?V=MWMPFJ6B--8

2 “I shall, in short, argue that pain is not a brain state, in the sense of a physical-chemical state of the brain... but another kind of state entirely. I propose the hypothesis that pain, or the state of being in pain, is a functional state of a whole organism” (Hilary Putnam, ‘Psychological Predicates’)

3 The primary objection to the identity theory is multiple realisability. In response to this, physicalists dropped type identity in favour of token identity: each token pain is identical to a token physical state, e.g. a C-fibre firing in humans, a silicon-chip state in computers, a spaghetti state in aliens

4 There’s a difference between {an apple, a burnt keyboard, John’s D-fibres firing} and {John’s C-fibres firing, Izlyr’s spaghetti state, Kryten 2X4B-523P’s silicon-chip state}. What makes C-fibre firings and silicon states pains, but burnt keyboards and D-fibre firings not?

5 Some things are defined in terms of what they’re made of, e.g. water is H2O. However, many things are defined in terms of what their functions are. E.g. the heart is something that pumps blood around the body of an organism: whether it’s made of muscle, or of metal and plastic, is irrelevant

6 The property of ‘having the function y’ is a property that can occur in many different physical things For example, ‘being an eye’ is a functional property There are lots of types of eyes that work in different ways and have different physical properties


8 FUNCTIONALISM Says that something is a mental state because it has a particular function. Mental states are functional states. Mental states are real inner states

9 CONSIDER PAIN: It is typically caused by bodily injury; it causes distress, a desire to make it go away, a belief about the location of the injury, etc.; and it typically causes wincing, bad language, nursing of the injured area, etc. Any state that plays this role is a pain

10 THREE KINDS OF RELATIONS CONSTITUTE THE ESSENTIAL FEATURES OF A MENTAL STATE: 1) typical ways the environment causes the mental state 2) typical ways the mental state interacts with other mental states 3) typical ways the mental state, in conjunction with other mental states, causes behaviour
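A minimal sketch of this idea in code (an illustration only; all the names here are invented for the example): functionalism treats pain like an interface, so anything that stands in the right causal relations counts as a pain, whatever it is physically made of.

```python
# Illustrative only: the role, not the realiser, is what matters.
from typing import Protocol

class PainRole(Protocol):
    def caused_by_bodily_injury(self) -> bool: ...
    def causes_distress_and_desire_to_remove(self) -> bool: ...
    def causes_wincing_and_nursing(self) -> bool: ...

class CFibreFiring:
    # the human realiser of the role
    def caused_by_bodily_injury(self): return True
    def causes_distress_and_desire_to_remove(self): return True
    def causes_wincing_and_nursing(self): return True

class SiliconChipState:
    # a machine realiser of the very same role
    def caused_by_bodily_injury(self): return True
    def causes_distress_and_desire_to_remove(self): return True
    def causes_wincing_and_nursing(self): return True

def is_pain(state: PainRole) -> bool:
    # the test mentions only causal relations, never physical make-up
    return (state.caused_by_bodily_injury()
            and state.causes_distress_and_desire_to_remove()
            and state.causes_wincing_and_nursing())

print(is_pain(CFibreFiring()), is_pain(SiliconChipState()))  # True True
```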

11 You don't have to be a physicalist to be a functionalist (Chalmers isn’t), since functionalism defines mental states in terms of their function, not in terms of what they’re made of. But most functionalists are physicalists

12 FUNCTIONALISM CAN BE SEEN AS A DEVELOPMENT FROM BEHAVIOURISM Two big problems with behaviourism: 1) Denies that mental states are inner states 2) Circularity problem: behavioural analysis of any mental term will implicitly invoke other mental terms

13 OBJECTION: CIRCULARITY Mental states are defined in terms of their relations to sensory input, behavioural output, and other mental states; functionalism takes into account the relationship between a mental state and other mental states. This is circular

14 Our analysis of beliefs will say something about sensory input, behavioural output, and the relation of the belief to other mental states such as desires; our analysis of desires will say something about sensory input, behavioural output, and the relation of the desire to other mental states such as beliefs. We can’t understand beliefs without invoking desires, and we can’t understand desires without invoking beliefs

15 TURING MACHINES (ALAN TURING) consist of: an infinitely long tape divided into cells; a scanner-printer (head) that reads one cell at a time, can erase what is in the cell, and write something new; a finite set of symbols that are written in the cells; and a finite set of machine states that tell the head what to do when it reads the symbol in a cell

16 HTTPS://WWW.YOUTUBE.COM/WATCH?V=6FEXD0NZAZO HTTPS://WWW.YOUTUBE.COM/WATCH?V=DNRDVLACG5Q HTTPS://WWW.YOUTUBE.COM/WATCH?V=E3KELEMWFHY HTTPS://WWW.YOUTUBE.COM/WATCH?V=FTSAIF9AHN4

17 Head reads the cell and follows the machine-state instructions; head erases the symbol; head types in a new symbol as per the instructions; head moves one place to the left (in the direction of the arrow) and continues

18 Consider a machine that takes a number and adds 1 to it. The machine's alphabet is ‘0’ and ‘1’, and we can represent numbers as collections of 1s: 1 = 1, 11 = 2, 111 = 3, etc.

19 Turing machines can compute any function for which there is an explicit, finite, step-by-step procedure

20 Following the instructions in the table, the machine has added a 1 to the 1s on the tape
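The machine table itself is not reproduced in this transcript, so the following is a reconstruction under that assumption: a one-state table for the ‘add 1’ machine, with the head starting on the rightmost cell and moving left, as slide 17 describes.

```python
# A reconstructed 'add 1' Turing machine (the slide's actual table is
# missing, so this table is an assumption). Unary numbers: n is written
# as n consecutive 1s; '0' serves as the blank symbol.

# TABLE[(state, symbol)] = (symbol_to_write, head_move, next_state)
TABLE = {
    ("scan", "1"): ("1", -1, "scan"),  # pass over the 1s, moving left
    ("scan", "0"): ("1",  0, "halt"),  # first blank: write a 1 and halt
}

def run(tape, pos, state="scan"):
    tape = list(tape)
    while state != "halt":
        write, move, state = TABLE[(state, tape[pos])]
        tape[pos] = write   # erase and write, as in slide 17
        pos += move         # move the head
    return "".join(tape)

print(run("0111", pos=3))  # '0111' represents 3; prints '1111', i.e. 4
```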

21 BLOCK’S COLA MACHINE

22 [Diagram: the cola machine’s machine table]
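A sketch of how the example usually runs (the diagram is missing from the transcript, so the details here are assumptions based on Block’s standard presentation): cola costs 10 cents, the machine accepts nickels and dimes, and it has two internal states, S1 (no credit) and S2 (5 cents credit). Each state is defined entirely by what it does with inputs, outputs, and the other state.

```python
# Block's cola machine as a machine table (reconstructed, not from the
# slide): TABLE[(state, coin)] = (output, next_state)
TABLE = {
    ("S1", "nickel"): (None,              "S2"),  # bank the nickel
    ("S1", "dime"):   ("cola",            "S1"),  # exact payment
    ("S2", "nickel"): ("cola",            "S1"),  # 5 + 5 = 10: dispense
    ("S2", "dime"):   ("cola and nickel", "S1"),  # dispense plus change
}

state = "S1"
for coin in ["nickel", "nickel", "dime"]:
    output, state = TABLE[(state, coin)]
    print(coin, "->", output)
# nickel -> None; nickel -> cola; dime -> cola
```

Being in S2 just is being the state that, given a nickel, emits a cola and goes to S1 (and so on): a purely functional characterisation that says nothing about what the machine is made of.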

23 Functionalism has a circularity problem: we can’t define mental state M1 without invoking M2, and we can't define M2 without invoking M1. The same is true of Turing Machines: the machine state of a TM is defined entirely by its relations to inputs, outputs, and other machine states. This circularity in defining the states of a TM doesn't cause any problems (it is not a vicious circle)

24 MACHINE FUNCTIONALISM (HILARY PUTNAM) The mind is a Turing Machine, and mental states are states of its machine table i.e. the brain is a computer, and the mind is a computer program The brain is the hardware, the mind is the software

25 TMs are multiply realisable: the software involved can be implemented in other kinds of hardware, and anything that runs the same programme would have the same states

26 RAMSIFICATION (DAVID LEWIS) To Ramsify a sentence, we replace all the terms and phrases that refer to mental states with variables, e.g. a, b, c. The backward E, ‘∃’, is called the existential quantifier in logic; it means ‘there is an a such that’. It is put at the front of each variable in a Ramsified sentence. This specifies the inputs, outputs, and relations between the internal states without using any mental terms

27 Since, in functionalism, a mental state is defined entirely by its relations to inputs, outputs, and other internal states, the Ramsey sentence replicates the original sentence as a specification of the mental states

28 PAIN Bodily damage and alertness cause pain; and pain causes wincing and distress; and distress causes a desire to be rid of the pain; and the desire to be rid of the pain together with the belief that nursing the damaged area will alleviate the pain causes nursing of the damaged area. Ramsifying the above: x is in pain = ∃a∃b∃c∃d∃e (bodily damage and a cause b; and b causes wincing and c; and c causes d; and d together with e causes nursing of the damaged area; and x is in b)

29 To say that Frank is in pain is to say that internal states a, b, c, d, e are related to each other, to inputs, and to outputs and that Frank is in state b
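Written out formally, the Ramsey sentence for Frank looks something like this (the predicate names, and the shorthand D, W, N for damage, wincing, and nursing, are mine rather than the slide's):

```latex
\exists a\, \exists b\, \exists c\, \exists d\, \exists e\; \big[
  \mathrm{Causes}(D \wedge a,\ b) \wedge
  \mathrm{Causes}(b,\ W \wedge c) \wedge
  \mathrm{Causes}(c,\ d) \wedge
  \mathrm{Causes}(d \wedge e,\ N) \wedge
  \mathrm{InState}(\mathrm{Frank},\ b) \big]
```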

30 Such sentences aren't much use on their own, but we can Ramsify our general talk of the mental as a whole and apply it to various platitudes found in folk psychology


32 All mental states can be analysed like this. This would allow us to define mental states simultaneously but without circularity. The network is abstract and its physical manifestation is irrelevant, so it is multiply realisable

33 Frank perceives the orange in the kitchen, and this causes him to believe that the orange is in the kitchen, and this, together with his desire to get the orange, causes him to pick up the orange, and this causes him to feel satisfaction

34 PROBLEMS WITH RAMSIFICATION The functionalist defines all mental terms at once, in terms of the whole causal network of inputs, outputs, and internal states; when asked e.g. “what is pain?” or “what is desire?”, s/he simply points to one of the nodes in that network. So if any clause of the Ramsey sentence is false, the whole sentence is false

35 For example, to say that a dog is in pain = (bodily damage and a cause b; and b causes wincing and c; and c causes d; and d together with e causes nursing of the damaged area; and the dog is in b). But e is a belief, and it's implausible to suppose that dogs have beliefs (beliefs are propositional and linguistic)

36 Since all our mental terms are defined together, and since dogs cannot have beliefs, it follows that dogs cannot experience pain. The functionalist falls victim to the same chauvinism as the type identity theorist

37 On this view, only adult humans can be in pain; babies, like dogs, do not literally have beliefs. The Ramsey sentence the functionalist uses will contain all the platitudes of folk psychology; if any of it is false, the sentence as a whole is false

38 The holistic functionalist defines mental terms simultaneously in a single Ramsey sentence. The molecular functionalist defines mental terms in independent clusters: there are many Ramsey sentences, so something could e.g. fail to have beliefs but still have emotions

39 ACTIVITIES Design a simple Turing machine Construct a complex statement containing folk psychology views about one of the following (being in love, going to the loo [careful], taking an exam) Ramsify it!

40 QUESTIONS Outline Functionalism Explain what a Turing machine is and how it can be used to support functionalism (against the circularity objection, and in support of multiple realisability)

41 CRITICISMS OF FUNCTIONALISM HTTPS://WWW.YOUTUBE.COM/WATCH?V=ZMEK1LQ_WGK

42 QUALIA Qualia are ‘phenomenal properties’: they are what give an experience its distinctive quality, e.g. ‘what it is like’ to experience redness or to smell a rose. We are aware of these properties through consciousness and introspection

43 1. Qualia, by definition, are intrinsic, non-representational properties of conscious mental states. 2. Intrinsic, non-representational properties cannot, by definition, be completely analysed in terms of their causal roles, because causal roles are relational properties, not intrinsic properties. 3. Therefore, if qualia exist, some mental properties cannot be analysed in terms of their causal roles.

44 4. Functionalism claims that all mental properties are functional properties which can be completely analysed in terms of their causal roles. 5. Therefore, if qualia exist, functionalism is false. 6. Qualia exist. 7. Therefore, functionalism is false.

45 ABSENT QUALIA The possibility of a functional duplicate with no qualia: suppose we have a complete functional description of your mental states. For each and every one of your mental states, we have an input-output analysis (Block calls this a ‘machine table’)

46 In ‘Troubles with Functionalism’, Block accuses functionalism of ‘liberalism’ - the tendency to ascribe minds to things that do not have them. He outlines two systems which could be functionally equivalent to a human being but without mental states

47 Imagine a body like yours with the head hollowed out and replaced with a set of tiny people who realise the same machine table as you. According to the functionalist account, this ‘homunculi-head’ (i.e. head full of people) would have a mind, with experiences of pain and intentional states such as beliefs and desires

48 Block argues that such a system would not be minded - there is nothing it is like to be the homunculi-head. However, no physical mechanism seems intuitively plausible as a seat of qualia, not even a brain

49 CHINESE BRAIN (NED BLOCK) Imagine the entire nation of China simulates the workings of one brain, so a single Chinese person takes the place of one neurone. Each is given a two-way radio that connects them to each other, and connects some of them to an artificial body

50 Once we get the inputs, outputs, and relations between internal states right, the whole nation of China will realize the same functional organization as a human brain

51 According to functionalism, this should create a mind; but it is very difficult to believe that there would be a ‘Chinese consciousness’. If the Chinese system replicated the state of my brain when I feel pain, would something be in pain, and if so, what?

52 The Chinese system, although it duplicates your functioning, can’t duplicate your mind, because some mental states have qualia, and qualia are not functional properties

53 This is one version of the absent qualia problem: it seems possible that there could be systems that share our functional organization but that have no qualia and no mental states whatsoever (functional zombies)

54 The claim that qualia exist can also be established by the possibility of a functional duplicate with different qualia: if two people can have states with identical functions but different phenomenal properties, we have disproved functionalism

55 INVERTED QUALIA This version of the objection is known as the ‘inverted qualia’ or ‘inverted spectrum’ thought experiment. Suppose someone has an inverted experience of colour: his vision seems to work the same way as yours, but where you see a green pigment, he sees red

56 i.e. ‘what it’s like for you to see green’ and ‘what it’s like for him to see red’ are functionally identical (both states have the same inputs (grass) and outputs (e.g. saying ‘grass is green’))

57 Although the functionalist would say that you both have the same mental states, you don’t, because his inner experiences are not identical to yours in terms of their intrinsic properties (qualia)

58 The primary response is to claim that if somebody is really functionally equivalent to you, they necessarily have the same qualia as you, so the notion of inverted colour qualia is incoherent

59 Mental states are the products of the particular physical states that constitute them; something without the same neurobiology would not be functionally equivalent

60 However, this sounds more like type identity theory, and it means that qualia are not multiply realisable. Inverted colour qualia seem to be a serious empirical possibility in pseudonormal vision (see next slide)

61 [Image: pseudonormal vision]

62 The possibility of spectrum inversion is ruled out by functionalism, yet as it is conceivable, functionalism must be false

63 It is conceivable because qualia have intrinsic qualities, regardless of how they relate to other mental states or to sensory inputs and behavioural outputs

64 If the hypothesis is both irrefutable and unconfirmable, we may be inclined to conclude that the idea of inverted qualia is nonsensical

65 We cannot make coherent sense of the supposed difference between you and me if we cannot point to anything in the world that would establish the difference

66 We can modify the thought experiment so that it is specific to functionalism: if I were born with normal vision, but had an operation that switched the neural pathways from the optic nerve to the visual cortex, my qualia would be inverted

67 I would learn the colour vocabulary and become fully functionally equivalent to you; this shows that there is more to qualia than what can be captured by a functionalist account

68 In response to this, functionalists can still claim that if we react in similar and complex ways to the same stimuli and if qualia play the same complex role in relation to other mental states and behaviour, this is all we need to be sure that we are in the same mental state

69 Some functionalists concede that in this version of the thought experiment the intrinsic physical differences would produce a different qualitative feel, and so concede that qualia cannot be given a complete functional definition

70 However, functionalists need not give up on the theory as an account of most of our mental states such as beliefs and desires

71 THE TURING TEST A test of whether an artificial intelligence could be said to have a mind (with beliefs and other intentional states): if a computer could communicate with a human being in such a way that the human being could not tell the difference between conversing with the computer and conversing with another human being, it could be said to have a mind

72 HTTPS://WWW.YOUTUBE.COM/WATCH?V=JLMOTEQMDYC Our ordinary understanding of computers’ basic operations supports the view that they have minds: we say that a computer has ‘memory’, that it processes information, uses language, calculates, obeys commands, follows rules, etc.

73 Searle opposes machine functionalism. He claims that mental states are essentially natural phenomena in the same way as other biological functions, and they need a certain neurophysiology, i.e. a living brain: organic brains are required for consciousness, so no artificial intelligence could be conscious

74 He tries to show that a computer which was functionally equivalent to a human being with respect to linguistic behaviour, and so could pass the Turing test, still wouldn’t be conscious

75 CHINESE ROOM (JOHN SEARLE) ‘It is raining’, ‘It’s raining’, ‘Es regnet’: these sentences are syntactically different but semantically the same. Syntax is about the form of symbols and the rules of grammar; semantics is about the meaning we construct from those rules

76 SEARLE’S ARGUMENT: (P1) Programs are entirely syntactical (P2) Minds have semantics (P3) Syntax is not sufficient for semantics (C) Minds are not just programs

77 You're locked in a room, with two slots to the outside world marked ‘in’ and ‘out’. In the room, there are boxes of Chinese symbols, and a rule book containing instructions. Through the in-slot, people pass you Chinese symbols (in fact, these are coherent Chinese sentences)

78 You look up the symbols in the rule book, and it tells you which symbols to give back through the out-slot; these are perfectly coherent replies

79 https://www.youtube.com/watch?v=RIoTpSfMbiM&index=2&list=PLI0dcyfCSq5CV7m9QwV4C2UKNM85yPn6o https://www.youtube.com/watch?v=lkBq7QAGMAs&index=3&list=PLI0dcyfCSq5CV7m9QwV4C2UKNM85yPn6o

80 THE CHINESE ROOM

81 The Chinese Room is a computer that simulates understanding of Chinese: the boxes of symbols are the database, the rule book is the program, and you are the hardware implementing the program (note that the inputs and outputs are the same as if there was somebody in the room who did understand Chinese)
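A toy sketch of the point (the ‘rule book’ here is a deliberately trivial stand-in for the vast one Searle imagines): the whole procedure is shape-matching on symbols, and nothing in it refers to meaning.

```python
# Pure symbol manipulation: the lookup maps input strings to output
# strings; understanding appears nowhere in the procedure.
RULE_BOOK = {
    "你好吗": "我很好",          # 'How are you?' -> 'I'm fine'
    "今天下雨吗": "没有下雨",    # 'Is it raining today?' -> 'It isn't raining'
}

def chinese_room(symbols_in: str) -> str:
    # The 'person' just matches the incoming shapes against the book
    # and copies out whatever it prescribes.
    return RULE_BOOK.get(symbols_in, "请再说一遍")  # 'Please say that again'

print(chinese_room("你好吗"))  # -> 我很好
```

From outside, the replies look like understanding; inside, it is syntax all the way down.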

82 Understanding a language requires more than manipulating symbols It needs you to understand the meaning of what you are saying or writing

83 Merely implementing the right program does not in itself generate any semantics You have only simulated understanding of Chinese, not replicated it

84 Searle’s Chinese room argument focuses on intentionality, the feature of our mental states which enables them to be about things. He distinguishes different sorts of intentionality: as-if intentionality is possessed by things like rivers as they flow towards the sea - this is how we represent the river, but it is not something possessed by the river itself

85 Intrinsic intentionality is only possessed by minds. Although they can pass the Turing test, computers do not have intrinsic intentionality; like the person in the room, a computer only has as-if intentionality

86 Note: Searle does not argue that it’s impossible for e.g. computers made of silicon chips to have minds All he’s arguing is that they can’t have minds merely in virtue of whatever programs they’re running

87 SYSTEMS RESPONSE The person in the room doesn’t understand Chinese, but the person is part of a system (symbols, rule book, etc.), and the system as a whole could pass the Turing test. We attribute understanding not to the individual (neuron) but to the entire room (brain)

88 SEARLE’S RESPONSE The person in the room doesn’t understand Chinese because s/he doesn’t possess intentionality and has no way to attach meaning to the symbols; but if one person has no way to attach meaning to the symbols, the room as a whole has no way to do this either

89 Searle suggests an extension to the thought experiment: imagine the person in the room memorizes the database and the rule book. S/he goes out and converses with people face-to-face in Chinese, but still doesn’t understand Chinese, because all s/he's doing is manipulating symbols

90 INTUITION IS UNRELIABLE The Chinese brain thought experiment appeals to our intuition that the system cannot be conscious. But our intuitions could be wrong: when we are used to talking machines, we may find our intuitions change

91 Steven Pinker asks us to imagine a race of intelligent aliens with silicon brains, who cannot believe that humans are conscious because our brains are made of meat.

92 The only basis we have for ascribing minds to others is that they behave appropriately; if we are happy to ascribe mentality to other humans, we should be happy to do so to a machine capable of behaving in the same way. Any other attitude would be chauvinistic

93 Paul & Patricia Churchland parody Searle's argument: P1) Electricity and magnetism are forces; P2) Luminance is a property of light; P3) Forces are not sufficient for luminance; C) Light is not just electromagnetism.

94 THE LUMINOUS ROOM Moving a magnet up and down quickly generates electromagnetic waves. If electromagnetic waves are in themselves sufficient for luminance, this will produce luminance; but this seems absurd, because you can't produce light merely by producing the right forces

95 SYNTAX IS OBSERVER-RELATIVE (JOHN SEARLE) Things like gravitation, mass, etc. are intrinsic features of the world that would exist whether or not there are any observers. Other things are observer-relative: e.g. its being a nice day for a picnic. The things going on in a computer only have a syntax because we assign a syntax to them

96 Consider a wall: it contains billions of molecules, all moving in various ways. For some of those molecules, there will be a pattern of movements that is functionally identical to the structure of e.g. a word processing program

97 Nobody would argue that walls are word processors, because it's impossible in practice for us to use them in that way. In practice we can assign syntactical properties to calculators and laptops, so we treat them as computers running programs; but in principle we could assign those same syntactical properties to any sufficiently large object

98 SEARLE’S VIEWS 1) Syntax is not sufficient for semantics (Chinese Room) 2) Syntax is observer-relative: nothing has syntactic properties intrinsically, so the idea that the brain is a computer and the mind a program tells us nothing about how the brain and the mind really work

99 FUNCTIONS ARE OBSERVER-RELATIVE Functionalists say that if something performs the function of a heart, whether it’s made out of muscle or metal and plastic is irrelevant. But an artificial heart is not the same as a biological heart: an artificial heart is a heart in virtue of our intentions for it, namely to pump blood around the body. We could describe a different function of the heart, e.g. to make a rhythmic sound

100 When we describe the function of something, there is always a normative judgement involved. If so, the existence of functions presupposes mentality and so cannot be used to explain it without circularity. This also raises the question of whether a defective heart is still a heart

101 The functionalist may say that what makes something a heart is that it was selected to pump blood. But appealing to natural selection only works if the trait in question was in fact selected: many biological traits were not selected for, but are by-products of traits that were selected for

102 To define something according to its functional role, we have to establish that this is what it was selected for. Something as complex as the brain will have lots of by-products; Chomsky claims that our ability to use language is one of them

103 We are in search of some ‘mysterious property’ if we insist the machine must possess supposedly intrinsic intentionality; it’s not obvious that any theory of mind is able to explain this feature of consciousness

104 We can deny that anything (computer, Chinese room, or human being) possesses intrinsic intentionality; perhaps the only kind of intentionality is the ‘as-if’ kind. This is just a way of interpreting and predicting human behaviour that is ultimately reducible to a set of causal relations


