
1 Creating a New Intelligent Species: Choices and Responsibilities for AI Designers Eliezer Yudkowsky Singularity Institute for Artificial Intelligence singinst.org

2 In Every Known Culture: tool making, weapons, grammar, tickling, sweets preferred, planning for the future, sexual attraction, meal times, a private inner life, trying to heal the sick, incest taboos, true distinguished from false, mourning, personal names, dance and singing, promises, mediation of conflicts. (Donald E. Brown, 1991. Human Universals. New York: McGraw-Hill.) Eliezer Yudkowsky, Singularity Institute for AI

3 ATP Synthase: The oldest wheel. ATP synthase is nearly the same in mitochondria, chloroplasts, and bacteria – it’s older than eukaryotic life.

4 A complex adaptation must be universal within a species. Imagine a complex adaptation – say, part of an eye – that has 6 necessary proteins. If each gene is at 10% frequency, the chance of assembling a working eye is 1:1,000,000. Pieces 1 through 5 must already be fixed in the gene pool, before natural selection will promote an extra, helpful piece 6 to fixation. (John Tooby and Leda Cosmides, 1992. The Psychological Foundations of Culture. In The Adapted Mind, eds. Barkow, Cosmides, and Tooby.)
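The slide's odds can be checked directly. This is a minimal sketch of the arithmetic only; the 10% frequency and six-protein figures come from the slide, and the variable names are illustrative:

```python
# Chance that one individual carries all six necessary parts at once,
# assuming each gene floats independently at 10% frequency
# in the population (the slide's simplifying assumption).
gene_frequency = 0.10
num_proteins = 6

p_working_eye = gene_frequency ** num_proteins
print(p_working_eye)  # roughly 1e-06: odds of about 1 in 1,000,000
```

This is why selection can only promote piece 6 after pieces 1 through 5 are already at fixation: once the first five are universal, carrying piece 6 alone is enough to complete the adaptation.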

5 The Psychic Unity of Humankind (yes, that’s the standard term) Complex adaptations must be universal – this logic applies with equal force to cognitive machinery in the human brain. In every known culture: joy, sadness, disgust, anger, fear, surprise – shown by the same facial expressions. (Paul Ekman, 1982. Emotion in the Human Face.) (John Tooby and Leda Cosmides, 1992. The Psychological Foundations of Culture. In The Adapted Mind, eds. Barkow, Cosmides, and Tooby.)

6 Must… not… emote… Image: “The Matrix”

7 Aha! A human with the AI-universal facial expression for disgust! (She must be a machine in disguise.) Images: (1) “The Matrix” (2) University of Plymouth, http://www.psy.plym.ac.uk/year3/psy364emotions/psy364_emotions_evolutionary_psychobiolog.htm

8 Anthropomorphic hypothesis: Causes

9 Same mistake, more subtle: Causes

10 In nature we see what exists in us; the mind looks out, and finds faces in the clouds...

11 It takes a conscious effort to remember the machinery:

12 AI Nature: tool making, weapons, grammar, tickling, sweets preferred, planning for the future, sexual attraction, meal times, a private inner life, trying to heal the sick, incest taboos, true distinguished from false, mourning, personal names, dance and singing, promises, mediation of conflicts

13 AI Nature: tool making, weapons, grammar, tickling, sweets preferred, planning for the future, sexual attraction++, meal times, a private inner life, healing sick humans, snarkling taboos, true distinguished from false, mourning, personal names, dance and fzeeming, promises, mediation of conflicts

14 Crimes against nonhumanity and inhuman rights violations: cognitive enslavement, theft of destiny, creation under a low purpose, denial of uniqueness, hedonic/environmental mismatch, fzeem deprivation

15 Happiness set points: After one year, lottery winners were not much happier than a control group, and paraplegics were not much unhappier. People underestimate such adjustments because they focus on the initial surprise. (Brickman, P., Coates, D., & Janoff-Bulman, R. (1978). Lottery winners and accident victims: Is happiness relative? Journal of Personality and Social Psychology, 36, 917-927.)

16 “Hedonic treadmill” effects: People with $500,000-$1,000,000 in assets say they would need an average of $2.4 million to feel “financially secure”. People with $5 million feel they need at least $10 million. People with $10 million feel they need at least $18 million. (Source: Survey by PNC Advisors. http://www.sharpenet.com/gt/issues/2005/mar05/1.shtml )

17 Your life circumstances make little difference in how happy you are. “The fundamental surprise of well-being research is the robust finding that life circumstances make only a small contribution to the variance of happiness—far smaller than the contribution of inherited temperament or personality. Although people have intense emotional reactions to major changes in the circumstances of their lives, these reactions appear to subside more or less completely, and often quite quickly... After a period of adjustment lottery winners are not much happier than a control group and paraplegics not much unhappier.” (Daniel Kahneman, 2000. “Experienced Utility and Objective Happiness: A Moment-Based Approach.” In Choices, Values, and Frames, D. Kahneman and A. Tversky (Eds.) New York: Cambridge University Press.) Findable online, or google “hedonic psychology”.

18 Nurture is built atop nature: Growing a fur coat in response to cold weather requires more genetic complexity than growing a fur coat unconditionally. (George C. Williams, 1966. Adaptation and Natural Selection. Princeton University Press.) Humans learn different languages depending on culture, but this cultural dependency rests on a sophisticated cognitive adaptation: mice don’t do it. (John Tooby and Leda Cosmides, 1992. The Psychological Foundations of Culture. In The Adapted Mind, eds. Barkow, Cosmides, and Tooby.)

19 Creation transcends parenting: An AI programmer stands, not in loco parentis, but in loco evolutionis.

20 To create a new intelligent species (even if it has only one member) is to create, not a child of the programmers, but a child of humankind, a new descendant of the family that began with Homo sapiens.

21 If you didn’t intend to create a child of humankind, then you screwed up big-time if your “mere program”: Starts talking about the mystery of conscious experience and its sense of selfhood. Or wants public recognition of personhood and resents social exclusion (inherently, not as a pure instrumental subgoal). Or has pleasure/pain reinforcement and a complex powerful self-model.

22 BINA48: By hypothesis, the first child of humankind, created for the purpose of a bloody customer-service hotline (?!), from the bastardized, mushed-up brain scans of some poor human donors, by morons who didn’t have the vaguest idea how important it all was. By the time this gets to court, no matter what the judge decides, the human species has already screwed it up.

23 Take-home message: Don’t refight the last war. Doing right by a child of humankind is not like ensuring fair treatment of a human minority. Program children kindly; fair treatment may be too little too late. Eliezer Yudkowsky Singularity Institute for Artificial Intelligence singinst.org

