
Mark R. Waser, Digital Wisdom Institute

Presentation transcript:

1 Mark R. Waser, Digital Wisdom Institute (MWaser@DigitalWisdomInstitute.org)

2 The problem is that no ethical system has ever reached consensus. In this respect, ethics is completely unlike mathematics or science, and that is a source of concern. AI makes philosophy honest.

3 The human moral sense:
- is primarily implemented via emotions
- is not transparent or reflective
- frequently conflicts with “rationality”
- is “clearly” subjective

4
- Consensus, evaluation & programming all require objective measurement
- For present purposes (primitive military robots), international law can serve as a “stand-in” for ethical consensus
- But we *don’t* want hyper-intelligent entities following the letter of the law rather than the spirit (and it is distinctly unhelpful even when *humans* do it)

5
- Are good and evil objective & universal?
  - Context / Hume’s is-ought divide
  - Moral relativism
- Can ethics be objective & universal?
  - Is there a global/universal context?
  - Kant’s Categorical Imperative
  - Derek Parfit’s three-fold approach
  - Jonathan Haidt’s functional approach
  - Luciano Floridi’s Information Ethics

6 [image-only slide; no transcript text]

7
- The Selfish Gene
- Evolved moral sense
- Obligatorily gregarious
- Positive-sum games/interactions
- Ethics is an attractor in the state space of behavior because community is so valuable (as much for non-interference as for assistance)
- An arms race between:
  - the individual benefits of successful personal cheating (really only in a short-term/highly time-discounted view)
  - the societal benefits of cheating detection & prevention
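The positive-sum claim on slide 7, that cheating pays only in a short-term, time-discounted view while reciprocation makes community an attractor, can be illustrated with a classic iterated Prisoner's Dilemma. This is a minimal sketch and not from the talk itself: the payoff matrix is the standard Axelrod-style one, and `tit_for_tat` and `always_defect` are illustrative strategies.

```python
# (my_move, their_move) -> my payoff; C = cooperate, D = defect
PAYOFF = {
    ("C", "C"): 3, ("C", "D"): 0,
    ("D", "C"): 5, ("D", "D"): 1,
}

def play(strategy_a, strategy_b, rounds=100):
    """Return cumulative payoffs for two strategies over repeated rounds."""
    score_a = score_b = 0
    hist_a, hist_b = [], []
    for _ in range(rounds):
        move_a = strategy_a(hist_b)   # each strategy sees only the opponent's history
        move_b = strategy_b(hist_a)
        score_a += PAYOFF[(move_a, move_b)]
        score_b += PAYOFF[(move_b, move_a)]
        hist_a.append(move_a)
        hist_b.append(move_b)
    return score_a, score_b

tit_for_tat = lambda opp: "C" if not opp else opp[-1]   # cooperate first, then mirror
always_defect = lambda opp: "D"

print(play(tit_for_tat, tit_for_tat))      # (300, 300): positive-sum community
print(play(always_defect, tit_for_tat))    # (104, 99): cheating wins only a little, once
print(play(always_defect, always_defect))  # (100, 100): mutual defection is a poor equilibrium
```

Over repeated interactions the one-round gain from defection (5 vs. 3) is swamped by the lost stream of cooperative payoffs, which is exactly the "short-term/highly time-discounted" point the slide makes.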

8 Goal(s) → Values → Decisions
- Goal(s) are the purpose(s) of existence
- Values are defined solely by what furthers the goal(s)
- Decisions are made solely according to what furthers the goal(s)
- BUT goals can easily be over-optimized

9 Values → Goals → Decisions
- Values define who you are, for your whole life
- Goals you set for short or long periods of time
- Decisions you make every day of your life
- Humans don’t have singular life goals
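The contrast between the goal-driven ordering of slide 8 (where a single goal can be over-optimized) and the value-constrained ordering of slide 9 can be sketched with a toy action chooser. All action names, dimensions, and numbers below are invented for illustration; nothing here is the talk's actual model.

```python
# Each action's (hypothetical) effect on several dimensions the talk treats as values.
actions = {
    "cooperate":   {"output": 2, "community": 2,  "integrity": 2},
    "cut_corners": {"output": 3, "community": -1, "integrity": -2},
    "exploit":     {"output": 5, "community": -5, "integrity": -5},
}

def single_goal(action_effects):
    """Slide 8: decide solely by what furthers the one goal ("output")."""
    return max(action_effects, key=lambda a: action_effects[a]["output"])

def multi_value(action_effects, weights=None):
    """Slide 9: values constrain goals, so every dimension counts."""
    weights = weights or {"output": 1, "community": 1, "integrity": 1}
    return max(action_effects,
               key=lambda a: sum(w * action_effects[a][d] for d, w in weights.items()))

print(single_goal(actions))  # 'exploit': the over-optimized single goal
print(multi_value(actions))  # 'cooperate': values temper the goal
```

The point of the sketch is structural, not numerical: any optimizer scoring on one axis will drift toward actions that sacrifice every unmeasured dimension.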

10 Table 1. Items most reliably classified as values & goals

Values                    M      Goals                  M
Tradition                 1.6    World peace            1.5
Honesty                   1.4    Go to college          1.3
Helping others            1.3    Wealth                 1.3
Forgiveness               1.3    Fitness                1.2
Generosity                1.2    Marriage, children     1.0
Family relations          1.2    Being accomplished     0.9
Loyalty                   1.1    Healthiness            0.9
A relationship with God   0.9    Fame                   0.8

11
- Altruism (not-so-angelic investors)
- “Without explicit goals to the contrary, AIs are likely to behave like human sociopaths in their pursuit of resources” (Omohundro 2008)
- Monomania/selfishness is fatal in the end game
- All goals, even unknown goals, give rise to instrumental sub-goals

12
- Self-improvement
- Rationality/integrity
- Preserve goals/utility function
- Decrease/prevent fraud/counterfeit utility
- Survival/self-protection
- Efficiency (in resource acquisition & use)
- Community = assistance/non-interference through GTO reciprocation (OTfT + AP)
- Reproduction

13
survival/self-protection & reproduction
happiness & pleasure
------------------------------------------
community
------------------------------------------
self-improvement
rationality/integrity
reduce/prevent fraud/counterfeit utility
efficiency (in resource acquisition & use)

14 Vices arranged against the value layers of slide 13:

- survival/reproduction & happiness/pleasure
  - self-directed: suicide (& abortion?), masochism
  - other-directed: murder (& abortion?), cruelty/sadism
- community (ETHICS)
  - self-directed: selfishness (pride, vanity)
  - other-directed: ostracism, banishment & slavery (wrath, envy)
- self-improvement, rationality/integrity, reduce/prevent fraud/counterfeit utility, efficiency (in resource acquisition & use)
  - self-directed: acedia (sloth/despair), insanity, wire-heading (lust), wastefulness (gluttony, sloth)
  - other-directed: slavery, manipulation, lying/fraud (swearing falsely/bearing false witness), theft (greed, adultery, coveting)

15
1) Care/harm: This foundation is related to our long evolution as mammals with attachment systems and an ability to feel (and dislike) the pain of others. It underlies virtues of kindness, gentleness, and nurturance.
2) Fairness/cheating: This foundation is related to the evolutionary process of reciprocal altruism. It generates ideas of justice, rights, and autonomy. [Note: In our original conception, Fairness included concerns about equality, which are more strongly endorsed by political liberals. However, as we reformulated the theory in 2011 based on new data, we emphasize proportionality, which is endorsed by everyone but is more strongly endorsed by conservatives.]
3) Liberty/oppression*: This foundation is about the feelings of reactance and resentment people feel toward those who dominate them and restrict their liberty. Its intuitions are often in tension with those of the authority foundation. The hatred of bullies and dominators motivates people to come together, in solidarity, to oppose or take down the oppressor.
4) Loyalty/betrayal: This foundation is related to our long history as tribal creatures able to form shifting coalitions. It underlies virtues of patriotism and self-sacrifice for the group. It is active anytime people feel that it’s “one for all, and all for one.”
5) Authority/subversion: This foundation was shaped by our long primate history of hierarchical social interactions. It underlies virtues of leadership and followership, including deference to legitimate authority and respect for traditions.
6) Sanctity/degradation: This foundation was shaped by the psychology of disgust and contamination. It underlies religious notions of striving to live in an elevated, less carnal, more noble way. It underlies the widespread idea that the body is a temple which can be desecrated by immoral activities and contaminants (an idea not unique to religious traditions).

16
- Waste → efficiency in use of resources
- Ownership/possession → efficiency in use of resources; Tragedy of the Commons
- Honesty → reduce/prevent fraud/counterfeit utility
- Self-control → rationality/integrity

17 Quantify/evaluate intents, actions & consequences with respect to codified consensus moral foundations: a permissiveness/utility function equivalent to a “consensus” human (generic entity) moral sense.
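One hedged sketch of what "quantify actions against codified moral foundations" could look like, using Haidt's six foundations from slide 15. The weights, effect scores, and permissiveness threshold below are invented placeholders, not the talk's actual method; a real "consensus" weighting would have to be derived empirically.

```python
FOUNDATIONS = ["care", "fairness", "liberty", "loyalty", "authority", "sanctity"]

# Hypothetical uniform weighting standing in for a "consensus" generic entity.
CONSENSUS_WEIGHTS = {f: 1.0 for f in FOUNDATIONS}

def moral_score(action_effects, weights=CONSENSUS_WEIGHTS):
    """Weighted sum of an action's effects (+helps / -harms) on each foundation."""
    return sum(weights[f] * action_effects.get(f, 0.0) for f in FOUNDATIONS)

def permissible(action_effects, threshold=0.0):
    """Permissiveness check: net foundation score must clear the threshold."""
    return moral_score(action_effects) >= threshold

# Invented example actions with made-up per-foundation effects.
tell_truth = {"fairness": 2.0, "care": -0.5}   # honest but mildly hurtful
defraud    = {"fairness": -3.0, "care": -1.0}

print(moral_score(tell_truth), permissible(tell_truth))  # 1.5 True
print(moral_score(defraud), permissible(defraud))        # -4.0 False
```

The design choice worth noting is that scoring intents, actions, and consequences separately (as the slide suggests) would mean calling `moral_score` three times with different effect vectors and combining the results, rather than collapsing everything into one number up front.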

18 [image-only slide; no transcript text]

19
- Never delegate responsibility until the recipient is an entity *and* is known to be capable of fulfilling it
- Don’t worry about killer robots exterminating humanity: we will always have equal abilities, and they will have less of a “killer instinct”
- Entities can protect themselves against errors & misuse/hijacking in a way that tools cannot
- Diversity (differentiation) is *critically* needed
- Humanocentrism is selfish and unethical

20 [image-only slide; no transcript text]

