AI and NLP. German Rigau i Claramunt, IXA group, Departamento de Lenguajes y Sistemas Informáticos, UPV/EHU.


1 AI and NLP German Rigau i Claramunt IXA group Departamento de Lenguajes y Sistemas Informáticos UPV/EHU

2 AI and NLP


5  Current AI challenges  Natural Language Understanding  Image/video Understanding  Process/agents/services Understanding  Database Understanding  Web Understanding ...

6 AI and NLP


10  A Machine Translation tale: the proverb below, said to have come back from a round trip through machine translation  The spirit is willing, but the flesh is weak  The vodka is strong, but the meat is rotten

11 AI and NLP


14 CIRCA Technology Ontology with millions of words, meanings, and relationships Created by a team of experienced lexicographers and computational linguists Automatic expansion coupled with manual quality supervision Includes slang, pop, and nonstandard usages Language independent Ranks among the most comprehensive lexicons in the world

15 AI and NLP


18 EFE scenario

19 AI and NLP News Article 10
TOPIC = TERRORISMO
CONTEXT = Sigue la violencia en Colombia y especialmente en Medellín. (Violence continues in Colombia, especially in Medellín.)
GOAL = Un entierro en Medellín. (A burial in Medellín.)
QUERY = entierro medellín
TEXT = sepelio medellín
RESULT = FH_
MEANING: WP8 User validation

20 AI and NLP MEANING: WP8 User validation: CLIR


22 AI and NLP

23 AI and NLP Setting  From Cyc  Fred saw the plane flying over Zurich.  Fred saw the mountains flying over Zurich.

24 AI and NLP Setting  From NLP to NLU  Large-scale Semantic Processing dealing with concepts (senses) rather than words  Two complementary problems:  Acquisition bottleneck  Autonomous large-scale knowledge acquisition systems  Ambiguity  Highly accurate and robust semantic systems

25 AI and NLP  Which Knowledge is needed by a concrete NLP system?  Where is this Knowledge located?  Which automatic procedures can be applied? AI and NLP Introduction

26 AI and NLP  Which Knowledge is needed by a concrete NLP system?  Phonology: phonemes, stress, etc.  Morphology: POS, etc.  Syntactic: category, subcat., etc.  Semantic: class, SRs, etc.  Pragmatic: usage, registers, TDs, etc.  Translations: translation links AI and NLP Introduction

27 AI and NLP  Where is this Knowledge located?  Human brain  Structured Lexical Resources:  Monolingual and bilingual MRDs  Thesauri  Unstructured Lexical Resources:  Monolingual and bilingual Corpora  Mixing resources AI and NLP Introduction

28 AI and NLP  Which automatic procedures can be applied?  Prescriptive approach  Machine-aided manual construction  Descriptive approach  Automatic acquisition from pre-existing Lexical Resources  Mixed approach AI and NLP Introduction

29 AI and NLP MRDs and Semantic Knowledge
jardín_1_1 (garden) Terreno donde se cultivan plantas y flores ornamentales.
florero_1_4 (flower pot) Maceta con flores.
ramo_1_3 (bouquet) Conjunto natural o artificial de flores, ramas o hierbas.
pétalo_1_1 (petal) Hoja que forma la corola de la flor.
tálamo_1_3 (receptacle) Receptáculo de la flor.
miel_1_1 (honey) Substancia viscosa y muy dulce que elaboran las abejas, en una distensión del esófago, con el jugo de las flores y luego depositan en las celdillas de sus panales.
florería_1_1 (flower shop) Floristería; tienda o puesto donde se venden flores.
florista_1_1 (florist) Persona que tiene por oficio hacer o vender flores.
camelia_1_1 (camellia) Arbusto cameliáceo de jardín, originario de Oriente, de hojas perennes y lustrosas, y flores grandes, blancas, rojas o rosadas (Camellia japonica).
camelia_1_2 (camellia flower) Flor de este arbusto.
rosa_1_1 (rose) Flor del rosal.

30 AI and NLP Introduction  Dealing with ambiguity  Morphosyntactic: Yo bajo con el hombre bajo a tocar el bajo bajo la escalera. ("I go down with the short man to play the bass under the stairs": four readings of "bajo")  Syntactic: Zapatos de piel de señora ("women's leather shoes": ambiguous attachment)  Semantic: Baja al banco y espérame allí ("go down to the bank/bench and wait for me there": "banco" is ambiguous); Se dieron pasteles a los niños ("cakes were given to the children") ... Basic NLP levels

31 AI and NLP Morphological Analysis  Setting  Systems  Morpholexical relationships (Octavio Santana)  Freeling (Lluís Padró) ...

32 AI and NLP Morphological Analysis  Morphology deals with the orthographic form of words  Morphological processes  Inflection: prefixes + root + suffixes (root, lemma, form)  Derivation: change of category  Multi-word expressions: compounds, idioms, phrasal verbs,...  Grammatical categories, parts-of-speech  Open categories and closed (functional) categories  Lexicon  POS tags
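The inflection process sketched on this slide (prefixes + root + suffixes, mapping a surface form back to a lemma) can be illustrated with a toy suffix-stripping analyzer. The suffix rules and tags below are hypothetical illustrations, not Freeling's actual rules, and real morphological analyzers use full paradigm lexicons rather than string surgery.

```python
# Toy suffix-stripping morphological analyzer for English.
# SUFFIX_RULES are illustrative inventions: (suffix, replacement, tag).
SUFFIX_RULES = [
    ("ies", "y", "NNS"),   # ponies -> pony
    ("es", "", "NNS"),     # boxes -> box
    ("s", "", "NNS"),      # cats -> cat
    ("ing", "", "VBG"),    # walking -> walk
    ("ed", "", "VBD"),     # walked -> walk
]

def analyze(form):
    """Return (lemma, tag) guesses for an inflected word form.

    The base form is always a candidate; each matching suffix rule
    adds another (ambiguity is kept, not resolved).
    """
    analyses = [(form, "BASE")]
    for suffix, replacement, tag in SUFFIX_RULES:
        if form.endswith(suffix) and len(form) > len(suffix) + 1:
            analyses.append((form[: -len(suffix)] + replacement, tag))
    return analyses
```

Note that the analyzer returns all candidate analyses: disambiguating among them is the job of a POS tagger, which is exactly the open/closed-category machinery the slide lists.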

33 AI and NLP Morphological Analysis  Main Parts-of-Speech  Open class words  Noun: common noun, proper noun (gender, number,...)  Adjective: attributive, comparative...  Verb: (number, person, mode, tense), auxiliary verbs  Adverb: place, time, manner, degree,...  Closed class words  Pronoun: nominative, accusative,... (anaphora)  Determiner: articles, demonstratives, quantifiers...  Preposition  Conjunction



36 Named Entity Recognition and Classification  Setting  CONLL'02, CONLL'03 shared tasks  Systems  Cognitive Computation Group (Dan Roth)  Freeling ...

37 AI and NLP Named Entity Recognition and Classification (NERC) Setting  Named Entity Recognition (NER) is a subtask of Information Extraction. Different NER systems were evaluated as part of the Sixth Message Understanding Conference in 1995 (MUC6)  Named entities are phrases that contain the names of persons, organizations, locations, times and quantities. [ORG U.N. ] official [PER Ekeus ] heads for [LOC Baghdad ].
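The bracketed annotation style on this slide can be reproduced with a minimal gazetteer lookup: a sketch only, since CoNLL-style systems learn entity tags from annotated corpora rather than from a hand-written list. The gazetteer entries are taken from the slide's own example.

```python
# Minimal gazetteer-based NER sketch producing CoNLL-style bracketed output.
# The gazetteer is a toy stand-in for learned entity models.
GAZETTEER = {"U.N.": "ORG", "Ekeus": "PER", "Baghdad": "LOC"}

def tag_entities(tokens):
    """Wrap known entity tokens in [TYPE token ] brackets."""
    out = []
    for tok in tokens:
        label = GAZETTEER.get(tok)
        out.append(f"[{label} {tok} ]" if label else tok)
    return " ".join(out)
```

Running it on the slide's sentence reconstructs the annotation shown above; the hard part a real NERC system solves is tagging names it has never seen, which a static gazetteer cannot do.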

38 AI and NLP Parsing (Syntactic Analysis)  Setting  PARSEVAL  Systems  RASP (John Carroll & Ted Briscoe)  Minipar (Dekang Lin)  VISL (Eckhard Bick)  Freeling ...

39 AI and NLP Parsing (Syntactic Analysis)  Syntax and grammar  Phrase structure  Word order  Syntagma, phrase, constituent  NP, VP, AP, head, relative clause  Grammars  Syntax vs. lexicon  Coverage: complete, partial...  Chunking, clausing,...  Context-free grammars  Terminals, non-terminals, parse trees, recursion  Non-local dependencies The woman who found the wallet was given a reward
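The context-free machinery this slide lists (terminals, non-terminals, parse trees) can be sketched with a tiny CKY recognizer over a grammar in Chomsky normal form. The grammar and lexicon below are toy inventions covering only the slide's example sentence; wide-coverage parsers such as RASP or Minipar use vastly richer grammars plus statistics.

```python
from collections import defaultdict

# Toy CKY recognizer for a context-free grammar in Chomsky normal form.
# LEXICON: word -> preterminal categories; BINARY: (B, C) -> heads A of A -> B C.
LEXICON = {
    "the": {"Det"}, "woman": {"N"}, "wallet": {"N"},
    "found": {"V"},
}
BINARY = {
    ("Det", "N"): {"NP"},
    ("V", "NP"): {"VP"},
    ("NP", "VP"): {"S"},
}

def cky_recognize(tokens, start="S"):
    """Return True iff the grammar derives the token sequence from `start`."""
    n = len(tokens)
    chart = defaultdict(set)          # chart[(i, j)] = categories spanning tokens[i:j]
    for i, tok in enumerate(tokens):
        chart[(i, i + 1)] = set(LEXICON.get(tok, ()))
    for span in range(2, n + 1):      # fill longer spans from shorter ones
        for i in range(n - span + 1):
            j = i + span
            for k in range(i + 1, j):
                for b in chart[(i, k)]:
                    for c in chart[(k, j)]:
                        chart[(i, j)] |= BINARY.get((b, c), set())
    return start in chart[(0, n)]
```

The chart-filling order (all shorter spans before longer ones) is what makes CKY O(n^3): each cell combines two already-complete sub-spans, the "constituent" notion on the slide.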

40 AI and NLP Word Sense Disambiguation  Setting  WSD Tutorial (Ted Pedersen & Rada Mihalcea 05)  SENSEVAL 1, 2 and 3...  Systems  Knowledge-based WSD  Conceptual Distance (Ted Pedersen)  Structural Semantic Interconnections (Roberto Navigli)  Corpus-based WSD  GAMBL (Walter Daelemans) ...

41 AI and NLP Word Sense Disambiguation Setting  WSD is the problem of assigning the appropriate meaning (sense) to a given word in a text  "WSD is perhaps the great open problem at the lexical level of NLP" (Resnik & Yarowsky 97)  WSD resolution would allow:  acquisition of knowledge: SCF, Selectional Preferences, Predicate Models, etc.  improvement of existing Parsing, IR, IE  Machine Translation  Natural Language Understanding ...


43 AI and NLP  From the Financial Times: "GM's drive to make Saturn a star again" Word Sense Disambiguation Setting

44 AI and NLP  From the Financial Times: "GM's drive to make Saturn a star again" car manufacturer, car maker, carmaker_1, auto manufacturer, auto maker, automaker -- a business engaged in the manufacture of automobiles campaign, cause, crusade, drive_3, movement, effort -- a series of actions advancing a principle or tending toward a particular end car_1, auto, automobile, machine, motorcar -- 4-wheeled motor vehicle; usually propelled by an internal combustion engine; "he needs a car to get to work" star_5, principal, lead -- an actor who plays a principal role star_1 -- (astronomy) a celestial body of hot gases that radiates energy derived from thermonuclear reactions in the interior Word Sense Disambiguation Setting

45 AI and NLP Word Sense Disambiguation Setting  Knowledge-Driven WSD  knowledge-based WSD  No Training Process (~ unsupervised)  Large-scale lexical knowledge resources  WordNet, MRDs, Thesauri,...  100% coverage  55% accuracy (SensEval)
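A classic instance of knowledge-driven WSD with no training process is the simplified Lesk algorithm: choose the sense whose dictionary gloss overlaps most with the surrounding context. The two "bank" senses below are hypothetical stand-ins for WordNet glosses, used only to make the sketch self-contained.

```python
# Simplified Lesk word-sense disambiguation: score each sense by the
# number of words its gloss shares with the context. Toy sense inventory.
SENSES = {
    "bank": {
        "bank#1": "financial institution that accepts deposits and lends money",
        "bank#2": "sloping land beside a body of water such as a river",
    }
}

def lesk(word, context):
    """Pick the sense of `word` whose gloss best overlaps the context."""
    context_words = set(context.lower().split())
    best, best_overlap = None, -1
    for sense, gloss in SENSES[word].items():
        overlap = len(context_words & set(gloss.split()))
        if overlap > best_overlap:
            best, best_overlap = sense, overlap
    return best
```

This needs no annotated data, only the lexical resource, which is why it has full coverage; its moderate accuracy (the slide's ~55% SensEval figure is for knowledge-based systems generally) comes from glosses being short and overlaps sparse.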

46 AI and NLP Word Sense Disambiguation Setting  Corpus-Driven WSD  statistical-based WSD  Machine-Learning WSD  Training Process (~ supervised)  learning from sense-annotated corpora  (Ng 97): an estimated effort of 16 person-years per language  no full coverage  70% accuracy (SensEval)
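The corpus-driven alternative can be sketched as a bag-of-words Naive-Bayes-style classifier trained on sense-annotated examples. The four hand-made training sentences below are invented stand-ins for real annotated data such as SemCor; the point is the supervised training step, which is exactly what makes the Ng (97) annotation cost estimate bite.

```python
from collections import Counter, defaultdict

# Toy corpus-driven WSD: per-sense word counts from sense-annotated
# training data, then a smoothed product-of-probabilities score.
TRAIN = [
    ("bank#1", "open an account at the bank"),
    ("bank#1", "the bank raised interest rates"),
    ("bank#2", "fishing from the bank of the river"),
    ("bank#2", "the grassy bank of the stream"),
]

counts = defaultdict(Counter)
for sense, text in TRAIN:
    counts[sense].update(text.split())

def classify(context):
    """Return the sense whose training counts best explain the context."""
    words = context.split()
    def score(sense):
        total = sum(counts[sense].values())
        s = 1.0
        for w in words:
            # add-one smoothing so unseen words don't zero out the score
            s *= (counts[sense][w] + 1) / (total + len(words))
        return s
    return max(counts, key=score)
```

Unlike Lesk, this only covers words for which annotated examples exist: the "no full coverage" point on the slide.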

47 AI and NLP Semantic Role Labeling  Setting  SRL Tutorial (Lluís Màrquez 05)  CONLL'05 shared task  Systems  Knowledge-based approaches  Corpus-based approaches  Memory-based Shallow Parser (Walter Daelemans)  Cognitive Computation Group (Dan Roth)  Center for Spoken Language Research (Colorado) ...

48 AI and NLP Semantic Role Labeling Setting  SRL is the problem of recognizing and labeling the semantic roles of a predicate  A semantic role in language is the relationship that a syntactic constituent has with a predicate.  Typical semantic arguments include:  Agent, Patient, Instrument, etc.  and also adjunctive arguments:  Locative, Temporal, Manner, Cause, etc.  Useful for answering "Who", "When", "What", "Where", "Why", etc.  IE, QA, Summarization and Semantic Interpretation

49 AI and NLP Semantic Role Labeling Setting  From PropBank [A0 He ] [AM-MOD would ] [AM-NEG n't ] [V accept ] [A1 anything of value ] from [A2 those he was writing about ].  Roleset  V: verb  A0: acceptor  A1: thing accepted  A2: accepted-from  A3: attribute  AM-MOD: modal  AM-NEG: negation
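The roleset idea above (each predicate carries its own mapping from arguments to numbered roles) can be sketched over already-parsed subject/verb/object triples. The mapping below is hand-written from the slide's "accept" roleset; real CoNLL-2005 systems learn such assignments from PropBank rather than from rules, so this is only an illustration of the data structure.

```python
# Toy PropBank-style role labeling over pre-parsed (subject, verb, object)
# triples, using a hand-written roleset taken from the slide's example.
ROLESETS = {
    "accept": {"subject": "A0 (acceptor)", "object": "A1 (thing accepted)"},
}

def label_roles(subject, verb, obj):
    """Map grammatical functions to the predicate's numbered roles."""
    rs = ROLESETS.get(verb, {})
    return {
        rs.get("subject", "ARG?"): subject,
        "V": verb,
        rs.get("object", "ARG?"): obj,
    }
```

The key point the sketch preserves is that A0/A1 are predicate-specific: "A0" means "acceptor" only under accept's roleset, which is why SRL needs the roleset lexicon and not just the parse.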

50 AI and NLP German was hungry. He opened the refrigerator.  hungry (feeling a need or desire to eat)  eat (take in solid food)  refrigerator (an appliance in which foods can be stored at low temperature) WordNet Text Inferences

51 AI and NLP WordNet & EuroWordNet WordNet  Princeton University (Miller et al. 1990, Fellbaum 98)  Lexicalised concepts (words, compounds, multiwords)  Synset: synonym set (of words)  Large semantic net connecting synsets  synonymy, antonymy, hyperonymy, hyponymy  meronymy, implication, causation...  Structure  Noun hierarchy depth ~12  Verb hierarchy depth ~3  Adjective/adverb not in hierarchy, but in star structure  Freely available  Extensively used in CompLing, but not very useful yet  Except: definitions converted to axioms and used for theorem proving in automated QA (Moldovan et al.), CLIR (Meaning)

52 AI and NLP  Synonymy  Between senses (or variants) in synsets  Weak notion of synonymy: from context  Synset: set of words expressing the same concept in a particular context (SemCor)  Hyperonymy / Hyponymy  Class/subclass relation  {lion} -> {feline} WordNet & EuroWordNet WordNet Semantic Relations
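The class/subclass relation above is transitive: {lion} -> {feline} and {feline} -> {carnivore} entail that a lion is a carnivore. A minimal sketch, with a toy hypernym chain standing in for WordNet's noun hierarchy:

```python
# Toy hypernymy graph: each concept maps to its direct hypernym.
# The entries are illustrative, not actual WordNet synsets.
HYPERNYM = {"lion": "feline", "feline": "carnivore", "carnivore": "animal"}

def isa(concept, ancestor):
    """Transitive is-a check: walk the hypernym chain upward."""
    while concept in HYPERNYM:
        concept = HYPERNYM[concept]
        if concept == ancestor:
            return True
    return False
```

This upward walk is the basic operation behind most WordNet-based inference, including the Base Concept selection and conceptual-distance measures discussed later in the deck.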

53 AI and NLP  Meronymy relations  Part/component  {hand}  {arm}  Element of a collectivity  {person}  {people}  Substance  {newspaper}  {paper} WordNet & EuroWordNet WordNet Semantic Relations

54 AI and NLP  Antonymy  {big}  {small}  Causation  {kill}  {die}  Implication  {divorce}  {marry}  Derivation  {presidential}  {president}  Similarity  {good}  {positive} WordNet & EuroWordNet WordNet Semantic Relations

55 AI and NLP WordNet & EuroWordNet WordNet example

56 AI and NLP WordNet & EuroWordNet WordNet volumes

Version | nouns | verbs | adj | adv | synsets | relations
WN1.5 | 51,253 | 8,847 | 13,460 | 3,145 | 76,705 | 103,445
WN1.6 | 66,025 | 12,127 | 17,915 | 3,575 | 99,642 | 138,741
WN1.7 | 74,488 | 12,754 | 18,523 | 3,612 | 109,377 | 151,546
XWN | 74,488 | 12,754 | 18,523 | 3,612 | 109,377 | 551,551
WN1.7.1 | 75,804 | 13,214 | 18,576 | 3,629 | 111,223 | 153,781
WN2.0 | 79,689 | 13,508 | 18,563 | 3,664 | 115,424 | 204,074

57 AI and NLP  Project LE, EU Telematics Applications Programme  Integrated local wordnets in several languages:  English: Sheffield  Dutch: Amsterdam  Italian: Pisa  Spanish: UB, UPC, UNED  Computers and the Humanities (Vossen 98)  WordNet & EuroWordNet EuroWordNet


59 SUMO Reasoning with Sigma (Vampire)
(forall (?X) (or (not (instance ?X FloweringPlant)) (not (instance ?X BodyPart)))) YES.
(exists (?X) (and (instance ?X FloweringPlant) (instance ?X BodyPart))) NO.
(forall (?Y) (forall (?X) (=> (equal ?X ?Y) (equal ?X ?Y)))) YES.
(forall (?X) (=> (instance ?X Flower) (instance ?X Organ))) YES.
(instance Ear FloweringPlant) NO.
(instance Ear Organ) NO.
(subclass Ear Organ) YES.
(forall (?X) (=> (not (subclass ?X Organ)) (subclass ?X Flower))) NO.
(forall (?X) (=> (subclass ?X Flower) (subclass ?X Organ))) YES.

60 AI and NLP A.1) First labelling: Conceptual Distance (Agirre et al. 94)  length of the shortest path  specificity of the concepts  using WordNet  Bilingual dictionary Merge approach: Taxonomy Construction Step 1: Selection of the main top beginners
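The Conceptual Distance measure above combines the length of the shortest path between concepts with their specificity. The path-length part can be sketched as a breadth-first search over an undirected view of a toy taxonomy (the specificity weighting and the real WordNet graph of Agirre et al. 94 are omitted; the taxonomy entries are invented for illustration).

```python
from collections import defaultdict, deque

# Toy taxonomy: concept -> list of hypernyms (illustrative data only).
TAXONOMY = {
    "lion": ["feline"], "cat": ["feline"],
    "dog": ["canine"],
    "feline": ["carnivore"], "canine": ["carnivore"],
    "carnivore": ["animal"], "animal": [],
}

def path_distance(a, b):
    """Length of the shortest path between two concepts over ISA links,
    moving freely up or down the hierarchy (BFS)."""
    edges = defaultdict(set)
    for child, parents in TAXONOMY.items():
        for p in parents:
            edges[child].add(p)
            edges[p].add(child)
    seen, queue = {a}, deque([(a, 0)])
    while queue:
        node, dist = queue.popleft()
        if node == b:
            return dist
        for nxt in edges[node]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, dist + 1))
    return None  # no path: concepts are unconnected
```

Close siblings (lion/cat, distance 2) score as more similar than concepts whose path crosses a higher common ancestor (lion/dog, distance 4), which is what lets the measure attach an ambiguous dictionary sense to the nearest WordNet region.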

61 AI and NLP abadía_1_2 Iglesia o monasterio regido por un abad o abadesa (abbey, a church or a monastery ruled by an abbot or an abbess) Merge approach: Taxonomy Construction Step 1: Selection of the main top beginners

62 AI and NLP 06 ARTIFACT abadía_1_2 Iglesia o monasterio regido por un abad o abadesa (abbey, a church or a monastery ruled by an abbot or an abbess) Merge approach: Taxonomy Construction Step 1: Selection of the main top beginners

63 AI and NLP A.1) First labelling (Results)  29,205 labelled definitions (31% coverage)  61% accuracy at a sense level  64% accuracy at a file level Merge approach: Taxonomy Construction Step 1: Selection of the main top beginners

64 AI and NLP A.2) Second labelling: Salient Words (Yarowsky 92)  Importance  local frequency  appears significantly more often in the corpus of a semantic category than at other points in the whole corpus Merge approach: Taxonomy Construction Step 1: Selection of the main top beginners
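The salience criterion above (a word counts as salient for a category when its relative frequency there is markedly higher than in the whole corpus) can be sketched with a simple frequency-ratio test. The category documents and the threshold are invented for illustration; Yarowsky (92) uses a proper statistical salience measure over much larger corpora.

```python
from collections import Counter

# Toy category corpora (hypothetical), echoing the slide's ARTIFACT/FOOD
# labels for the two senses of "biberón".
CATEGORY_DOCS = {
    "ARTIFACT": ["glass flask bottle container", "wooden box container"],
    "FOOD": ["milk bread cheese glass", "milk butter"],
}

def salient_words(category, threshold=1.5):
    """Words whose relative frequency in the category's corpus exceeds
    their relative frequency in the whole corpus by `threshold`."""
    cat_counts = Counter(w for d in CATEGORY_DOCS[category] for w in d.split())
    all_counts = Counter(w for docs in CATEGORY_DOCS.values()
                         for d in docs for w in d.split())
    cat_total = sum(cat_counts.values())
    all_total = sum(all_counts.values())
    return {w for w, c in cat_counts.items()
            if (c / cat_total) / (all_counts[w] / all_total) >= threshold}
```

A word like "glass" that appears across categories fails the ratio test, while category-concentrated words like "milk" or "container" pass: exactly the behaviour that lets the second labelling pass reach much higher coverage than the first.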

65 AI and NLP  A.2) Second labelling (Results):  86,759 labelled definitions (93%)  80% accuracy at a file level biberón_1_1 ARTIFACT Frasco de cristal... (glass flask...) biberón_1_2 FOOD Leche que contiene este frasco... (milk contained in that flask...) Merge approach: Taxonomy Construction Step 1: Selection of the main top beginners

66 AI and NLP Merge approach: Mapping Taxonomies Step 3: Creation of translation links (diagram: nodes C1-C6 linked across the two taxonomies)


68 AI and NLP SUMO Introduction  The Suggested Upper Merged Ontology (SUMO)  IEEE Standard Upper Ontology Working Group  An upper ontology is limited to concepts that are meta, generic, abstract and general enough to address a broad range of domain areas.  To promote:  Interoperability  Information search and retrieval  Automated inference  NLP  Development of domain ontologies

69 AI and NLP MEANING: MCR
vaso_ n 06-NOUN.ARTIFACT FACTOTUM
GLOSS: a glass container for holding liquids while drinking
TO: 1stOrderEntity-Form-Object
TO: 1stOrderEntity-Origin-Artifact
TO: 1stOrderEntity-Function-Container
TO: 1stOrderEntity-Function-Instrument
EN: drinking_glass, glass
IT: bicchiere
BA: edontzi, baso, edalontzi
CA: got, vas
DOBJ (SemCor): polish/shine/smooth, beautify/embellish/prettify, get_hold_of/take, ameliorate/amend, alter/change

70 AI and NLP MEANING: MCR
vaso_ n 08-NOUN.BODY ANATOMY
GLOSS: a tube in which a body fluid circulates
TO: 1stOrderEntity-Form-Substance-Solid
TO: 1stOrderEntity-Origin-Natural-Living
TO: 1stOrderEntity-Composition-Part
TO: 1stOrderEntity-Function-Container
EN: vessel, vas
IT: vaso, canale
BA: hodi, baso
CA: vas
DOBJ/SUBJ (SemCor): be/occur, stop/terminate, inject/shoot, flow/travel_along, flow/travel_along, discontinue, administer/dispense, cease/end/finish...

71 AI and NLP MEANING: MCR
vaso_ n 23-NOUN.QUANTITY NUMBER
GLOSS: the quantity a glass will hold
TO: 1stOrderEntity-Composition-Part
TO: 2ndOrderEntity-SituationType-Static
TO: 2ndOrderEntity-SituationComponent-Quantity
EN: glassful, glass
IT: bicchierata, bicchiere
BA: basokada
CA: got, vas
DOBJ (SemCor): drink/imbibe, accept/have/take, consume/have/ingest/take/take_in, acquire/get

72 AI and NLP MEANING: MCR


75 AI and NLP MEANING: MCR1
vaso_ n 06-NOUN.ARTIFACT FACTOTUM
SUMO: &%Artifact+
LOGICAL FORMULA: glass:NN(x1) -> glass:NN(x1) container:NN(x2) for:IN(x1, e1) hold:VB(e1, x1, x3) liquid:NN(x3) while:IN(e0, e2) drink:VB(e2, x1)
PARSING: (TOP (S (NP (NN glass) ) (VP (VBZ is) (NP (NP (DT a) (NN glass) (NN container) ) (PP (IN for) (S (VP (VBG holding) (PP (NP (NNS liquids) ) (IN while) ) (VBG drinking) ) ) ) ) ) (..) ) )
WSD: a glass container for holding liquids while drinking

76 AI and NLP MEANING: MCR1
vaso_ n 08-NOUN.BODY ANATOMY
SUMO: &%BodyVessel+
LOGICAL FORMULA: vessel:NN(x1) -> tube:NN(x1) in:IN(x2, x3) body_fluid:NN(x2) circulate:VB(e1, x2)
PARSING: (TOP (S (NP (NN vessel) ) (VP (VBZ is) (NP (NP (DT a) (NN tube) ) (SBAR (WHPP (IN in) (WHNP (WDT which) ) ) (S (NP (DT a) (NN body) (NN fluid) ) (VP (VBZ circulates) ) ) ) ) ) (..) ) )
WSD: a tube in which a body_fluid circulates

77 AI and NLP MEANING: MCR1
vaso_ n 23-NOUN.QUANTITY NUMBER
SUMO: &%ConstantQuantity+
LOGICAL FORMULA: glass:NN(x1) -> quantity:NN(x1) glass:NN(x2) hold:VB(e1, x2)
PARSING: (TOP (S (NP (NN glass) ) (VP (VP (VBZ is) (NP (DT the) (NN quantity) ) (NP (DT a) (NN glass) ) ) (VP (MD will) (VP (VB hold) ) ) ) (..) ) )
WSD: the quantity a glass will hold

78 AI and NLP MEANING: MCR and consistency checking
n blow &%Breathing+ anatomy
v blow &%Breathing+ medicine
v exhale &%Breathing+ biology
v exhale &%Breathing+ medicine
a exhaled &%Breathing+ factotum
a exhaling &%Breathing+ factotum
n expiration &%Breathing+ anatomy
a expiratory &%Breathing+ anatomy
v expire &%Breathing+ medicine
a inhalant &%Breathing+ anatomy
n inhalation &%Breathing+ anatomy
v inhale &%Breathing+ medicine
a inhaled &%Breathing+ factotum
a inhaling &%Breathing+ factotum
n pant &%Breathing+ anatomy
v pant &%Breathing+ medicine
n panting &%Breathing+ anatomy
a panting &%Breathing+ factotum
r pantingly &%Breathing+ factotum
...

79 AI and NLP MEANING: MCR and consistency checking

80 AI and NLP  Does an orchard apple tree have leaves?  Does an orchard apple tree have fruits?  Does a cactus have leaves? MEANING: MCR and consistency checking

81 Use and design of ontologies for NLP and the Semantic Web Example SUMO: Boiling
(subclass Boiling StateChange)
(documentation Boiling "The Class of Processes where an Object is heated and converted from a Liquid to a Gas.")
(=> (instance ?BOIL Boiling) (exists (?HEAT) (and (instance ?HEAT Heating) (subProcess ?HEAT ?BOIL))))
"if instance BOIL Boiling, then there exists HEAT such that instance HEAT Heating and subProcess HEAT BOIL"
MEANING: MCR and consistency checking

82 AI and NLP MEANING
hospital_1 (building_industry, medicine, town_planning; artifact): StationaryArtifact+ Artifact+ Building+ Object+
ISA health_facility_1 (building_industry, medicine, town_planning; artifact): Building+ Artifact+ Building+ Object+
hospital_1: a health facility where patients receive treatment
patient_1 (medicine; person): patient+ Function+ Human+ Living+ Object+
receive_2 (factotum; change): Getting+ Dynamic= Experience=
treatment_1 (medicine; act): TherapeuticProcess+ Agentive= Cause+ Condition= Dynamic= Purpose= Social= UnboundedEvent+

83 AI and NLP MEANING FRAMENET: cure.n

Frame Element | Core Type
Affliction | Core
Body_part | Core
Degree | Peripheral
Duration | Extra-Thematic
Healer | Core
Manner | Peripheral
Medication | Core
Motivation | Extra-Thematic
Patient | Core
Place | Peripheral
Purpose | Extra-Thematic
Time | Peripheral
Treatment | Core

84 AI and NLP MEANING
hospital_1 (building_industry, medicine, town_planning; artifact): StationaryArtifact+ Artifact+ Building+ Object+
ISA health_facility_1 (building_industry, medicine, town_planning; artifact): Building+ Artifact+ Building+ Object+
hospital_1: a health facility where patients receive treatment
patient_1 (medicine; person): patient+ Function+ Human+ Living+ Object+
receive_2 (factotum; change): Getting+ Dynamic= Experience=
treatment_1 (medicine; act): TherapeuticProcess+ Agentive= Cause+ Condition= Dynamic= Purpose= Social= UnboundedEvent+
Frame elements assigned: PLACE, PATIENT, TREATMENT

85 AI and NLP MEANING Automatic selection of Base Concepts
n group_1, grouping -> social_group -> organisation_2, organization -> establishment_2, institution -> faith_3, religion -> Christianity_2, church_1, Christian_church
n entity_1, something -> object_1, physical_object -> artifact_1, artefact -> construction_3, structure -> building_1, edifice -> place_of_worship_1, house_of_prayer_1, house_of_God -> church_2, church_building
n act_2, human_action_1, human_activity -> activity -> ceremony -> religious_ceremony_1, religious_ritual -> service_3, religious_service_1, divine_service -> church_3, church_service_1

86 AI and NLP MEANING MCR volumes

Version | nouns | verbs | adj | adv | synsets | relations
WN (1.6, 2.0) | 74,488 | 12,754 | 18,523 | 3,612 | 109,377 | 180,303
XWN | | | | | | 550,922
SemCor | | | | | | 203,546
BNC | | | | | | 707,618
Total | | | | | | 1,642,389


88 AI and NLP MEANING: Semantic Interpretation (Aznar) (invita) (al Dalai_Lama) (a un almuerzo oficial) -- "(Aznar) (invites) (the Dalai_Lama) (to an official lunch)"; predicate invitar with roles objeto (object), origen (origin), acto (act), destino (destination)

89 AI and NLP Conclusion  NLU: semantics and pragmatics A COLLECTION OF QUESTIONS AND ANSWERS FROM TRIALS HELD IN SPAIN. Published in the journal of the Madrid Bar Association.  "Were you present when your photograph was taken?"  "Were you alone, or were you the only one?"  "Was it you, or your younger brother, who died in the war?"  "Do you claim that he killed you?"  "How far apart were the vehicles at the moment of the collision?"  "You remained there until you left, isn't that right?"

90 AI and NLP Conclusion  Question: "Doctor, how many of your autopsies have you performed on dead people?"  Answer: "All my autopsies were performed on dead people"  Question: "Each of your answers must be verbal, OK? What school did you attend?"  Answer: "Verbal" (laughter and jokes in the courtroom)  Question: "Do you remember at what time you examined the body?"  Answer: "The autopsy began around 8:30"  Question: "Was Mr. Perez Tomadilla dead at that time?"  Answer: "No, he was sitting on the table wondering why I was performing his autopsy."  Question: "Were you shot in the middle of the fracas?"  Answer: "No, I was shot between the fracas and the navel."

91 AI and NLP Conclusion  Question: "Doctor, before performing the autopsy, did you check for a pulse?"  Answer: "No"  Question: "Did you check the blood pressure?"  Answer: "No"  Question: "Did you check for breathing?"  Answer: "No"  Question: "So, is it possible that the patient was alive when you began the autopsy?"  Answer: "No"  Question: "How can you be so sure, Doctor?"  Answer: "Because his brain was on my desk, in a jar"  Question: "But could the patient nevertheless still have been alive?"  Answer: "It is possible that he was alive and practicing law somewhere."

