Contextual Vocabulary Acquisition: From Algorithm to Curriculum

Contextual Vocabulary Acquisition: From Algorithm to Curriculum

Michael W. Kibby, Department of Learning & Instruction and The Reading Center
William J. Rapaport, Department of Computer Science & Engineering, Department of Philosophy, and Center for Cognitive Science
Karen M. Wieland, Department of Learning & Instruction, The Reading Center, and The Nichols School

NSF ROLE Grant REC-0106338

Definition of “CVA”

“Contextual Vocabulary Acquisition” =def the acquisition of word meanings from text
• both “incidental” & “deliberate”
• by reasoning about contextual clues & background knowledge (linguistic, factual, commonsense), including hypotheses from prior encounters (if any) with the word
• without external sources of help: no dictionaries, no people

CVA: From Algorithm to Curriculum

• Computational theory of CVA
  – based on algorithms developed by Karen Ehrlich (1995) and on verbal protocols (case studies)
  – implemented in SNePS, a semantic-network-based knowledge-representation & reasoning system (Stuart C. Shapiro & colleagues)
• Educational curriculum to teach CVA
  – based on our algorithms & protocols
  – to improve vocabulary & reading comprehension
  – joint work with Michael Kibby & Karen Wieland, Center for Literacy & Reading Instruction

People Do “Incidental” CVA

• We know more words than we are explicitly taught
  – Average high-school grad knows ~45K words ⇒ learned ~2.5K words/year (over 18 yrs.)
  – But only ~400 are taught per school-year, i.e., ~4,800 in 12 years of school (~10% of the total)
• Most word meanings are learned from context
  – including oral & perceptual contexts
  – “incidentally” (unconsciously)
• How?

People Also Do “Deliberate” CVA

• You’re reading; you understand everything you read, until…
• You come across a new word
  – not in the dictionary
  – no one to ask
• So, you try to “figure out” its meaning from “context”
• How? Guess? Derive? Infer? Deduce? Educe? Construct? Predict? …
• Our answer: compute it from an inferential search of the “context”, including background knowledge

What does ‘brachet’ mean?

Move into small groups:
• If ≤ 20, then divide into 3 groups, facilitated by Bill, Michael, & Karen
• If > 20, then: save 5 seats in front; choose 5 people at random for demo

(From Malory’s Morte D’Arthur [page # in brackets])

1. There came a white hart running into the hall with a white brachet next to him, and thirty couples of black hounds came running after them. [66]
   As the hart went by the sideboard, the white brachet bit him. [66]
   The knight arose, took up the brachet and rode away with the brachet. [66]
   A lady came in and cried aloud to King Arthur, “Sire, the brachet is mine”. [66]
   There was the white brachet which bayed at him fast. [72]
18. The hart lay dead; a brachet was biting on his throat, and other hounds came behind. [86]

[Presenter notes: Predefine ‘hart’, ‘sideboard’, ‘bay’. Sentence 3: if no one suggests that the brachet is small, ask how big it is.]

Computational cognitive theory of how to learn word meanings

• From context
  – i.e., text + grammatical info + reader’s prior knowledge
• With no external sources (human, on-line)
  – which may be unavailable, incomplete, or misleading
• Domain-independent
  – but more prior domain knowledge yields better definitions
• “Definition” = hypothesis about the word’s meaning
  – revisable each time the word is seen

Cassie learns what “brachet” means:

• Background info about: harts, animals, King Arthur, etc.
• No info about: brachets
• Input: formal-language (SNePS) version of simplified English

A hart runs into King Arthur’s hall.
  • In the story, B12 is a hart.
  • In the story, B13 is a hall.
  • In the story, B13 is King Arthur’s.
  • In the story, B12 runs into B13.
A white brachet is next to the hart.
  • In the story, B14 is a brachet.
  • In the story, B14 has the property “white”.
  • Therefore, brachets are physical objects.
    (deduced while reading; Cassie believes that only physical objects have color)

--> (defineNoun "brachet")
Definition of brachet:
  Class Inclusions: phys obj,
  Possible Properties: white,
  Possibly Similar Items: animal, mammal, deer, horse, pony, dog,

I.e., a brachet is a physical object that can be white and that might be like an animal, mammal, deer, horse, pony, or dog.

A hart runs into King Arthur’s hall.
A white brachet is next to the hart.
The brachet bites the hart’s buttock.

--> (defineNoun "brachet")
Definition of brachet:
  Class Inclusions: animal,
  Possible Actions: bite buttock,
  Possible Properties: white,
  Possibly Similar Items: mammal, pony,

A hart runs into King Arthur’s hall.
A white brachet is next to the hart.
The brachet bites the hart’s buttock.
The knight picks up the brachet.
The knight carries the brachet.

--> (defineNoun "brachet")
Definition of brachet:
  Class Inclusions: animal,
  Possible Actions: bite buttock,
  Possible Properties: small, white,
  Possibly Similar Items: mammal, pony,

A hart runs into King Arthur’s hall.
A white brachet is next to the hart.
The brachet bites the hart’s buttock.
The knight picks up the brachet.
The knight carries the brachet.
The lady says that she wants the brachet.

--> (defineNoun "brachet")
Definition of brachet:
  Class Inclusions: animal,
  Possible Actions: bite buttock,
  Possible Properties: valuable, small, white,
  Possibly Similar Items: mammal, pony,

A hart runs into King Arthur’s hall.
A white brachet is next to the hart.
The brachet bites the hart’s buttock.
The knight picks up the brachet.
The knight carries the brachet.
The lady says that she wants the brachet.
The brachet bays at Sir Tor.
  [background knowledge: only hunting dogs bay]

--> (defineNoun "brachet")
Definition of brachet:
  Class Inclusions: hound, dog,
  Possible Actions: bite buttock, bay, hunt,
  Possible Properties: valuable, small, white,

I.e., a brachet is a hound (a kind of dog) that can bite, bay, and hunt, and that may be valuable, small, and white.

General Comments

• System’s behavior ≈ human protocols
• System’s definition ≈ OED’s definition:
  – a brachet is “a kind of hound which hunts by scent”

Computational cognitive theory of how to learn word meanings from context (cont.)

3 kinds of vocabulary acquisition:
• Construct a new definition of an unknown word
  – What does ‘brachet’ mean?
• Fully revise the definition of a misunderstood word
  – Does “smiting” entail killing?
• Expand the definition of a word used in a new sense
  – Can you “dress” (i.e., clothe) a spear?

Initial hypothesis; revision(s) upon further encounter(s); converges to a stable, dictionary-like definition, still subject to revision.

State of the Art: Computational Linguistics

• Information-extraction systems
• Autonomous intelligent agents
• There can be no complete lexicon
• Such systems/agents shouldn’t have to stop to ask questions

State of the Art: Computational Linguistics

• Granger 1977: “Foul-Up”
  – based on Schank’s theory of “scripts” (schema theory)
  – our system is not restricted to scripts
• Zernik 1987: self-extending phrasal lexicon
  – uses a human informant
  – our system is really “self-extending”
• Hastings 1994: “Camille”
  – maps an unknown word to a known concept in an ontology
  – our system can learn new concepts
• Word-Sense Disambiguation: given an ambiguous word & a list of all its meanings, determine the “correct” meaning
  – a “multiple-choice test”
  – Our system: given a new word, compute its meaning (an “essay question”) [Ellen Prince]

State of the Art: Vocabulary Learning (I)

Elshout-Mohr / van Daalen-Kapteijns 1981, 1987:
• Application of Winston’s AI “arch”-learning theory
• (Good) reader’s model of a new word = frame
  – attribute slots, default values
  – revision by updating slots & values
• Poor readers update by replacing entire frames
• But EM & vDK used:
  – made-up words
  – carefully constructed contexts
  – presented in a specific order

Elshout-Mohr & van Daalen-Kapteijns

Experiments with neologisms in 5 artificial contexts:
1. When you are used to a view it is depressing when you live in a room with kolpers.
   [superordinate information]
2. At home he had to work by artificial light because of those kolpers.
3. During a heat wave, people want kolpers, so sun-blind sales increase.
   [contexts showing 2 differences from the superordinate]
4. I was afraid the room might have kolpers, but plenty of sunlight came into it.
5. This house has kolpers all summer until the leaves fall out.
   [contexts showing 2 counterexamples due to the 2 differences]

State of the Art: Psychology

Johnson-Laird 1987:
• Word understanding ≠ definition
• Definitions aren’t stored
  – “During the Renaissance, Bernini cast a bronze of a mastiff eating truffles.”

State of the Art: Psychology

Sternberg et al. 1983, 1987:
• Cues to look for (= slots for the frame):
  – spatiotemporal cues
  – value cues
  – properties
  – functions
  – cause/enablement information
  – class memberships
  – synonyms/antonyms
• To acquire new words from context:
  – distinguish relevant from irrelevant information
  – selectively combine the relevant information
  – compare this information with previous beliefs

Sternberg

The couple there on the blind date was not enjoying the festivities in the least. An acapnotic, he disliked her smoking; and when he removed his hat, she, who preferred “ageless” men, eyed his increasing phalacrosis and grimaced.

State of the Art: Vocabulary Learning (II)

Some dubious contributions:
• Mueser 1984: “Practicing Vocabulary in Context”
  – BUT: “context” = definition!
• Clarke & Nation 1980: a “strategy” (algorithm?)
  1. Look at the word & its context; determine its part of speech
  2. Look at the grammatical context (e.g., “who does what to whom”?)
  3. Look at the wider context [e.g., search for Sternberg-like clues]
  4. Guess the word; check your guess

CVA: From Algorithm to Curriculum

• “Guess the word” = “then a miracle occurs”
• Surely, we computer scientists can “be more explicit”!

CVA: From algorithm to curriculum …

• Treat “guess” as a procedure call (“subroutine”)
• Fill in the details with our algorithm
• Convert the algorithm into a curriculum
  – to enhance students’ abilities to use deliberate CVA strategies
  – to improve reading comprehension

… and back again!
• Use knowledge gained from CVA case studies to improve the algorithm
• I.e., use Cassie to learn how to teach humans & use humans to learn how to teach Cassie

Why not use a dictionary?

Because:
• People are lazy (!)
• Dictionaries are not always available
• Dictionaries are always incomplete
• Dictionary definitions are not always useful
  – ‘chaste’ =df clean, spotless / “new dishes are chaste”
  – ‘college’ =df a body of clergy living together and supported by a foundation
• Most words are learned via incidental CVA, not via dictionaries
• Most importantly: dictionary definitions are just more contexts!

How Does Our System Work?

• Uses a semantic-network computer system
  – semantic networks = “concept maps”
• Serves as a model of the reader
• Represents:
  – the reader’s prior knowledge
  – the text being read
• Can reason about the text and the reader’s knowledge
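To make the idea concrete, here is a minimal, hypothetical sketch in plain Python (not SNePS): story assertions and a prior-knowledge rule live in one knowledge base, and adding a story fact can trigger an inference, as in the “brachets are physical objects” example above. The relation names and the `add`/`infer` helpers are invented for illustration.

```python
# Illustrative sketch only -- NOT SNePS. A toy propositional KB showing the
# flavor of "represent the story + prior knowledge, then reason while reading".

kb = set()

def add(fact):
    """Assert a proposition (a tuple) and run the one toy inference rule."""
    kb.add(fact)
    infer()

def infer():
    # Toy version of the prior-knowledge rule "only physical objects have color":
    # if x is a member of class c and x has a color property,
    # then c is a kind of physical object.
    colors = {"white", "black"}
    for (rel1, x, c) in list(kb):
        if rel1 == "member-of":
            for (rel2, y, prop) in list(kb):
                if rel2 == "has-property" and y == x and prop in colors:
                    kb.add(("subclass-of", c, "physical object"))

# Story assertions (cf. the B14/brachet example above):
add(("member-of", "B14", "brachet"))
add(("has-property", "B14", "white"))

print(("subclass-of", "brachet", "physical object") in kb)   # True
```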

Fragment of reader’s prior knowledge:

m3 = In “real life”, white is a color
m6 = In “real life”, harts are deer
m8 = In “real life”, deer are mammals
m11 = In “real life”, halls are buildings
m12 = In “real life”, b1 is named “King Arthur”
m14 = In “real life”, b1 is a king
(etc.)

m16 = if v3 has property v2, & v2 is a color, & v3 is a member of class v1, then v1 is a kind of physical object

Reading the story:

m17 = In the story, b2 is a hart
m24 = In the story, the hart runs into b3
  (b3 is King Arthur’s hall) – not shown
  (harts are deer) – not shown

[Figure: the entire network, showing the reader’s mental context consisting of prior knowledge, the story, & inferences. The definition algorithm searches this network & abstracts parts of it to produce a (preliminary) definition of ‘brachet’.]

Implementation

SNePS (Stuart C. Shapiro & SNeRG):
• Intensional, propositional semantic-network knowledge-representation & reasoning system
• Node-based & path-based reasoning
  – i.e., logical inference & generalized inheritance
• SNeBR belief-revision system
  – used for revision of definitions
• SNaLPS natural-language input/output
• “Cassie”: computational cognitive agent

How It Works

• SNePS represents background knowledge + text information in a single, consolidated semantic network
• Algorithms deductively search the network for slot-fillers for a definition frame
• The search is guided by the desired slots
  – e.g., it prefers general info over particular info, but takes what it can get

Noun Algorithm

Find or infer:
• Basic-level class memberships (e.g., “dog”, rather than “animal”)
  – else most-specific-level class memberships
  – else names of individuals
• Properties of Ns (else, of individual Ns)
• Structure of Ns (else …)
• Functions of Ns (else …)
• Acts that Ns perform (else …)
• Agents that perform acts w.r.t. Ns & the acts they perform (else …)
• Ownership
• Synonyms
• Else do: “syntactic/algebraic manipulation”
  – “Al broke a vase” ⇒ a vase is something Al broke
  – or: a vase is a breakable physical object
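A toy sketch (plain Python, not the actual SNePS algorithm) of this “prefer the general, but take what you can get” slot-filling order; the `kb` dictionary and its query keys are invented stand-ins:

```python
# Hypothetical sketch of the noun-definition frame filling, with fallbacks.
# Slot names follow the slide; the KB lookup keys are invented for this example.

def define_noun(word, kb):
    frame = {}

    # Class inclusions: basic-level class, else most-specific class, else names.
    frame["class inclusions"] = (
        kb.get(("basic_level_classes", word))
        or kb.get(("most_specific_classes", word))
        or kb.get(("names_of_individuals", word))
        or []
    )

    # Remaining slots: prefer info about Ns in general, else about individual Ns.
    for slot in ["properties", "structure", "functions",
                 "acts", "agents_and_acts", "ownership", "synonyms"]:
        frame[slot] = (kb.get((slot, word))
                       or kb.get((slot + "_of_individuals", word))
                       or [])
    return frame

# Tiny stand-in KB for the brachet example (values are illustrative only):
kb = {
    ("most_specific_classes", "brachet"): ["hound", "dog"],
    ("acts", "brachet"): ["bite buttock", "bay", "hunt"],
    ("properties", "brachet"): ["valuable", "small", "white"],
}
print(define_noun("brachet", kb))
```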

Verb Algorithm

Find or infer:
• Predicate structure: categorize arguments/cases
• Results of V’ing: effects, state changes
• Enabling conditions for V

Future work:
• Classification of verb-type
• Synonyms

[Also: preliminary work on an adjective algorithm]

Belief Revision

• Used to revise definitions of words used with a different sense from the current meaning hypothesis
• SNeBR (ATMS; Martins & Shapiro 88): if inference leads to a contradiction, then:
  1. SNeBR asks the user to remove the culprit(s)
  2. & automatically removes the consequences inferred from the culprit
• SNePSwD (SNePS with Defaults; Martins & Cravo 91)
  – currently used to automate step 1, above
• AutoBR (Johnson & Shapiro, in progress) & a new default reasoner (Bhushan & Shapiro, in progress)
  – will replace SNePSwD

[OMIT IF NO TIME]

Revision & Expansion

Removal & revision are being automated via SNePSwD by ranking all propositions with kn_cat (from most certain to least certain):
• intrinsic – info re: language; fundamental background info (“before” is transitive)
• story – info in the text (“King Lot rode to town”)
• life – background info without variables or inference (“dogs are animals”)
• story-comp – info inferred from the text (King Lot is a king, rode on a horse)
• life-rule.1 – everyday commonsense background info (BearsLiveYoung(x) ⇒ Mammal(x))
• life-rule.2 – specialized background info (x smites y ⇒ x kills y by hitting y)
• questionable – already-revised life-rule.2; not part of input

[OMIT IF NO TIME]
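A hypothetical sketch of how such a ranking can drive culprit selection when a contradiction is found (invented names, not the SNePSwD interface):

```python
# Sketch: when a contradiction is detected, retract the least certain culprit
# according to the kn_cat ranking above. Illustrative only.

KN_CAT_RANK = {          # lower number = more certain
    "intrinsic": 0,
    "story": 1,
    "life": 2,
    "story-comp": 3,
    "life-rule.1": 4,
    "life-rule.2": 5,
    "questionable": 6,
}

def pick_culprit(culprits):
    """culprits: list of (proposition, kn_cat) pairs underlying a contradiction."""
    return max(culprits, key=lambda c: KN_CAT_RANK[c[1]])

culprits = [
    ("King Lot smote down King Arthur", "story"),
    ("smite(x,y,t) -> hit(x,y,t) & dead(y,t)", "life-rule.2"),
]
print(pick_culprit(culprits))   # the life-rule.2 proposition is retracted first
```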

Belief Revision: “smite”

Misunderstood word; 2-stage “subtractive” revision.
Background knowledge includes:
  (*) smite(x,y,t) ⇒ hit(x,y,t) & dead(y,t) & cause(hit(x,y,t), dead(y,t))

P1: King Lot smote down King Arthur
D1: If person x smites person y at time t, then x hits y at t, and y is dead at t
Q1: What properties does King Arthur have?
R1: King Arthur is dead.

P2: King Arthur drew Excalibur.
Q2: When did King Arthur do this?
SNeBR is invoked: King Arthur’s drawing Excalibur is inconsistent with his being dead.
(*) is replaced by: smite(x,y,t) ⇒ hit(x,y,t) & [dead(y,t) ⇒ cause(hit, dead)]
D2: If person x smites person y at time t, then x hits y at t & possibly (y is dead at t)

P3: [another passage in which ~(smiting ⇒ death)]
D3: If person x smites person y at time t, then x hits y at t

Belief Revision: “dress”

“Additive” revision.
Background info includes (both kn_cat = life-rule.1):
  (1) dresses(x,y) ⇒ ∃z[clothing(z) & wears(y,z)]
  (2) Spears don’t wear clothing

P1: King Arthur dressed himself.
D1: A person can dress itself; result: it wears clothing.

P2: King Claudius dressed his spear.
  [Cassie infers: King Claudius’s spear wears clothing.]
Q2: What wears clothing?
SNeBR is invoked: “KC’s spear wears clothing” is inconsistent with (2).
(1) is replaced by: dresses(x,y) ⇒ ∃z[clothing(z) & wears(y,z)] ∨ NEWDEF
  Replace (1), not (2), because of the verb in the antecedent of (1) (Gentner)

P3: [other passages in which dressing spears precedes fighting]
D2: A person can dress a spear or a person; result: the person wears clothing, or the person is enabled to fight

What is (or should be) the “context” for CVA?

Figure out the meaning of a word from what?
• context (i.e., the text)?
  – Werner & Kaplan 52, McKeown 85, Schatz & Baldwin 86
• context and the reader’s background knowledge?
  – Granger 77, Sternberg 83, Hastings 94
• context including background knowledge?
  – Nation & Coady 88, Graesser & Bower 90

Note: if “context” = text, then context is external to the reader’s mind
• could also be spoken/visual/situative (still external)
• “background knowledge” is internal to the reader’s mind

Some Proposed Preliminary Definitions (to extract order out of confusion)

Unknown word for a reader =def a word or phrase that the reader
• has never seen before, or
• has only a vague idea of its meaning
  – there are different levels of knowing the meaning of a word

Notation: “X”

Proposed preliminary definitions

Text =def a (written) passage containing X
• from a single phrase or sentence … to several paragraphs

Proposed preliminary definitions

Co-text of X in some text =def the entire text “minus” X, i.e., the entire text surrounding X

E.g., if X = ‘brachet’ and the text = “There came a white hart running into the hall with a white brachet next to him, and thirty couples of black hounds came running after them.”,
then X’s co-text in this text = “There came a white hart running into the hall with a white ______ next to him, and thirty couples of black hounds came running after them.”

• Cf. “cloze” tests in psychology
  – But, in CVA, the reader seeks a meaning or definition, NOT a missing word or synonym: there’s no “correct” answer!
• “Co-text” is what many mean by “context” – BUT: they shouldn’t!
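In code, the notion is trivial; a toy illustration (plain Python, not part of the CVA system) that blanks out the unknown word, as in the cloze-style example above:

```python
# Toy illustration of "co-text": the text with the unknown word blanked out.

def co_text(text, unknown_word):
    return text.replace(unknown_word, "______")

text = ("There came a white hart running into the hall with a white brachet "
        "next to him, and thirty couples of black hounds came running after them.")
print(co_text(text, "brachet"))
```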

Proposed preliminary definitions

The reader’s prior knowledge =def the knowledge that the reader has when s/he begins to read the text and is able to recall as needed while reading
  – e.g., “knight picks up & carries brachet” ⇒? the brachet is small

Warnings:
• “knowledge” implies truth, so “prior beliefs” is a better term
• “prior” vs. “background” vs. “world”, etc.: see next slide!

Proposed preliminary definitions

Possible synonyms for “prior knowledge”, each with a different connotation:
• Background knowledge: can be used for information that the author assumes the reader to have
• World knowledge: general factual knowledge about things other than the text’s topic
• Domain knowledge: specialized, subject-specific knowledge about the text’s topic
• Commonsense knowledge: knowledge “everyone” has
  – e.g., CYC, “cultural literacy” (Hirsch)

These overlap:
• PK should include some CSK, might include some DK
• BK might include much DK

Steps towards a Proper Definition of “Context”

The context of X for a reader =def the co-text of X “+” the reader’s prior knowledge

Both are needed!
• After reading “the white brachet bit the hart in the buttock”, most subjects infer that brachets are (probably) animals, from:
  – that text, plus:
  – an available PK premise: “If x bites y, then x is (probably) an animal.”
• The inference is not an enthymeme! (an argument with a missing premise)

Proper definition of “context”:

(The inference is not an enthymeme because):
• When you read, you “internalize” the text
  – you “bring it into” your mind
  – Gärdenfors 1997, 1999; Jackendoff 2002
• The “missing” premise might be in the reader’s mind!
• This “internalized text” is more important than the actual words on paper:
  – Text: “I’m going to put the cat out”
  – Misread as: “I’m going to put the car out”
  – leads to a different understanding of “the text”
• What matters is what the reader thinks the text is, not what the text actually is
• Therefore …

Proper definition of “context”: Step 2

The context of X for a reader =def a single KB, consisting of:
• the reader’s internalized co-text of X
• “+” the reader’s prior knowledge

Proper definition of “context”:

But: What is “+”? Not mere conjunction or union!
• Active readers make inferences while reading.
  – From text = “a white brachet” & prior commonsense knowledge = “only physical objects have color”, the reader might infer that brachets are physical objects
  – From “The knight took up the brachet and rode away with the brachet.” & prior commonsense knowledge about size, the reader might infer that the brachet is small enough to be carried
• Whole > sum of parts: inference from [internalized text + PK] ⇒ new info not in the text or in PK
  – i.e., you can learn from reading!

Proper definition of “context”:

But: Whole < sum of parts!
• The reader can learn that some prior beliefs were mistaken
• Or: the reader can decide that the text is mistaken (less likely)
• Reading & CVA need belief revision!

Operation “+”:
• input: PK & internalized co-text
• output: “belief-revised integration” of the input, via:
  – Expansion: addition of new beliefs from the internalized co-text into PK, plus new inferences
  – Revision: retraction of inconsistent prior beliefs, together with inferences from them
  – Consolidation: elimination of further inconsistencies
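A schematic sketch of this “+” operation (invented names and degenerate stand-in functions; not the SNePS/SNeBR implementation):

```python
# Sketch of belief-revised integration of prior knowledge (PK) with the
# internalized co-text (ICT): expansion, revision, consolidation.

def integrate(pk, ict, infer, find_conflicts, pick_retraction):
    # Expansion: add the internalized co-text to PK and draw new inferences.
    kb = infer(pk | ict)
    # Revision: retract a member of each detected inconsistency
    # (in the real system, consequences of the retracted belief go too).
    for conflict in find_conflicts(kb):
        kb.discard(pick_retraction(conflict))
    # Consolidation: close under inference again and remove remaining conflicts.
    kb = infer(kb)
    for conflict in find_conflicts(kb):
        kb.discard(pick_retraction(conflict))
    return kb

# Degenerate stand-ins, just so the sketch runs:
identity_infer = lambda kb: set(kb)
no_conflicts = lambda kb: []
first = lambda conflict: next(iter(conflict))

print(integrate({"PK1", "PK2"}, {"I(T1)"}, identity_infer, no_conflicts, first))
```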

[Figure sequence (diagrams omitted): the reader’s prior knowledge (PK1–PK4) and the text (T1, T2, T3). Each text sentence Ti is internalized into the knowledge base as I(Ti); inference over the prior knowledge and the internalized text yields new propositions (P5, P6, P7), producing a single belief-revised (“B-R”) integrated KB.

Note: All “contextual” reasoning is done in this “context”: the B-R integrated KB.]

Proper definition of “context”:

The context that reader R should use to hypothesize a meaning for R’s internalization of unknown word X, as it occurs in text T, =def
the belief-revised integration of R’s prior knowledge with R’s internalization of the co-text T–X.
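Written compactly (the notation BR, PK_R, and I_R is introduced here for convenience; it is not from the slides):

```latex
% BR = belief-revised integration, PK_R = R's prior knowledge,
% I_R = R's internalization, T - X = the co-text of X in T.
\[
  \mathrm{context}_R(X, T) \;=_{\mathrm{def}}\;
  \mathrm{BR}\bigl(\mathrm{PK}_R,\; I_R(T - X)\bigr)
\]
```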

This definition agrees with…

• Cognitive-science & reading-theoretic views of text understanding
  – Schank 1982, Rumelhart 1985, etc.
• & AI techniques for text understanding:
  – The reader’s mind is modeled by a KB of prior knowledge, expressed in an AI language (for us: SNePS)
  – A computational cognitive agent reads the text, “integrating” the text info into its KB, making inferences & performing belief revision along the way
  – When asked to define a word, the agent deductively searches this single, integrated KB for information to fill the slots of a definition frame
  – The agent’s “context” for CVA = this single, integrated KB

Some Open Questions

• Roles of spoken/visual/situative contexts
• Relation of CVA “context” to formal theories of context (e.g., McCarthy, Guha…)
• Relation of I(T) to the prior KB; e.g.:
  – Is I(Ti) true in the prior KB? It is “accepted pro tem”.
  – Is I(T) a “subcontext” of the prior KB or of the B-R KB?
• How to “activate” relevant prior knowledge
• Etc.

Problem in Converting the Algorithm into a Curriculum

“A knight picks up a brachet and carries it away …”

Cassie:
• has a “perfect memory”
• is a “perfect reasoner”
• automatically infers that the brachet is small

People don’t always realize this:
• may need prompting: How big is the brachet?
• may need relevant background knowledge
• may need help in drawing inferences

Teaching CVA =? teaching general reading comprehension
• Vocabulary knowledge correlates with reading comprehension

Research Methodology

AI team:
• Develop, implement, & test better computational theories of CVA
• Translate them into English for use by the reading team

Reading team:
• Convert the algorithms into curriculum
• Think-aloud protocols
  – to gather new data for use by the AI team
  – as a curricular technique (case studies)

Web Page

http://www.cse.buffalo.edu/~rapaport/cva.html