For Friday: Read chapter 22. Program 4 due. Program 4: Any questions?

For Friday
–Read chapter 22
–Program 4 due
–Program 4: Any questions?

Learning Mini-Project
–Worth two homeworks. Due Monday.
–Foil6 is available in /home/mecalif/public/itk340/foil. A manual and sample data files are there as well.
–Create a data file that will allow FOIL to learn rules for a sister/2 relation from background relations of parent/2, male/1, and female/1. You can look in the prolog folder of my 327 folder for sample data if you like.
–Electronically submit your data file, which should be named sister.d, and turn in a hard copy of the rules FOIL learns.

Strategies for Learning a Single Rule
Top-Down (General to Specific):
–Start with the most general (empty) rule.
–Repeatedly add feature constraints that eliminate negatives while retaining positives.
–Stop when only positives are covered.
Bottom-Up (Specific to General):
–Start with a most specific rule (a complete description of a single instance).
–Repeatedly eliminate feature constraints in order to cover more positive examples.
–Stop when further generalization results in covering negatives.
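To make the top-down strategy concrete, here is a minimal Python sketch of general-to-specific search over attribute-value data. The data layout (dicts of feature values) and the scoring rule are illustrative assumptions, not FOIL itself, and the sketch assumes every example assigns every feature.

```python
# Minimal sketch: top-down (general-to-specific) learning of one rule.
# A rule is a dict of {feature: required_value} constraints.

def covers(rule, example):
    return all(example.get(f) == v for f, v in rule.items())

def learn_one_rule(positives, negatives, features):
    rule = {}  # most general rule: no constraints, covers everything
    while any(covers(rule, n) for n in negatives):
        # Try each constraint not yet in the rule; keep the one that
        # eliminates the most negatives while retaining the most positives.
        best, best_score = None, None
        for f in features:
            if f in rule:
                continue
            for v in {e[f] for e in positives if covers(rule, e)}:
                cand = dict(rule, **{f: v})
                p = sum(covers(cand, e) for e in positives)
                n = sum(covers(cand, e) for e in negatives)
                if p == 0:
                    continue  # a rule covering no positives is useless
                score = (-n, p)  # fewer negatives first, then more positives
                if best_score is None or score > best_score:
                    best, best_score = cand, score
        if best is None:
            break  # no remaining constraint helps; no pure rule exists
        rule = best
    return rule
```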

FOIL
Basic top-down sequential covering algorithm adapted for Prolog clauses.
Background is provided extensionally.
Initialize the clause for target predicate P to P(X1,...,Xr) :- .
Possible specializations of a clause include adding all possible literals:
–Qi(V1,...,Vr)
–not(Qi(V1,...,Vr))
–Xi = Xj
–not(Xi = Xj)
where the X's are variables in the existing clause, at least one of V1,...,Vr is an existing variable, and the others can be new.
Allow recursive literals if they do not cause infinite regress.
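A hedged sketch of the specialization step just described: enumerating FOIL-style candidate literals for the current clause. The predicate table, the bounded pool of new variables, and the string representation are our own illustrative choices; FOIL proper also canonicalizes candidates that are equivalent up to renaming.

```python
# Sketch: generate candidate (possibly negated) literals for one clause.
from itertools import product

def candidate_literals(clause_vars, predicates, next_var=0):
    """clause_vars: variables already in the clause, e.g. ['X', 'Y'].
    predicates: {name: arity}. Returns literal strings."""
    literals = []
    for name, arity in predicates.items():
        # allow a bounded pool of fresh variables alongside existing ones
        new_vars = [f"V{next_var + i}" for i in range(arity)]
        for args in product(clause_vars + new_vars, repeat=arity):
            # at least one argument must be a variable already in the clause
            if not any(a in clause_vars for a in args):
                continue
            lit = f"{name}({','.join(args)})"
            literals += [lit, f"not({lit})"]
    # equality literals between existing variables, and their negations
    for i, a in enumerate(clause_vars):
        for b in clause_vars[i + 1:]:
            literals += [f"{a}={b}", f"not({a}={b})"]
    return literals

# candidate_literals(['X', 'Y'], {'edge': 2, 'path': 2}) includes
# 'edge(X,Y)', 'edge(X,V0)', 'path(V0,Y)', 'not(edge(X,Y))', 'X=Y', ...
```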

FOIL Input Data
Consider the example of finding a path in a directed acyclic graph.
Intended clauses:
path(X,Y) :- edge(X,Y).
path(X,Y) :- edge(X,Z), path(Z,Y).
Examples:
edge: {<1,2>, <1,3>, <3,6>, <4,2>, <4,6>, <6,5>}
path: {<1,2>, <1,3>, <1,6>, <1,5>, <3,6>, <3,5>, <4,2>, <4,6>, <4,5>, <6,5>}
Negative examples of the target predicate can be provided directly, or produced indirectly using a closed world assumption: every pair not in the positive tuples for path.
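This data can be reproduced mechanically. The short sketch below (helper names are ours) derives the positive path/2 tuples as the transitive closure of edge/2, and the negatives by the closed world assumption over ordered pairs of distinct nodes.

```python
# Reproduce the slide's training data from the edge relation alone.

edges = {(1, 2), (1, 3), (3, 6), (4, 2), (4, 6), (6, 5)}

def transitive_closure(rel):
    closure = set(rel)
    while True:
        extra = {(x, w) for (x, y) in closure for (z, w) in closure if y == z}
        if extra <= closure:
            return closure
        closure |= extra

pos = transitive_closure(edges)
nodes = {n for pair in edges for n in pair}
neg = {(x, y) for x in nodes for y in nodes if x != y and (x, y) not in pos}

print(sorted(pos))  # the 10 positive path tuples from the slide
print(len(neg))     # 20 negatives under the closed world assumption
```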

Example Induction
+ : {<1,2>, <1,3>, <1,6>, <1,5>, <3,6>, <3,5>, <4,2>, <4,6>, <4,5>, <6,5>}
- : {<1,4>, <2,1>, <2,3>, <2,4>, <2,5>, <2,6>, <3,1>, <3,2>, <3,4>, <4,1>, <4,3>, <5,1>, <5,2>, <5,3>, <5,4>, <5,6>, <6,1>, <6,2>, <6,3>, <6,4>}
Start with the empty rule: path(X,Y) :- .
Among others, consider adding the literal edge(X,Y) (also consider edge(Y,X), edge(X,Z), edge(Z,X), path(Y,X), path(X,Z), path(Z,X), X=Y, and negations).
It covers 6 positive tuples and NO negative tuples.
Create the "base case" and remove the covered examples:
path(X,Y) :- edge(X,Y).

+ : {<1,6>, <1,5>, <3,5>, <4,5>}
- : {<1,4>, <2,1>, <2,3>, <2,4>, <2,5>, <2,6>, <3,1>, <3,2>, <3,4>, <4,1>, <4,3>, <5,1>, <5,2>, <5,3>, <5,4>, <5,6>, <6,1>, <6,2>, <6,3>, <6,4>}
Start with a new empty rule: path(X,Y) :- .
Consider the literal edge(X,Z) (among others...): the 4 remaining positives satisfy it, but so do 10 of the 20 negatives.
Current rule: path(X,Y) :- edge(X,Z).
Consider the literal path(Z,Y) (as well as edge(X,Y), edge(Y,Z), edge(X,Z), path(Z,X), etc.): no negatives are covered, so the clause is complete:
path(X,Y) :- edge(X,Z), path(Z,Y).
The new clause actually covers all remaining positive tuples of path, so the definition is complete.
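As a sanity check on the counts quoted in this trace, the following snippet (reusing edges, pos, and neg from the earlier sketch) confirms that after the base case removes the 6 edge pairs, edge(X,Z) covers all 4 remaining positives and exactly 10 of the 20 negatives.

```python
# Verify the coverage counts from the induction trace above.
remaining_pos = pos - edges                  # positives left after base case
has_out_edge = {x for (x, _) in edges}       # X for which some edge(X,Z) holds

print(len(remaining_pos))                                   # 4
print(all(x in has_out_edge for (x, _) in remaining_pos))   # True
print(sum((x in has_out_edge) for (x, _) in neg))           # 10
```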

Picking the Best Literal
Based on information gain (similar to ID3):
gain(L) = |p| * (log2(|p| / (|p| + |n|)) - log2(|P| / (|P| + |N|)))
–P is the number of positives covered before adding literal L
–N is the number of negatives covered before adding literal L
–p is the number of positives covered after adding literal L
–n is the number of negatives covered after adding literal L
Given n predicates of arity m, there are O(n·2^m) possible literals to choose from, so the branching factor can be quite large.
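The heuristic itself is a one-liner. A sketch, with the degenerate p = 0 case handled by convention (our choice, not part of the slide):

```python
# FOIL information gain, as given in the formula above.
from math import log2

def foil_gain(p, n, P, N):
    """p, n: positives/negatives covered after adding the literal;
    P, N: positives/negatives covered before adding it."""
    if p == 0 or P == 0:
        return float("-inf")  # a literal covering no positives is useless
    return p * (log2(p / (p + n)) - log2(P / (P + N)))

# For the path example: before edge(X,Z), P=4, N=20; after, p=4, n=10.
print(foil_gain(4, 10, 4, 20))  # ≈ 3.11 bits of gain
```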

Other Approaches
–Golem
–CHILL
–FOIDL
–BUFOIDL

Domains
Any kind of concept learning where background knowledge is useful:
–Natural Language Processing
–Planning
–Chemistry and biology: DNA, protein structure

Natural Language Processing What’s the goal?

Communication
Communication for the speaker:
–Intention: Deciding why, when, and what information should be transmitted. May require planning and reasoning about agents' goals and beliefs.
–Generation: Translating the information to be communicated into a string of words.
–Synthesis: Outputting the string in the desired modality, e.g. text on a screen or speech.

Communication (cont.)
Communication for the hearer:
–Perception: Mapping the input modality to a string of words, e.g. optical character recognition or speech recognition.
–Analysis: Determining the information content of the string.
Syntactic interpretation (parsing): find the correct parse tree showing the phrase structure.
Semantic interpretation: extract the (literal) meaning of the string in some representation, e.g. FOPC.
Pragmatic interpretation: consider the effect of the overall context on the meaning of the sentence.
–Incorporation: Deciding whether or not to believe the content of the string and add it to the KB.

Ambiguity
Natural language sentences are highly ambiguous and must be disambiguated:
–I saw the man on the hill with the telescope.
–I saw the Grand Canyon flying to LA.
–I saw a jet flying to LA.
–Time flies like an arrow.
–Horse flies like a sugar cube.
–Time runners like a coach.
–Time cars like a Porsche.
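As an illustration, a toy grammar (ours, not from the lecture) makes the prepositional-phrase attachment ambiguity in the first sentence visible. This assumes NLTK is installed (pip install nltk).

```python
# Enumerate the parses of a classic PP-attachment ambiguity with NLTK.
import nltk

grammar = nltk.CFG.fromstring("""
S   -> NP VP
NP  -> Det N | Det N PP | 'I'
VP  -> V NP | VP PP
PP  -> P NP
Det -> 'the'
N   -> 'man' | 'hill' | 'telescope'
V   -> 'saw'
P   -> 'on' | 'with'
""")

parser = nltk.ChartParser(grammar)
sent = "I saw the man on the hill with the telescope".split()
for tree in parser.parse(sent):
    print(tree)  # several trees: each PP can attach to a noun or to 'saw'
```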

Syntax
Syntax concerns the proper ordering of words and its effect on meaning:
–The dog bit the boy.
–The boy bit the dog.
–* Bit boy the dog the.
–Colorless green ideas sleep furiously.

Semantics
Semantics concerns the meaning of words, phrases, and sentences, generally restricted to "literal meaning":
–"plant" as a photosynthetic organism
–"plant" as a manufacturing facility
–"plant" as the act of sowing

Pragmatics
Pragmatics concerns the overall communicative and social context and its effect on interpretation:
–Can you pass the salt?
–Passerby: Does your dog bite?
Clouseau: No.
Passerby: (pets dog) Chomp! I thought you said your dog didn't bite!!
Clouseau: That, sir, is not my dog!

Modular Processing
A pipeline of stages, each mapping one representation to the next:
sound waves -> (speech recognition: acoustic/phonetic) -> words -> (parsing: syntax) -> parse trees -> (semantic interpretation) -> literal meaning -> (pragmatic interpretation) -> meaning

Examples
Phonetics:
–"grey twine" vs. "great wine"
–"youth in Asia" vs. "euthanasia"
–"yawanna" -> "do you want to"
Syntax:
–I ate spaghetti with a fork.
–I ate spaghetti with meatballs.

More Examples
Semantics:
–I put the plant in the window.
–Ford put the plant in Mexico.
–The dog is in the pen.
–The ink is in the pen.
Pragmatics:
–The ham sandwich wants another beer.
–John thinks vanilla.