Transition Network Grammars for Natural Language Analysis - W. A. Woods. In-Su Yoon, Pusan National University, School of Electrical and Computer Engineering.

Presentation transcript:

Transition Network Grammars for Natural Language Analysis - W. A. Woods. In-Su Yoon, Pusan National University, School of Electrical and Computer Engineering.

What’s a grammar? A grammar contains the knowledge about “legal” syntactic structure, represented as rewrite rules. The grammar defines the language.

Here’s a very simple example
S → NP VP
VP → VERB NP
NP → ART NOUN
NP → POSS NOUN
NOUN → frog | dog
ART → a
VERB → ate
POSS → my
Example sentence: My dog ate a frog.
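As a concrete illustration, this grammar and lexicon could be written down as plain data, for example as Python dictionaries (a minimal sketch; the variable names and encoding are choices made here, not part of the original slides):

```python
# Rewrite rules: each left-hand side maps to its possible expansions.
GRAMMAR = {
    "S":  [["NP", "VP"]],
    "VP": [["VERB", "NP"]],
    "NP": [["ART", "NOUN"], ["POSS", "NOUN"]],
}

# Lexicon: each word maps to its part of speech.
LEXICON = {
    "frog": "NOUN", "dog": "NOUN",
    "a": "ART", "ate": "VERB", "my": "POSS",
}
```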

Parse tree
[Figure: parse tree for “My dog ate a frog”: S dominates NP (POSS my, NOUN dog) and VP (VERB ate, NP (ART a, NOUN frog)).]
The tree notation is difficult to compute with directly, so we can convert the representation into a more useful one:
(S (NP (POSS my) (NOUN dog)) (VP (VERB ate) (NP (ART a) (NOUN frog))))

From grammars to transition nets As grammars get more rules, they become difficult to understand and computationally more demanding. We can make our job easier by converting the grammar to a more convenient representation known as a finite-state machine (FSM) or transition network (TN).

Transition Network
[Figure: transition networks for the grammar. S network: S0 –NP→ S1 –VP→ S2. VP network: VP0 –VERB→ VP1 –NP→ VP2. NP networks: NP0 –ART→ NP1 –NOUN→ NP2 and NP3 –POSS→ NP4 –NOUN→ NP5; also shown with the individual lexical entries written on the arcs.]
When the lexicon gets really big, drawing them takes forever!
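To make the idea concrete, here is a minimal sketch of a recognizer that walks the NP sub-network above as a finite-state machine. The arc table and the small lexicon are illustrative assumptions based on the diagram, not code from the slides:

```python
# Finite-state recognizer for the NP network: each (state, category) pair
# names the destination state reached by that arc.
NP_TRANSITIONS = {
    ("NP0", "ART"):  "NP1",   # NP0 --ART-->  NP1
    ("NP0", "POSS"): "NP1",   # NP0 --POSS--> NP1
    ("NP1", "NOUN"): "NP2",   # NP1 --NOUN--> NP2 (final state)
}
NP_FINAL = {"NP2"}

LEXICON = {"my": "POSS", "a": "ART", "the": "ART",
           "dog": "NOUN", "frog": "NOUN"}

def accepts_np(words):
    """Return True if the word sequence is accepted by the NP network."""
    state = "NP0"
    for word in words:
        state = NP_TRANSITIONS.get((state, LEXICON.get(word)))
        if state is None:
            return False          # no arc matches this word's category
    return state in NP_FINAL

print(accepts_np(["my", "dog"]))   # True
print(accepts_np(["dog", "my"]))   # False
```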

Why do we want recursive rules in our grammar? Natural languages allow us to express an infinite range of ideas using a finite set of rules and symbols. The boy drowned. The boy with the raft drowned. The boy with the raft near the island drowned. The boy with the raft near the island in the ocean drowned. The boy....
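One way to see this is to add a single recursive rule pair to the earlier grammar; the PP and PREP categories below are standard assumptions used for illustration, not rules taken from the slides:

```python
# Adding one recursive rule pair licenses noun phrases of unbounded length
# ("the boy with the raft near the island ... drowned").
RECURSIVE_GRAMMAR = {
    "S":  [["NP", "VP"]],
    "VP": [["VERB"]],                               # e.g. "drowned"
    "NP": [["ART", "NOUN"], ["ART", "NOUN", "PP"]],
    "PP": [["PREP", "NP"]],                         # PP expands back into NP
}
# Derivation sketch:
#   NP -> ART NOUN PP -> ART NOUN PREP NP -> ART NOUN PREP ART NOUN PP -> ...
```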

From transition nets to recursive transition nets A recursive transition network (RTN) adds the ability to push a destination onto a stack, jump to a subnetwork, and return to the pushed destination when that subnetwork finishes.
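A small, illustrative interpreter for such a recursive transition network might look like the sketch below. The network tables mirror the earlier diagrams; the function names and the backtracking strategy are assumptions made for this example, not Woods’ original code:

```python
# Each network maps a state to its arcs; an arc label is either a word
# category (consume one word) or another network's name (push/jump/return).
NETWORKS = {
    "S":  {"S0": [("NP", "S1")], "S1": [("VP", "S2")], "S2": []},
    "VP": {"VP0": [("VERB", "VP1")], "VP1": [("NP", "VP2")], "VP2": []},
    "NP": {"NP0": [("ART", "NP1"), ("POSS", "NP1")],
           "NP1": [("NOUN", "NP2")], "NP2": []},
}
START = {"S": "S0", "VP": "VP0", "NP": "NP0"}
FINAL = {"S": {"S2"}, "VP": {"VP2"}, "NP": {"NP2"}}

LEXICON = {"my": "POSS", "dog": "NOUN", "ate": "VERB",
           "a": "ART", "frog": "NOUN"}

def walk(net, state, words):
    """Yield every prefix length of `words` that `net` can consume from `state`."""
    if state in FINAL[net]:
        yield 0                                   # "pop": the sub-network is done
    for label, dest in NETWORKS[net][state]:
        if label in NETWORKS:                     # push into a sub-network
            for used in walk(label, START[label], words):
                for more in walk(net, dest, words[used:]):
                    yield used + more             # return to the pushed destination
        elif words and LEXICON.get(words[0]) == label:
            for more in walk(net, dest, words[1:]):
                yield 1 + more                    # consume one word

def accepts(sentence):
    words = sentence.lower().rstrip(".").split()
    return any(used == len(words) for used in walk("S", START["S"], words))

print(accepts("My dog ate a frog."))   # True
```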

A sample recursive transition network (1)
[Figure: RTN diagram.] Example sentence: The boy broke the window with a hammer.

A sample recursive transition network (2)
[Figure: RTN diagram, continued.]

RTNs are not enough RTNs are interesting from a theoretical standpoint but are not of much use by themselves. From a computational perspective, a black box that accepts English input and just says “yes” or “no” doesn’t buy us much. What we need is a black box that records the structure of the input as well as providing an evaluation of syntactic correctness.

Augmented Transition Network (ATN) (1) We add procedures to the arcs of the RTN. These procedures are then performed when the corresponding arcs are traversed. The resulting network is called an “augmented transition network” (ATN).

Augmented Transition Network (ATN) (2) One of the things we can do with these added procedures or augmentations is to store information in registers when arcs are traversed. To record the structure of the input, we add an action to each arc which stores the word that was processed while traversing that arc in an appropriate register.
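One possible way to express such augmentations in code is to attach a small action to each arc; the closures, register names, and arc encoding below are an illustrative sketch, not Woods’ notation:

```python
# Each NP arc carries an action that stores the word just consumed
# in a named register when the arc is traversed.

def set_reg(name):
    """Build an arc action that records the current word in register `name`."""
    def action(registers, word):
        registers[name] = word
    return action

def append_reg(name):
    """Like set_reg, but collects repeated words (e.g. several adjectives)."""
    def action(registers, word):
        registers.setdefault(name, []).append(word)
    return action

# (word category, destination state, action run when the arc is traversed)
NP_ARCS = {
    "NP0": [("ART",  "NP1", set_reg("ART"))],
    "NP1": [("ADJ",  "NP1", append_reg("ADJS")),   # loop over any adjectives
            ("NOUN", "NP2", set_reg("NOUN"))],
}
```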

Register assignment (1) If we now process the noun phrase “the vicious dog” with the ATN, we’ll have made the following register assignments:
ART = the
ADJS = vicious
NOUN = dog

Register assignment (2) When we take the “pop” arc from this network, we can then accumulate the register contents into a larger structure that we call NP:
(NP (ART the) (ADJS vicious) (NOUN dog))
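Putting the pieces together, the sketch below traverses an NP network over “the vicious dog”, runs each arc’s register action, and packages the registers on the pop. It restates the arcs in a compact, self-contained form; the encoding is assumed for illustration only:

```python
LEXICON = {"the": "ART", "vicious": "ADJ", "dog": "NOUN"}

# arcs: state -> [(word category, next state, register to fill)]
ARCS = {"NP0": [("ART", "NP1", "ART")],
        "NP1": [("ADJ", "NP1", "ADJS"), ("NOUN", "NP2", "NOUN")]}

def parse_np(words):
    registers, state = {}, "NP0"
    for word in words:
        for category, dest, reg in ARCS[state]:
            if LEXICON.get(word) == category:
                registers.setdefault(reg, []).append(word)  # arc action
                state = dest
                break
        else:
            return None                     # no arc matched: not an NP
    if state != "NP2":
        return None
    # The "pop" arc accumulates the registers into the NP structure.
    return ["NP"] + [[reg] + registers[reg]
                     for reg in ("ART", "ADJS", "NOUN") if reg in registers]

print(parse_np(["the", "vicious", "dog"]))
# ['NP', ['ART', 'the'], ['ADJS', 'vicious'], ['NOUN', 'dog']]
```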

For “the vicious dog ate the wimpy frog”, the resulting structure would be:
(S (SUBJ (NP (ART the) (ADJS vicious) (NOUN dog))) (VERB ate) (OBJ (NP (ART the) (ADJS wimpy) (NOUN frog))))

Conclusion The great majority of natural language processing (NLP) systems in actual use today are based on the ATN formalism. They are often used as front ends or interfaces to database systems for question answering. One often-cited example of a system that followed this approach successfully is a program called LUNAR, written by William Woods. ATNs are often designed to perform syntactic and semantic analysis at the same time.