
1 Introduction

2 "a technique that enables the computer to encode complex grammatical knowledge such as humans use to assemble sentences, recognize errors and make corrections"

3 CALL
- Hardcoded instructions
- Pre-configured assessment items
- Pre-specified mapping between learner response and error category
ICALL
- Adaptive instructions
- Dynamic assessment item generation
- Automated mapping using NLP techniques

4 CALL
- Teacher-centric rather than learner-centric
- Explosion in possible learner responses
- An explicit mapping from learner responses to errors is not feasible
- Learner responses are highly constrained
- Not sufficient for self-learning
ICALL
- Abstracts away from the specific string entered by the learner to more general classes of properties (see the sketch after this list)
- Feedback generation, learner modeling and instructional sequencing can be based on a small number of abstract properties
- NLP systems, however, are not robust
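To make the contrast concrete, here is a minimal Python sketch; the function names, sample sentences and error categories are illustrative assumptions, not part of any existing system. A CALL-style design must enumerate every anticipated response string, while an ICALL-style design diagnoses from abstract properties of the response.

```python
# CALL-style lookup: every anticipated learner string must be listed in advance.
CALL_ERROR_MAP = {
    "He go to school": "subject-verb agreement",
    "He goes to the school yesterday": "tense / time adverb conflict",
    # one entry per anticipated response: infeasible for free-form input
}

def call_diagnose(response):
    return CALL_ERROR_MAP.get(response, "unknown response")

# ICALL-style check: operates on abstract properties of the response
# (a toy agreement check; a real system would derive these properties
# with a parser or tagger rather than receive them directly).
def icall_diagnose(subject_number, verb_number):
    if subject_number != verb_number:
        return "subject-verb agreement"
    return "no error detected"

print(call_diagnose("He go to school"))      # found only because it was listed
print(call_diagnose("She go to school"))     # unseen string: no diagnosis
print(icall_diagnose("singular", "plural"))  # generalizes to unseen sentences
```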

5 [Diagram: tutoring system architecture with components Student, NLP, Designer, Learner Modeling (LM), Instructional Sequencing and Feedback, connected by Response, Instruction and Feedback flows.]

6 In form-focused ICALL, the interaction workflow proceeds as follows (see the sketch after this list):
- In response to some prompt or question from the tutor, the student enters a sentence
- The sentence is forwarded to the parser for a check of its syntactic validity
- The sentence either passes the syntax check, or the parser fails because the learner response is ill-formed
- The error is classified into generic error classes
- The error handler generates appropriate feedback to be presented to the learner
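As a concrete illustration, here is a minimal Python sketch of this loop. The toy parse(), classify_error() and generate_feedback() functions are hypothetical placeholders for the parser, error classifier and feedback generator, not an existing implementation.

```python
def parse(sentence):
    """Toy 'parser': fails only on one hard-coded agreement pattern."""
    if sentence.startswith(("He", "She", "It")) and " go to" in sentence:
        return {"ok": False, "error_span": "go"}
    return {"ok": True}

def classify_error(parse_result):
    """Map the parser's failure onto a generic error class."""
    return "subject-verb agreement" if parse_result.get("error_span") == "go" else "unknown"

def generate_feedback(error_class):
    templates = {
        "subject-verb agreement": "Check the verb form: does it agree with the subject?",
        "unknown": "Something is wrong with this sentence. Try rephrasing it.",
    }
    return templates[error_class]

def tutor_turn(student_response):
    result = parse(student_response)   # 1. forward the response to the parser
    if result["ok"]:                   # 2. well-formed: accept it
        return "Looks good!"
    error = classify_error(result)     # 3. classify the failure
    return generate_feedback(error)    # 4. present feedback to the learner

print(tutor_turn("He go to school every day"))
print(tutor_turn("He goes to school every day"))
```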

7 Tutoring subsystem: moderation of parser output
- withholding information
- alerting the student that something is wrong
- highlighting the location of errors
- classifying the errors
- providing a correction or hint for each error
- showing the structural analysis of the sentence
- assigning a score to the learner response
- revisiting the instruction sequence dynamically
- updating the student model

8
- Vocabulary learning
- Diagnosis of learner errors
- Correction of learner errors
- Language learning exercise generation

9 Lexical Hypothesis: speaking is essentially lexicon-driven (Levelt, 1989)
- Grammatical form can only be activated once a lexical item has been chosen
- Lexical items need to have a rich internal structure
  - meaning, syntactic, morphological and phonological properties
- Learning types
  - intentional
  - incidental

10 "An intelligent word-based language learning assistant" – Nerbonne and Dokter, University of Groningen

11 The task
- Given a learner-provided response, mark the errors
- Need to parse the learner response
- Erroneous sentences are ill-formed
- Parsers expect the input sentence to be well-formed
- Parsers should therefore show tolerance to errors

12
- Overgenerate and rank
  - impose ranking constraints on grammatical rule violations
- Mal-rules to allow parsing with specific errors (see the sketch after this list)
- Parse fitting
  - generate fragmented parse trees and try to fit them together
- These approaches do not allow analysis of completely arbitrary ungrammatical input
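The following is a minimal sketch of the mal-rule idea using NLTK's context-free grammar machinery. The toy grammar, the MAL_VP_AGR nonterminal and the sample sentences are illustrative assumptions, not an existing resource; a real ICALL grammar would cover far more constructions.

```python
import nltk

grammar = nltk.CFG.fromstring("""
    S          -> NP_SG VP_SG | NP_SG MAL_VP_AGR
    NP_SG      -> 'he' | Det N
    VP_SG      -> V_SG NP_SG
    # Mal-rule: licenses a bare verb after a singular subject, but under a
    # dedicated nonterminal so the error is visible in the parse tree.
    MAL_VP_AGR -> V_BARE NP_SG
    Det        -> 'the' | 'a'
    N          -> 'ball' | 'dog'
    V_SG       -> 'chases'
    V_BARE     -> 'chase'
""")

parser = nltk.ChartParser(grammar)

def diagnose(sentence):
    tokens = sentence.lower().split()
    for tree in parser.parse(tokens):
        labels = {t.label() for t in tree.subtrees()}
        if "MAL_VP_AGR" in labels:
            return "parsed via mal-rule: subject-verb agreement error"
        return "parsed: no targeted error found"
    return "no parse: error outside the covered mal-rules"

print(diagnose("he chases the ball"))  # well-formed reading
print(diagnose("he chase the ball"))   # recovered only through the mal-rule
```

Note the trade-off stated above: the mal-rule recovers only the specific error it encodes; arbitrary ungrammatical input still fails to parse.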

13 Issues with English Language Learners (ELL)
- The concentration of errors is much higher than for native learners
- Using proofreading tools (e.g. MS Word)?
  - designed for native users
  - not very robust against foreign learner errors
  - the targeted errors are a small subset of learner errors

14
- Error correction in machine translation output
- Data-driven approach
- Classification approach (see the sketch after this list)
  - whether an article will be followed by a noun
  - whether an article appearing before a noun is correct
  - what the correct article would be
- Language-modelling approach
  - errors are most likely located in regions with a low LM score
- A hybrid system?
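Here is a minimal sketch of the classification approach for articles using scikit-learn. The context features, the tiny training set and the model choice are illustrative assumptions only; a real system would extract features from parsed corpora and train on far more data.

```python
from sklearn.feature_extraction import DictVectorizer
from sklearn.linear_model import LogisticRegression

# (features of the noun phrase context, correct article) -- toy training data
train = [
    ({"head": "information", "countable": False, "known": False}, "NONE"),
    ({"head": "advice",      "countable": False, "known": False}, "NONE"),
    ({"head": "book",        "countable": True,  "known": False}, "a"),
    ({"head": "idea",        "countable": True,  "known": False}, "a"),
    ({"head": "book",        "countable": True,  "known": True},  "the"),
    ({"head": "sun",         "countable": True,  "known": True},  "the"),
]
X_dicts, y = zip(*train)

vec = DictVectorizer()
X = vec.fit_transform(X_dicts)
clf = LogisticRegression(max_iter=1000).fit(X, y)

def check_article(learner_article, features):
    """Flag an error when the learner's article differs from the predicted one."""
    predicted = clf.predict(vec.transform([features]))[0]
    return "ok" if learner_article == predicted else f"suggest: {predicted!r}"

print(check_article("a", {"head": "information", "countable": False, "known": False}))
```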

15 Influence of L1
- No equivalent for a feature
  - Japanese and Russian learners face difficulty in learning articles
- Languages sharing features
  - German and French learners find it easy to learn the English article system
- Transfer problems
  - positive transfer
  - negative transfer

16
- Spelling errors
- Article usage
- Preposition usage
- Collocation errors

17 Negative transfer
- The correspondence between the prepositions of any two languages is many-to-many (घर पर → at home, सड़क पर → on the road)
- Prepositions impose semantic variation (in the summer vs. during the summer)
- Arguments of predicates
  - nominalization (removal of hazard vs. remove the hazard)
  - type of argument (book in the box vs. book on the table)
  - verb alternation (They loaded hay on the wagon vs. They loaded the wagon with hay)

18 Phrasal verbs (verb + particle)
- Non-compositional: give vs. give up
- Particles can move (put the switch off)
- Phrasal verbs are often used with prepositions (give in to their demands)
- Idioms (in the house vs. on the house)
  - "This is Kalyani in the house with all your favourite tunes": "in the house" → venue
  - "All the drinks were on the house": "on the house" → free

19 The indefinite article depends on the countability of the noun
- Countable vs. uncountable
  - The price of a Spring Fest hoody is Rs. 700.
  - The price of freedom is constant vigilance.
- Syntactic factors: some uncountable nouns can take an indefinite article when attached to a prepositional phrase (a knowledge of English); see the heuristic sketch after this list
- Discourse factors
- World knowledge
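A minimal heuristic sketch of the countability and syntactic factors: flag an indefinite article before an uncountable noun unless a following "of"-phrase licenses it (as in "a knowledge of English"). The noun list and the single-token look-ahead are illustrative assumptions; real systems also need the discourse and world-knowledge factors listed above.

```python
UNCOUNTABLE = {"knowledge", "information", "advice", "freedom"}  # toy list

def flag_indefinite_article(tokens):
    errors = []
    for i, tok in enumerate(tokens):
        if tok.lower() in {"a", "an"} and i + 1 < len(tokens):
            noun = tokens[i + 1].lower()
            licensed = i + 2 < len(tokens) and tokens[i + 2].lower() == "of"
            if noun in UNCOUNTABLE and not licensed:
                errors.append((i, f"indefinite article before uncountable '{noun}'"))
    return errors

print(flag_indefinite_article("She gave me a advice".split()))           # flagged
print(flag_indefinite_article("He has a knowledge of English".split()))  # licensed
```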

20
- Learner error corpora
- Grammatical error detection
- Grammatical error correction
- Evaluation of error detection/correction systems

