
1 Multi-Paradigm Programming in Oz for Natural Language Processing Torbjörn Lager, 2003 Acknowledgement: Some of the slides are due to van Roy, Haridi and Schulte

2 NLP in Oz 2 This Course  English or Swedish?  The course as a whole  Mostly self-studying  Exercises  Projects  Information  Course page  Mailing list  Why these lectures?  To get you started  To whet your appetite  To point out stumbling blocks  Difficulties  Different backgrounds -> Multiple entry points to Oz  Different levels of expertise  Highlights  Denys Duchier

3 Functional Programming in Oz

4 NLP in Oz 4 Defining a Function fun {Square X} X*X end % or equivalently Square = fun {$ X} X*X end % cf. the lambda expression λx.x*x: an anonymous function

5 NLP in Oz 5 Calling a Function X = {Square 3} % binds X to 9 X = {Square {Square 3}} % binds X to 81

6 NLP in Oz 6 Functional Programming  A program consists of functions (much like ordinary mathematical functions)  The main program itself is written as a function which receives the program's input as its arguments and delivers the program's output as its result  Typically the main function is defined in terms of other functions, which in turn are defined in terms of still more functions, until at the bottom level the functions are language primitives (built-in functions).

7 NLP in Oz 7 Example: Word Counter (WC) fun {CountWords S} {Count {String.tokens 32 S}} end % pipeline: Tokenizer -> Counter
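The slide leaves Count undefined. A minimal hedged sketch, assuming CountWords is meant to behave like the Unix wc word count, i.e., Count simply counts the tokens produced by String.tokens:

   fun {Count Ws} {Length Ws} end
   % e.g. {CountWords "the quick brown fox"} would then return 4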

8 NLP in Oz 8 Lists [a b c] % a list with three elements [a] % a list with one element nil % the empty list a|b|c|nil % a different way of writing the first list a|b|X % a list with an unbound tail (stream) % also, a pattern

9 NLP in Oz 9 Strings  Strings are lists whose elements correspond to character codes. For example: "OZ 2.0"  is the list [79 90 32 50 46 48]  or equivalently [&O &Z & &2 &. &0]  There is also a ByteString datatype - a more economical representation for textual data

10 NLP in Oz 10 The Unification Operator '='  The simplest cases are bindings to values, e.g., X=[a b c], and variable-variable bindings, e.g., X=Y.  Unification is symmetric. For example, [a b c]=X means the same as X=[a b c].  Any two partial values can be unified. For example, unifying the two lists [a X] and [Y b] binds X to b and Y to a.  If the values are already equal, then unification does nothing. For example, [a b]=[a b] does nothing.  If the partial values are incompatible then they cannot be unified. For example, an attempt to unify the two lists [a b] and [a c] will raise a failure exception.  Unification can create cyclic structures, i.e., structures that refer to themselves. For example, the unification X=a|b|X. This creates a cyclic list.
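A minimal sketch (variable names are ours, not from the slide) exercising the cases listed above in the interactive environment:

   declare X Y W in
   [a X] = [Y b]        % binds X to b and Y to a
   {Inspect X#Y}        % Inspector shows b#a
   [a b] = [a b]        % already equal: does nothing
   W = a|b|W            % creates a cyclic list
   try [a b] = [a c] catch _ then {Inspect incompatible} end  % the failure exception is caught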

11 NLP in Oz 11 Pattern Matching case X of Pattern1 then … [] Pattern2 then … else … end  Can be used as a statement or as an expression

12 NLP in Oz 12 If-Then-Else if Cond then … end if Cond then … else … end if Cond1 then … elseif Cond2 then … else … end  Can be used as a statement or as an expression
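A small hedged example (ours, not from the slides) of if used as an expression:

   fun {Max X Y} if X >= Y then X else Y end end
   {Inspect {Max 3 7}}   % shows 7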

13 NLP in Oz 13 Example: Functional Append fun {Append Xs Ys} case Xs of nil then Ys [] X|Xs1 then X|{Append Xs1 Ys} end end X = {Append [a b] [c d]} % binds X to [a b c d]

14 NLP in Oz 14 Example: Naive Reverse fun {Reverse L} case L of nil then nil [] X|Xs then {Append {Reverse Xs} [X]} end end X = {Reverse [a b c]} % binds X to [c b a]

15 NLP in Oz 15 Example: A Faster Reverse local fun {RevAux L Acc} case L of nil then Acc [] X|Xs then {RevAux Xs X|Acc} end end in fun {Reverse L} {RevAux L nil} end end X = {Reverse [a b c]} % binds X to [c b a]

16 NLP in Oz 16 Example: Member fun {Member X Xs} case Xs of nil then false [] Head|Tail then Head == X orelse {Member X Tail} end end X = {Member c [a b c d]} % binds X to true

17 NLP in Oz 17 Higher-Order Programming  A programming language supports higher-order programming if:  Functions can be passed as arguments to other functions  The result of applying a function to its arguments can be another function  Functions can be put in data structures
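A hedged sketch (ours) illustrating all three capabilities; MakeAdder, Add2 and Pair are names introduced here for illustration:

   declare
   fun {MakeAdder N} fun {$ X} X+N end end   % applying MakeAdder returns another function
   Add2 = {MakeAdder 2}
   {Inspect {Add2 5}}                        % shows 7
   Pair = Add2#{MakeAdder 10}                % functions stored in a data structure
   {Inspect {Map [1 2 3] Add2}}              % a function passed as an argument; shows [3 4 5]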

18 NLP in Oz 18 Examples: SquareList and DoubleList fun {SquareList Xs} case Xs of nil then nil [] X|Xs1 then X*X |{SquareList Xs1} end end fun {DoubleList Xs} case Xs of nil then nil [] X|Xs1 then 2*X |{DoubleList Xs1} end end

19 NLP in Oz 19 Example: Map fun {Map Xs F} case Xs of nil then nil [] X|Xs1 then {F X}|{Map Xs1 F} end end X = {Map [1 2 3] Square} % binds X to [1 4 9] X = {Map [1 2 3] fun {$ X} X*X end} % same X = {Map [1 2 3] fun {$ X} 2*X end} % binds X to [2 4 6]

20 NLP in Oz 20 Examples: Sum and Product fun {Sum Xs} case Xs of nil then 0 [] X|Xs1 then X + {Sum Xs1} end end fun {Product Xs} case Xs of nil then 1 [] X|Xs1 then X * {Product Xs1} end end

21 NLP in Oz 21 FoldR fun {FoldR Xs F Y} case Xs of nil then Y [] X|Xs1 then {F X {FoldR Xs1 F Y}} end end X = {FoldR [1 2 3] Number.'+' 0} % binds X to 6 X = {FoldR [1 2 3] Number.'*' 1} % binds X to 6

22 NLP in Oz 22 Some Operations in the List Module X = {Append [a b c] [d e]} % X = [a b c d e] X = {Reverse [a b c]} % X = [c b a] X = {Member b [a b c]} % X = true X = {Last [a b c]} % X = c X = {Length [a b c]} % X = 3 X = {List.subtract [a b c] b} % X = [a c] X = {List.flatten [a [a b] c]} % X = [a a b c] X = {Sort [c d b d a] Value.'<'} % X = [a b c d d] X = {Map [12 13 1] IntToFloat} % X = [12.0 13.0 1.0] X = {Filter [1 a 2 x] IsInt} % X = [1 2]....

23 NLP in Oz 23 Records  Record: r1(numb:sg pers:1) (r1 is the label; numb and pers are features; sg and 1 are values)  Unifying records: A = r1(numb:_ pers:1) B = r1(pers:_ numb:sg) {Inspect A=B} % Inspector shows r1(numb:sg pers:1) {Inspect A.numb} % Inspector shows 'sg'

24 NLP in Oz 24 Example: A Simple Lexicon local Lexicon = lexicon('Sandy':[pn] 'Kim':[pn] loves:[vb] sees:[vb] ) in fun {LexLookup Word} Word#(Lexicon.Word) end end X = {Map ['Sandy' loves 'Kim'] LexLookup} % X = ['Sandy'#[pn] loves#[vb] 'Kim'#[pn]]

25 NLP in Oz 25 Open Records and Record Constraints  Open records A = r1(numb:_ pers:1...) B = r1(pers:_ 'case':nom numb:sg...) {Inspect A=B} % r1('case':nom numb:sg pers:1...)  Record constraints R^cat = n R^agr^numb = sg R^agr^pers = 1 {Inspect R} % _(agr:_(numb:sg pers:1...) cat:n...)

26 NLP in Oz 26 Tuples T1 = t(a b c) T1 = t(1:a 2:b 3:c) % succeeds T1.3 = c % succeeds T2 = a#b#c % equivalent to '#'(a b c)

27 NLP in Oz 27 Example: Compositional Semantics John = j Smiles = fun {$ X} smiles(X) end `John Smiles` = {Smiles John} John = fun {$ P} {P j} end Smiles = fun {$ X} smiles(X) end `John Smiles` = {John Smiles} `John Smiles` = {fun {$ P} {P j} end fun {$ X} smiles(X) end} % In all of these cases, `John Smiles` gets % bound to smiles(j)

28 A Bird’s-Eye View of All This

29 NLP in Oz 29 The Kernel Language Approach  A kernel language  Abstractions based on the kernel language and/or other abstractions. Examples: functions, loops, classes, ports, etc.  Linguistic abstraction/Syntactic sugar. Examples: function expressions, loop expressions, etc.  “Contrary to many (possibly most) languages, we always designed Oz starting from the semantics. The kernel language is carefully given formal semantics. For the more high-level aspects, we always asked the question: can we explain this novel thing of the language in terms of the base language. That's how we managed to erect the full system: by building up on sound semantic foundations; level by level.”

30 NLP in Oz 30 The Kernel Language Approach  Kernel languages have a small number of programmer-significant elements  Their purpose is to understand programming from the programmer’s viewpoint  They are given a semantics which allows the practicing programmer to reason about correctness and complexity at a high level of abstraction  [Diagram:] Full language translated to: Kernel language (for programmers), Foundational calculus (for mathematicians), Virtual machine (for implementors)

31 NLP in Oz 31 The Kernel Language Approach  Full language: fun {Sq X} X*X end Z = {Sq {Sq X}}  Kernel language: proc {Sq X Y} {Number.'*' X X Y} end local X Y Z in {Sq X Y} {Sq Y Z} end

32 NLP in Oz 32 Functional Append -> Procedural Append fun {Append Xs Ys} case Xs of nil then Ys [] X|Xs1 then X|{Append Xs1 Ys} end end proc {Append Xs Ys Zs} case Xs of nil then Zs = Ys [] X|Xs1 then Ys1 in Zs = X|Ys1 /* or: X|Ys1 = Zs */ {Append Xs1 Ys Ys1} end end

33 Kernel Language (almost complete)
<s> ::=
   skip                                             Empty statement
   <x>1 = <x>2                                      Variable-variable binding
   <x> = <v>                                        Variable-value binding
   <s>1 <s>2                                        Sequential composition
   local <x> in <s> end                             Variable creation
   if <x> then <s>1 else <s>2 end                   Conditional
   case <x> of <pattern> then <s>1 else <s>2 end    Pattern matching
   {<x> <x>1 … <x>n}                                Procedure invocation
   thread <s> end                                   Thread creation
   {ByNeed <x>1 <x>2}                               Trigger creation
   {NewName <x>}                                    Name creation
   try <s>1 catch <x> then <s>2 end                 Exception context
   raise <x> end                                    Raise exception
   choice <s>1 [] ... [] <s>2 end                   Choice
   fail                                             Failure
   {NewCell <x>1 <x>2}                              Cell creation
   {Exchange <x>1 <x>2 <x>3}                        Cell exchange
   ...                                              Encapsulated search

34 NLP in Oz 34 Primary Data Types

35 Computation Models  [Diagram of computation models:] Declarative model (strict functional programming, e.g., Scheme; deterministic logic programming)  + concurrency + by-need synchronization: declarative concurrency (lazy functional programming, e.g., Haskell)  + nondeterministic choice: concurrent logic programming  + exception handling + encapsulated state: object-oriented programming  + search: nondeterministic LP, e.g., Prolog  concurrent OOP (active object style, e.g., Erlang; shared state style, e.g., Java)  + computation spaces: constraint programming  We show some of the relationships between the different models  Each model has its own kernel language, its own reasoning techniques, and its own programming techniques  The kernel languages are closely related, e.g., the declarative model is a subset of all of them

36 Computation Models and NLP  [The same diagram of computation models, with example NLP applications placed on it:]  Viterbi statistical tagger  Chart parser  Word frequency lister  Compositional logical semantics interpreter  Finite-state tools

37 NLP in Oz 37 Example: The Viterbi Algorithm  Pure functional programming. Is this the optimal paradigm for this application?

38 NLP in Oz 38 The Mozart Programming Environment  Based on the Emacs editor  + GUI tools  Inspector/Browser (visualize values)  Debugger  Profiler  Panel (resource usage)  Compiler Panel (compiler settings and environment)  Distribution Panel (distribution)  Explorer (interactive resolution of constraint problems)  Documentation  van Roy & Haridi: Appendix A  Manual

39 Concurrency Oriented Programming

40 NLP in Oz 40 The World is Concurrent!  Concurrent programs  Several activities execute simultaneously  A lot of software in use is concurrent  Operating systems, user interfaces, web servers, etc.  Is there a need for concurrency in NLP?

41 NLP in Oz 41 Concurrency Can Be Difficult, but Does Not Have to Be…  Sequential: x := 0 x := x + 1 print x  What happens here? X0 = 0 thread X1 = X0 + 1 end thread X2 = X1 + 1 end {Inspect X2}  What happens here? x := 0 thread x := x + 1 end print x  What happens here? declare Xs S in thread Xs={List.number 0 100 1} end thread S={Map Xs fun {$ X} X*X end} end {Inspect S}

42 NLP in Oz 42 Threads  Thread creation: thread … end  Can be used as a statement or as an expression  Threads in Oz are dataflow threads, i.e., they suspend on availability of data  Excellent for synchronization!  Threads in Oz are lightweight
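A tiny hedged example (ours) of dataflow suspension:

   declare X in
   thread {Inspect X+1} end   % this thread suspends until X is bound
   X = 41                     % now the thread resumes and the Inspector shows 42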

43 NLP in Oz 43 Concurrent Map declare F Xs Ys Zs fun {CMap Xs F} case Xs of nil then nil [] X|Xr then thread {F X} end|{CMap Xr F} end end {Inspect thread {CMap Xs F} end} % Inspector displays _ Xs = 1|2|Ys % Inspector displays _|_|_ fun {F X} X*X end % Inspector displays 1|4|_ Ys = 3|Zs % Inspector displays 1|4|9|_ Zs = nil % Inspector displays [1 4 9]

44 NLP in Oz 44 Fibonacci and the Panel  A demo to show how lightweight Oz threads are

45 NLP in Oz 45 Streams  A stream is a potentially unbounded list of messages, i.e., it is a list whose tail is an unbound dataflow variable. declare Xs Xs2 Xs3 {Inspect Xs} Xs=0|1|2|3|4|Xs2 % Inspector shows 0|1|2|3|4|_ Xs2=5|6|7|Xs3 % Inspector shows 0|1|2|3|4|5|6|7|_  Important container!

46 NLP in Oz 46 Declarative Concurrency  Producer-consumer with dataflow fun {Prod N Max} if N<Max then N|{Prod N+1 Max} else nil end end fun {Cons Xs A} case Xs of X|Xr then {Cons Xr A+X} [] nil then A end end local Xs S in thread Xs={Prod 0 1000} end thread S={Cons Xs 0} end end  Prod and Cons threads share list Xs  Dataflow behavior of case statement (synchronizing on data availability) gives stream communication  No other concurrency control needed

47 NLP in Oz 47 Declarative Concurrency  Let us compare the sequential and concurrent versions  The result of the calculation is the same in both cases  So what is different?  Sequential version:  Results are produced in batch: the whole calculation is done and then all results are given at once  Concurrent version:  Results are produced incrementally, element by element local Xs S in thread Xs={Prod 0 1000} end thread S={Cons Xs 0} end end local Xs S in Xs={Prod 0 1000} S={Cons Xs 0} end

48 NLP in Oz 48 Transformation-Based Tagging: Small Part-of-Speech Tagging Example  [Diagram] rules: pos:NN>VB NN <- pos:DT@[-1] o ...  input: She decided to table her data  lexicon: data:NN decided:VB her:PN she:PN table:NN to:TO  tag sequence, transformed step by step: NP VB TO NN PN NN → NP VB TO NN PN NN → NP VB TO VB PN NN

49 NLP in Oz 49 Example: Incremental Brill Tagging declare Wd T1 T2 T3 Lex = lex(the:dt light:vb cloud:nn 'is':vb) {Inspect T3} proc {Tag N} T1^N = Lex.(Wd^N) % lexical lookup if T1^N == vb andthen T1^(N-1) == dt then % vb>nn <- dt@[-1] T2^N = nn else T2^N = T1^N end if T2^N == nn andthen T2^(N+1) == nn then % nn>jj <- nn@[+1] T3^N = jj else T3^N = T2^N end end /* Wd^1 = the thread {Tag 1} end % _(1:dt...) Wd^2 = light thread {Tag 2} end % _(1:dt...) Wd^3 = cloud thread {Tag 3} end % _(1:dt 2:jj...) Wd^4 = is thread {Tag 4} end % _(1:dt 2:jj 3:nn 4:vb...) Wd^5 = the thread {Tag 5} end % _(1:dt 2:jj 3:nn 4:vb 5:dt) */

50 Programming with Explicit State

51 NLP in Oz 51 Explicit State  How can we let a procedure learn from its past? That is, we would like the procedure to have some kind of internal memory, which helps it do its job. Memory is needed for procedures that can change their behavior and learn from their past. This kind of memory is called explicit state.  An explicit state in a procedure is a state whose lifetime extends over more than one procedure call without being present in the procedure’s arguments.

52 NLP in Oz 52 Cells  Create a new cell C with initial content X: {NewCell X C}  Atomically bind X to the old content of cell C and set Y to be the new content: {Exchange C X Y}  Bind X to the current content of cell C: X = {Access C} % Book variant: X = @C  Set X to be the new content of cell C: {Assign C X} % Book variant: C := X  Warning: For pedagogical reasons, the book uses a very slightly different syntax than the publicly-released Mozart system. An upcoming Mozart release will remove this difference and be fully syntax compatible with the book.
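A minimal hedged sketch (ours) using the Mozart syntax above:

   declare C X Old New in
   {NewCell 0 C}          % C is a new cell containing 0
   X = {Access C}         % X = 0
   {Assign C X+1}         % the cell now contains 1
   {Exchange C Old New}   % Old is bound to 1; New becomes the new content
   New = Old + 10         % the cell now contains 11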

53 NLP in Oz 53 More Explicit State: Arrays  Returns a new array with indices from I to J, inclusive, all initialized to X: A={Array.new I J X}  Puts in A the mapping of I to X: {Array.put A I X} % Or: A.I := X  Returns from A the mapping of I: X={Array.get A I} % Or: A.I  Note: Arrays can be implemented with tuples and cells (they aren’t, but they can)
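A hedged sketch (ours) of the final note: arrays emulated with a tuple of cells. MyNewArray, MyPut and MyGet are illustrative names, not part of Mozart:

   declare
   fun {MyNewArray Low High Init}
      T = {MakeTuple cells High-Low+1}
   in
      {Record.forAll T proc {$ C} C = {NewCell Init} end}   % one cell per index
      myarray(low:Low cells:T)
   end
   proc {MyPut A I X} {Assign A.cells.(I-A.low+1) X} end
   fun {MyGet A I} {Access A.cells.(I-A.low+1)} end
   % e.g. A = {MyNewArray 1 5 0} {MyPut A 3 42} {Inspect {MyGet A 3}} % shows 42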

54 NLP in Oz 54 Even more Explicit State: Dictionaries  Returns a new empty dictionary: D={Dictionary.new}  Sets the item in dictionary D under key K to X : {Dictionary.put D K X} % Or: D.K := X  Returns the item X of dictionary D under key K : X={Dictionary.get D K} % Or: X=D.K  plus a lot more, of course…
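A hedged NLP-flavoured sketch (ours, not from the slides): a word frequency counter over a list of word atoms, using a dictionary:

   declare
   fun {WordFreq Words}
      D = {Dictionary.new}
   in
      for W in Words do
         {Dictionary.put D W {Dictionary.condGet D W 0}+1}   % increment count, default 0
      end
      {Dictionary.toRecord freq D}
   end
   {Inspect {WordFreq [the cat saw the dog]}}   % shows freq(cat:1 dog:1 saw:1 the:2)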

55 NLP in Oz 55 IO F = {New Open.file init(name:'test.txt')} S = {F read(list:$ size:all)} {F close} F = {New Open.file init(url:'http://www.ling.gu.se/~lager')} S = {F read(list:$ size:all)} {F close} F = {New Open.file init(name:'test.txt' flags:[write create truncate])} {F write(vs:'This comes in the file\n')} {F write(vs:'The result of 43*43 is '#43*43#'\n')} {F write(vs:"Strings are ok too\n")} {F close}

56 NLP in Oz 56 A Parenthesis: The Significance of $  Converts statements to expressions  In a procedure call {P X Y Z} Y={P X $ Z}  Compare (again) fun {Sq X} X*X end Sq = fun {$ X} X*X end

57 NLP in Oz 57 Lazy Functional Programming fun lazy {Gen N} N|{Gen N+1} end Xs = {Gen 0} {Inspect Xs} % shows _ {Inspect {Member 4 Xs}} % shows 'true' {Inspect Xs} % shows 0|1|2|3|4|_

58 NLP in Oz 58 Lazy Reading from Files fun {ReadListLazy S} F={New Open.file init(name:S)} fun lazy {ReadNext} L T I in {F read(list:L tail:T size:1024 len:I)} if I==0 then T=nil {F close} else T={ReadNext} end L end in {Finalize.register F proc {$ F} {F close} end} {ReadNext} end
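A hedged usage sketch (ours): because the returned list is lazy, only as much of the file as is actually demanded gets read:

   declare S = {ReadListLazy 'test.txt'}
   {Inspect {List.take S 20}}   % forces reading of only the first part of the file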

59 NLP in Oz 59 Loops for X in Xs do {Inspect X} end for X in Xs Y in Ys do {Inspect X#Y} end for X in 1..200 do {Inspect X} end for X in Xs I in 1;I+1 do {Inspect I#X} end for X in Xs break:B do … if … then {B} … end Ys = for X in Xs collect:C do {C X*X} end

60 NLP in Oz 60 Example: Character Counting A = {Array.new 10 255 0} S = { ReadListLazy 'test.txt'} for C in S do {Array.put A C {Array.get A C}+1} % or: A.C := A.C + 1 end {Inspect {Array.toRecord chars A}} % chars(10:256 11:0 12:0 13:0 14:0 15:0 16:0 17:0 18:0 ...) {Inspect A.&a} % 782

61 Object Oriented Programming

62 NLP in Oz 62 Objects  A procedure with memory  An object is a convenient way of encapsulating explicit state  Only methods can access the state  Important invariants can be secured  Objects are also very useful for modelling objects of the real world

63 NLP in Oz 63 Model for Objects  Methods are procedures  Have access to state  Restrict access to state  State of an object  Record of cells

64 NLP in Oz 64 Classes class Counter from BaseObject attr val meth init(Value) val <- Value end meth inc(Value) val <- @val + Value end meth browse {Inspect @val} end end  Classes describe how objects are constructed  Initial state  Methods  Classes can be constructed from other classes by inheritance
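A hedged usage sketch (ours) of the Counter class above:

   declare C = {New Counter init(0)}
   {C inc(5)}
   {C inc(2)}
   {C browse}   % shows 7 in the Inspector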

65 NLP in Oz 65 Example: A Counter Class declare Counter local Attrs = [val] MethodTable = m(browse:MyInspect init:Init inc:Inc) proc {Init M S Self} init(Value)=M in (S.val):=Value end proc {Inc M S Self} X inc(Value)=M in X=@(S.val) (S.val):=X+Value end proc {MyInspect M S Self} browse=M {Inspect @(S.val)} end in Counter = {Wrap c(methods:MethodTable attrs:Attrs)} end

66 NLP in Oz 66 Example: Defining New fun {NewObject WClass InitialMethod} State Obj Class={Unwrap WClass} in State = {MakeRecord s Class.attrs} {Record.forAll State proc {$ A} {NewCell _ A} end} proc {Obj M} {Class.methods.{Label M} M State Obj} end {Obj InitialMethod} Obj end

67 NLP in Oz 67 More about OOP in Oz  Multiple inheritance  Mixing with other paradigms  Privacy  Self  First class messages  Delegation  Parametric classes

68 NLP in Oz 68 Parametric Classes fun {Incrementor Incr} class $ attr val meth init val <- 0 end meth incr val <- @val + Incr end meth browse {Inspect @val} end end end O = {New {Incrementor 2} init} {O incr} {O incr} {O browse} % shows 4 in the Inspector

69 NLP in Oz 69 Default Args class Counter attr val meth init(Val<=0) val <- Val end meth inc(Value<=1) val <- @val + Value end meth browse {Inspect @val} end end O = {New Counter init} {O inc} {O inc(3)} {O browse} % shows 4 in the Inspector

70 NLP in Oz 70 Ports  A port has two operations: creating a channel and sending to it:  Create a new port with entry point P and stream S: {NewPort S P}  Append X to the stream corresponding to the entry point P: {Send P X}  Example: declare S P in {NewPort S P} {Inspect S} {Send P a} % Inspector displays the stream a|_ {Send P b} % Inspector displays the stream a|b|_

71 NLP in Oz 71 Ports  What will happen here? declare S P in {NewPort S P} {Inspect S} thread {Send P a} end thread {Send P b} end  This displays the stream a|b|_ or the stream b|a|_  I.e., there is an observable non-determinism

72 NLP in Oz 72 Ports from Cells fun {NewPort Stream} {NewCell Stream} end proc {Send P M} Old New in {Exchange P Old New} Old = M|New end

73 NLP in Oz 73 Active objects  An active object is a concurrent entity to which any other active object can send messages. One active object corresponds to one thread.  The active object reads the messages in arrival order and sequentially executes an action for each message  An active object’s behavior is defined by a class, just like a passive object  Active objects can be considered either as primitive or as defined with a thread, a passive object, and a communication channel  Creation: A={NewActive Class Init}  Similar to Erlang (except that Erlang isn’t OO). Tip: Listen to Joe Armstrong, « Concurrency Oriented Programming in Erlang ». See course page for link!

74 NLP in Oz 74 Defining Active Objects  Define NewActive in terms of existing New (passive object creation) by adding one port and one thread fun {NewActive Class Init} S P Obj in {NewPort S P} Obj={New Class Init} thread for M in S do {Obj M} end end proc {$ M} {Send P M} end end  Notes: the for loop does dataflow synchronization (like the case statement); sending to a port causes the message to appear on the stream; port P is created together with its stream S (a dataflow list)
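A hedged usage sketch (ours), combining NewActive with the Counter class from the OOP section:

   declare A = {NewActive Counter init(0)}
   {A inc(1)}     % asynchronous send
   {A inc(2)}
   {A browse}     % eventually shows 3 in the Inspector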

75 Relational Programming

76 NLP in Oz 76 Relational Programming  There can be any number of results to a call, either zero (no results), one, or more.  Which arguments are inputs and which are outputs can be different for each call.  Two new statements called “choice” and “fail”.  encapsulated search (search engines, also useful for Constraint programming)

77 NLP in Oz 77 Search Engines  Higher order functions returning…  One solution  All solutions  All solutions on demand (lazily)  All solutions within a certain depth in the search tree  All solutions within a certain time limit  Etc.  Search can be parallel (e.g. using more than one machine in a network)  Build your own search engine!

78 NLP in Oz 78 Example: Knowledge Representation and Inference Every man who whistles is happy John is a man John whistles Therefore: John is happy  ∀x[(man(x) & whistles(x)) → happy(x)] man(John) whistles(John) Therefore: happy(John)

79 NLP in Oz 79 Example: Knowledge Representation and Inference in Oz proc {Happy X} {Man X} {Whistles X} end proc {Man X} choice X = paul [] X = john end end proc {Whistles X} choice X = mary [] X = john end end {Inspect {Search.base.one Happy}} % [john] {Explore.one Happy} % displays the search tree in the Explorer

80 NLP in Oz 80 Example: A Lexicon proc {Lookup Word Pos} choice Word = saw Pos = nn [] Word = saw Pos = vb [] Word = run Pos = vb end end {Search.base.all proc {$ X} {Lookup saw X} end Y} % binds Y to [nn vb] {Search.base.all proc {$ X} {Lookup X vb} end Z} % binds Z to [saw run]

81 NLP in Oz 81 Using the Explorer % Sentence → Name Verb proc {Sentence S0 S} S1 in {Name S0 S1} {Verb S1 S} end % Name → john|mary proc {Name S0 S} choice S0 = john|S [] S0 = mary|S end end % Verb → talks|walks proc {Verb S0 S} choice S0 = talks|S [] S0 = walks|S end end {Explorer.all proc {$ S} {Sentence [john walks] nil} end}

82 NLP in Oz 82 A Simple PATR Implementation  Search  Open Records  Record Constraints

83 NLP in Oz 83 Compositional Logical Semantics  Search  Higher Order Functions

84 NLP in Oz 84 Relational Append proc {Append Xs Ys Zs} choice Xs = nil Ys = Zs [] X|Xr=Xs X|Zr=Zs in {Append Xr Ys Zr} end end {Inspect {Search.base.all proc {$ A} Xs#Ys = A in {Append Xs Ys [a b c]} end}} % [nil#[a b c] [a]#[b c] [a b]#[c] [a b c]#nil]  Compare Prolog: append([], L2, L2). append([X|M1], L2, [X|M3]) :- append(M1, L2, M3).

85 NLP in Oz 85 Relational Member (Choose) proc {Choose ?X Ys} choice Ys=X|_ [] Yr in Ys=_|Yr {Choose X Yr} end end  Compare Prolog (where it is usually called member/2): choose(X,[X|_]). choose(X,[_|Xs]) :- choose(X,Xs).
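A hedged usage sketch (ours), running Choose under one of the search engines shown earlier:

   {Inspect {Search.base.all proc {$ X} {Choose X [a b c]} end}}   % shows [a b c]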

86 NLP in Oz 86 A Prolog-Like DB class RelationClass attr d meth init d := {NewDictionary} end meth assert(I) if {IsDet I.1} then Is={Dictionary.condGet @d I.1 nil} in {Dictionary.put @d I.1 {Append Is [I]}} else raise databaseError(nonground(I)) end end end meth query(I) if {IsDet I} andthen {IsDet I.1} then {Choose I {Dictionary.condGet @d I.1 nil}} else {Choose I {Flatten {Dictionary.items @d}}} end end end

87 NLP in Oz 87 Abstractions  What we’ve seen already:  Functions on top of procedures  Objects and Classes (on top of procedures and state)  Ports (on top of streams and state)  Active Objects (on top of ports and objects)  Prolog-style database (on top of search and state)  What more is there?:  Loops on top of higher order functions  Linda-style blackboard  Forward-chaining engine  Mobile Agents  Etc., etc.

88 NLP in Oz 88 Distribution  On one machine: X=the_novel(text:"It was a dark and stormy night...." author:"E.G.E. Bulwer-Lytton" year:1803) {Show {Connection.offerUnlimited X}}  On another machine: X2={Connection.take '...ticket comes here...'}

89 NLP in Oz 89 How to Pickle a Value  The Pickle module provides procedures to store and retrieve stateless values on persistent storage (also called “serialization”), like so: R = unit(a:1 b:2 c:3) {Pickle.save R "mypickle.ozp"} … R1 = {Pickle.load "mypickle.ozp"}

90 NLP in Oz 90 Constraint Programming  SEND+MORE=MONEY proc {Money Sol} sol(s:S e:E n:N d:D m:M o:O r:R y:Y) = Sol in Sol ::: 0#9 {FD.distinct Sol} S \=: 0 M \=: 0 1000*S + 100*E + 10*N + D + 1000*M + 100*O + 10*R + E =: 10000*M + 1000*O + 100*N + 10*E + Y {FD.distribute ff Sol} end {Inspect {SearchAll Money}} % [sol(d:7 e:5 m:1 n:6 o:0 r:8 s:9 y:2)] {Explorer.all Money}

91 NLP in Oz 91 Pragmatics  Building Applications/CLIs  Building CGI scripts  Using Mogul  Building GUIs  Using the C++ Interface  Ozlets  Using ozmake  Using GUMP

92 NLP in Oz 92 Building Applications  Create functor(s)  Compile with ozc -c myapp.oz  Or ozc -x myapp.oz if main application…

93 NLP in Oz 93 Building GUIs  Tcl/Tk interface  QTk – built on top of the Tcl/Tk interface  Gtk+ (from version 1.2.5)

