
1 Introduction to Neural Networks and Example Applications in HCI
Nick Gentile

2 Overview
Introduction to neural networks
Example applications in HCI
–NETtalk
–Cognitive text-editing
–Filtering messages
–Learning a user's interests by observing their behavior
Discussion

3 Neural Networks
The brain
–Cerebral cortex
  Memory, perceptual awareness, thinking, language and consciousness
  Made up of billions of neurons
Motivation
–"Understanding human behavior and brain construction…"
Computers have the power
–Using the power of computers we can better understand the underlying processes of the brain

4 Neural Networks
Three key elements
–Network architecture
  Typically described in terms of layers
  Connectionist model (connection strengths)
  –"Each connection has associated with it a numerical weight. Each neuron's output is a single numerical activity which is computed as a monotonic function of the sum of the products of the activity of the input neurons with their corresponding connection weights." (see the sketch below)
–Learning algorithm
  How the weights are set
–Data representation
  Inputs and outputs
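
The quoted connectionist description translates directly into a few lines of code. A minimal sketch, assuming the logistic sigmoid as the monotonic activation function (the slide does not commit to a particular one):

```python
import math

def neuron_output(inputs, weights, bias=0.0):
    """Activity of one neuron: a monotonic function (here the logistic
    sigmoid) of the weighted sum of its inputs."""
    weighted_sum = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-weighted_sum))

# Example: three input neurons feeding one output neuron.
activities = [0.2, 0.7, 0.1]           # activities of the input neurons
connection_weights = [0.5, -1.3, 2.0]  # one numerical weight per connection
print(neuron_output(activities, connection_weights))
```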

5 Neural Networks
Some useful terms
–Feedforward network - "A layered network in which each layer only receives inputs from previous layers."
–Target vector - "The desired output vector for a given input vector."
–Pattern recognition - "The task performed by a network trained to respond when an input vector close to a learned vector is presented. The network 'recognizes' the input as one of the original target vectors."
–Error vector - "The difference between a network's output vector in response to an input vector and an associated target output vector."

6 NETtalk
Training
–Sequences and pronunciations
  Gives a good representation, but not good enough
  Further training handles exceptions and special cases
How it works
–Seven-letter window
  Looks at the middle letter and uses the rest as context to determine the output (phonetic symbols)
  The window moves through the whole document, generating a sequence of phonetic symbols (sketched below)
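
A rough sketch of how such a seven-letter window could slide over text; the padding character and tokenisation are assumptions, and the real NETtalk input encoding is more elaborate:

```python
def seven_letter_windows(text, pad="_"):
    """Slide a seven-letter window over the text: the middle (4th) letter is
    the one to be pronounced, the three letters on either side are its context."""
    padded = pad * 3 + text + pad * 3
    for i in range(len(text)):
        window = padded[i:i + 7]
        yield window, window[3]   # (context window, middle letter)

for window, middle in seven_letter_windows("network"):
    print(window, "->", middle)   # the real network outputs a phonetic symbol here
```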

7 NETtalk
Results
–Network learned to abstract exceptions and special cases to produce more accurate speech
–Eventually was able to produce accurate speech from words it had never seen before

8 NETtalk
Architecture
–Three-layered feed-forward network
Representation
–Inputs - sequence of letters
–Outputs - sounds
Learning
–Backpropagation (toy sketch below)
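
The same three ingredients (architecture, representation, learning) can be seen in miniature. Below is a toy sketch of a three-layer feed-forward network trained with backpropagation using NumPy; the XOR data stands in for NETtalk's letter-window inputs and phoneme targets, which are far larger in practice:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy data standing in for letter-window inputs and sound targets.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)   # target vectors

# Input -> hidden -> output weights and biases, randomly initialised.
W1, b1 = rng.normal(scale=0.5, size=(2, 4)), np.zeros(4)
W2, b2 = rng.normal(scale=0.5, size=(4, 1)), np.zeros(1)
lr = 0.5

for _ in range(10000):
    # Forward pass through the feed-forward network.
    H = sigmoid(X @ W1 + b1)             # hidden-layer activities
    Y = sigmoid(H @ W2 + b2)             # output-layer activities

    # Backpropagation: push the error vector back through the layers.
    err = T - Y                          # error vector
    d_out = err * Y * (1 - Y)
    d_hid = (d_out @ W2.T) * H * (1 - H)
    W2 += lr * H.T @ d_out;  b2 += lr * d_out.sum(axis=0)
    W1 += lr * X.T @ d_hid;  b1 += lr * d_hid.sum(axis=0)

print(np.round(Y, 2))                    # should approach the targets [0, 1, 1, 0]
```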

9 Cognitive Text-editing
Goal
–Predict text-editing strategies
How
–By taking into account keystrokes and the pauses between them
–"Robertson and Black have shown that the pauses which occur during text editing are indicators of the formulation of planning strategies."

10 Cognitive Text-editing
Training
–Subjects asked to write a memo using the vi editor
–Inputs - 36 vi commands and three types of pauses (long, short and intermediate)
–Outputs - editing goals (Address Memo, Punctuate Memo, Organise Memo, Enhance Memo, Review Memo and Error Correction), manually labelled for training purposes (one possible encoding is sketched below)
Outcome
–36 memos written: 12 used for training, 24 used for testing
–Network was able to recognize 96% of the test data
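
As a rough illustration of the representation, here is one possible way to encode a single (command, pause) event as an input vector and an editing goal as a one-hot target vector. The command list is truncated and the whole coding scheme is an assumption; the slide does not specify it:

```python
# Hypothetical coding scheme for the cognitive text-editing network.
VI_COMMANDS = ["i", "x", "dd", "yy", "p"]        # ... up to the 36 vi commands
PAUSES = ["short", "intermediate", "long"]
GOALS = ["Address Memo", "Punctuate Memo", "Organise Memo",
         "Enhance Memo", "Review Memo", "Error Correction"]

def encode_input(command, pause):
    """One-hot code the command and the pause type into a single input vector."""
    vec = [0.0] * (len(VI_COMMANDS) + len(PAUSES))
    vec[VI_COMMANDS.index(command)] = 1.0
    vec[len(VI_COMMANDS) + PAUSES.index(pause)] = 1.0
    return vec

def encode_goal(goal):
    """One-hot code the editing goal as the target vector."""
    vec = [0.0] * len(GOALS)
    vec[GOALS.index(goal)] = 1.0
    return vec

print(encode_input("dd", "long"))
print(encode_goal("Error Correction"))
```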

11 Cognitive Text-editing
Architecture
–Feed-forward network
Representation
–Inputs - vi commands and pauses between keystrokes
–Outputs - editing goals
Learning
–Backpropagation

12 Filtering Messages
The problem
–Eliminate irrelevant messages from a newsgroup
The solution
–Generate two dictionaries
  "Common Dictionary"
  "Deference Dictionary"
–Construct a neural net
  # of input nodes = # of words in the dictionary
  # of output nodes = # of subtopics or categories

13 Filtering Messages
Message - "Hi Nick what's up"
Message words - hi, nick, s, up, what
Dictionary - do, hci, hi, network, nick, what
Input vector - [0.1, 0.1, 0.9, 0.1, 0.9, 0.9]
Each dictionary word maps to one input node: 0.9 if the word appears in the message, 0.1 if it does not (see the sketch below).
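
A minimal sketch of this encoding; the present/absent values 0.9 and 0.1 follow the example above, while the tokenisation rule is an assumption:

```python
import re

def message_to_input_vector(message, dictionary, present=0.9, absent=0.1):
    """One input node per dictionary word: 0.9 if the word occurs in the
    message, 0.1 if it does not."""
    words = set(re.findall(r"[a-z]+", message.lower()))
    return [present if word in words else absent for word in dictionary]

dictionary = ["do", "hci", "hi", "network", "nick", "what"]
print(message_to_input_vector("Hi Nick what's up", dictionary))
# -> [0.1, 0.1, 0.9, 0.1, 0.9, 0.9], matching the example above
```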

14 Filtering Messages
Architecture
–Three-layered, feed-forward network
Representation
–Input - email and newsgroup messages
–Output - relevant messages
Learning
–Backpropagation

15 Learning a User's Interests by Observing Their Behavior
Task
–To predict whether or not a particular page is of interest, based on the user's profile
Methodology
–Obtain training examples by recording user navigation behavior
–Use this information to predict user interest in a page

16 Learning a User's Interests by Observing Their Behavior
Implementation
–Three output neurons
  # of hyperlinks clicked
  –Fraction of hyperlinks clicked on a page
  Scrolling activity and mouse activity
  –Counts scaled by 100 (see the sketch below)
Result
–Based on page content, predicts:
  # of hyperlinks the user will click
  Amount of scrolling
  Amount of mouse activity
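
A rough sketch of how the three target values (one per output neuron) might be derived from a logged page visit. The field names are assumptions, since the slide only says the data comes from a logfile, and "scaled by 100" is interpreted here as dividing the counts by 100:

```python
def page_targets(links_on_page, links_clicked, scroll_events, mouse_events):
    """Three target values, one per output neuron."""
    return [
        links_clicked / links_on_page if links_on_page else 0.0,  # fraction of hyperlinks clicked
        scroll_events / 100.0,   # scrolling activity, scaled by 100
        mouse_events / 100.0,    # mouse activity, scaled by 100
    ]

print(page_targets(links_on_page=20, links_clicked=3,
                   scroll_events=45, mouse_events=230))
# -> [0.15, 0.45, 2.3]
```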

17 Learning a User's Interests by Observing Their Behavior
Architecture
–Feed-forward network
Representation
–Input - IE logfile
–Output - user's interest in a page
Learning
–Backpropagation

18 Final Thought
One of the main goals of HCI is to model user behavior in order to gain a better understanding of how users interact with computers. That understanding can then be applied to new and existing applications to make them more usable. So what better way to understand human behavior than to exploit the very mechanism that guides that behavior?

19 References
Anderson, J. A. (1995). An Introduction to Neural Networks. Cambridge, MA: MIT Press.
Yasdi, R. (2000). "A Literature Survey on Applications of Neural Networks for Human-Computer Interaction." Neural Computing & Applications 9(4): 245-258.
Cool site to learn about and play with various types of networks: http://diwww.epfl.ch/mantra/tutorial/english/

