Natural Language Processing with Qt


Natural Language Processing with Qt Sebastiano Galazzo

Summary Introduction to AI terms What is NLP? Behind NLP: Neural Networks QNeuralNetwork Demo Language analysis Third-party services Microsoft LUIS QLuis

Introduction to AI terms Machine Learning: a branch of Artificial Intelligence; a set of algorithms that improve their own performance by analyzing data. Artificial Intelligence: the ability of software to perform typically human abilities and reasoning, built from a mix of machine learning algorithms; what counts as intelligence is debated among philosophers and scientists (it is not obvious). Cognitive Computing: a technology platform based on AI and DSP, including Machine Learning, NLP, and Vision.

What is NLP? NLP is a field of Artificial Intelligence built on a mix of ML algorithms. It is concerned with natural language understanding: enabling computers to derive meaning from natural language input.

What is NLP? The best way to describe it is a demo: the Comune di Solarino AI Bot.

Supervised learning algorithm The goal of any supervised learning algorithm is to find a function that best maps a set of inputs to its correct output. An example would be a classification task: text analysis (sentiment), image classification.

Neural Networks Neural Networks are mathematical models inspired by the neural connections of the biological brain. A single-layer perceptron can solve only linearly separable problems; multi-layer neural networks can also solve problems that are not linearly separable.

Neural Networks Akinator

Neural Networks - Backpropagation BP is a supervised learning algorithm based on two phases: propagation and weight update.

Neural Networks - Backpropagation Propagation involves the following steps. Step 1: forward propagation of a training pattern's input through the neural network in order to generate the output activations.

Neural Networks - Backpropagation Step 2: backward propagation of the output activations through the neural network, using the training pattern's target, in order to generate the deltas (the difference between the targeted and actual output values) of all output and hidden neurons.

Neural Networks - Backpropagation Phase 2: weight update. For each weight, follow these steps: multiply its output delta by its input activation to get the gradient of the weight; then subtract a fraction (the learning rate) of that gradient from the weight.

Neural Networks in Qt Trainer Reader

Syntagmas In linguistics, a syntagma is an elementary constituent segment within a text. Such a segment can be a phoneme, a word, a grammatical phrase, or a sentence. Syntagmatic structure in a language is the combination of words according to the rules of syntax for that language.

Syntagmas English uses determiner + adjective + noun: "the big house". Other languages (Italian, Spanish) use determiner + noun + adjective: "la casa grande".

Syntagmizer Tokenize the sentence. Tag the tokens based on the language's syntax rules (Sentence::tag()) and normalize verbs to the infinitive form. Assemble syntagmas based on the tags from step 2.

Syntagmizer It has several rules to apply (sentence.cpp). Example: "I has been happy last night". "Happy" should be labeled as an adjective (A), but the label before it is a verb, so it is joined with the previous token. "Has been happy" -> "to be happy". Syntagmizer in Qt

Microsoft LUIS luis.ai

QLuis QLuis Demo GitHub

Sebastiano Galazzo Thank you! @galazzoseba sebastiano.galazzo@gmail.com