
PURE Midterm Report: Mimicking Black Box Sentence Level Text Classification Models
Presentation By: Eryk Helenowski
PURE Mentor: Vincent Bindschaedler

Overview
- Artificial Intelligence: performing tasks that normally require human intelligence, such as visual perception and speech recognition.
- Machine Learning: a type of AI that provides computers with the ability to learn without being explicitly programmed; focuses on programs that change when exposed to new data.
- Deep Learning: a class of machine learning algorithms.
- Natural Language Processing (NLP): the ability of a computer program to understand human language.
- Machine Learning as a Service (MLaaS): companies offer ML services in the form of REST APIs.
- Project goal: determine how susceptible MLaaS services are to being copied by machine learning techniques (see the sketch below).
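For concreteness, a minimal sketch of querying a black-box text classification service over REST to obtain labels. The endpoint URL, auth scheme, and response field are hypothetical placeholders, not the actual API of any particular provider:

```python
import requests

# Hypothetical MLaaS text classification endpoint; the URL, auth
# header, and response schema are illustrative placeholders only.
API_URL = "https://api.example-mlaas.com/v1/classify"
API_KEY = "YOUR_API_KEY"

def label_sentence(sentence):
    """Send one sentence to the black-box service and return its label."""
    resp = requests.post(
        API_URL,
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={"text": sentence},
    )
    resp.raise_for_status()
    return resp.json()["label"]  # e.g. "positive" / "negative"

# Labels returned by the black box become training targets for the mimic.
labeled = [(s, label_sentence(s)) for s in ["great movie", "terrible plot"]]
```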

Text Classification
- Text classification is a foundation of many NLP applications.
- Recursive Neural Networks are typically used for text classification and have proven effective for constructing sentence-structure representations. However, constructing the textual tree takes at least O(n²) time, and the tree-like structure is unsuitable for modeling long sentences or documents.
- Recurrent Neural Networks have O(n) runtime complexity: they analyze text word by word and store the semantics of all previous text in a fixed-size hidden layer. They are a biased model, though: later words are more dominant than earlier words.
- Convolutional Neural Networks (CNNs) are an unbiased model that can fairly determine discriminative phrases in text, also with O(n) runtime complexity. They suffer from the "window size" problem: smaller windows lose information, while larger windows blow up the parameter space.
- Suggested approach: a Recurrent Convolutional Neural Network. Represent each word numerically together with a left and right context, and apply a softmax function on the output layer. This improves on the above by accounting for context with a recurrent structure while constructing the representation of the text with a convolutional network (see the sketch below).
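A minimal NumPy sketch of the word-plus-context representation described above, under assumed dimensions and untrained random weights: each word's left context is built recurrently from the previous word, the right context from the next word, and element-wise max-pooling yields a fixed-size text representation (the latent layer and softmax are omitted):

```python
import numpy as np

rng = np.random.default_rng(0)
d_emb, d_ctx, n_words = 50, 30, 6          # assumed sizes
E = rng.normal(size=(n_words, d_emb))      # word embeddings e(w_1..w_n)
W_l, W_sl = rng.normal(size=(d_ctx, d_ctx)), rng.normal(size=(d_ctx, d_emb))
W_r, W_sr = rng.normal(size=(d_ctx, d_ctx)), rng.normal(size=(d_ctx, d_emb))

# Left context: c_l(w_i) = tanh(W_l c_l(w_{i-1}) + W_sl e(w_{i-1}))
c_l = np.zeros((n_words, d_ctx))
for i in range(1, n_words):
    c_l[i] = np.tanh(W_l @ c_l[i - 1] + W_sl @ E[i - 1])

# Right context: c_r(w_i) = tanh(W_r c_r(w_{i+1}) + W_sr e(w_{i+1}))
c_r = np.zeros((n_words, d_ctx))
for i in range(n_words - 2, -1, -1):
    c_r[i] = np.tanh(W_r @ c_r[i + 1] + W_sr @ E[i + 1])

# Each word is [left context; embedding; right context];
# element-wise max-pooling gives a fixed-size text representation.
x = np.concatenate([c_l, E, c_r], axis=1)
text_repr = x.max(axis=0)
```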

Convolutional Neural Network (CNN)

Recursive Neural Tensor Network (RNTN)

Recurrent Neural Network (RNN)

Techniques
- Recurrent Neural Network: contains cycles; often used for time-series data. A more natural fit for text than convolutional neural networks, which are more widely used in image processing.
- LSTM: Long Short-Term Memory network. A type of RNN capable of learning long-term dependencies, something plain RNNs struggle with.
- Approach: label the training data using some text classification API, use those labels to train my model, and then test how effectively my model mimics the behavior of theirs (see the sketch below).
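A minimal Keras sketch, with assumed vocabulary size and hyperparameters, of the kind of LSTM classifier the mimic model could be; the black-box API's labels serve as the training targets:

```python
from tensorflow.keras import layers, models

VOCAB_SIZE, N_CLASSES = 10_000, 2  # assumed hyperparameters

# Embedding -> LSTM -> softmax: the sentence is read word by word and
# the final hidden state is classified.
model = models.Sequential([
    layers.Embedding(VOCAB_SIZE, 128),
    layers.LSTM(64),
    layers.Dense(N_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# x_train: integer-encoded sentences; y_api: labels obtained from the
# black-box API, used as training targets for the mimic model.
# model.fit(x_train, y_api, epochs=5, validation_split=0.1)
```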

Progress
- Naive Bayes classifier (baseline; see the sketch below)
- Data preprocessing
- Training data
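A minimal scikit-learn sketch of such a Naive Bayes baseline; the example sentences and labels are illustrative stand-ins for API-labeled training data:

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# sentences: raw text; api_labels: labels obtained from the black-box API
sentences = ["great movie", "terrible plot", "loved it", "waste of time"]
api_labels = ["positive", "negative", "positive", "negative"]

# Bag-of-words preprocessing feeding a multinomial Naive Bayes classifier
baseline = make_pipeline(CountVectorizer(), MultinomialNB())
baseline.fit(sentences, api_labels)

print(baseline.predict(["what a great waste of time"]))
```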

Goals
- Implement a more heavily involved data preprocessing step that allows me to move from the naive Bayes classifier to an LSTM recurrent network: a one-to-one vocabulary mapping? A vectorized representation of words?
- Determine various performance metrics to measure how well I am mimicking the Aylien API.
- Extend a simple LSTM network to explore the effects of different architectures (see the sketch below).
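A minimal sketch of the two preprocessing questions above and one candidate mimicry metric; all names and design choices here are illustrative assumptions:

```python
from collections import Counter

def build_vocab(sentences, max_size=10_000):
    """One-to-one word -> integer mapping; 0 is reserved for padding/unknown."""
    counts = Counter(w for s in sentences for w in s.lower().split())
    return {w: i + 1 for i, (w, _) in enumerate(counts.most_common(max_size))}

def vectorize(sentence, vocab, max_len=20):
    """Integer-encode and pad a sentence for the LSTM."""
    ids = [vocab.get(w, 0) for w in sentence.lower().split()][:max_len]
    return ids + [0] * (max_len - len(ids))

def agreement(mimic_labels, api_labels):
    """Fraction of inputs on which the mimic matches the black-box API."""
    matches = sum(m == a for m, a in zip(mimic_labels, api_labels))
    return matches / len(api_labels)
```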