ECE/CS/ME 539 Artificial Neural Networks Final Project

A Comparison of a Decision Tree Learner and a 2-Layer Back-Propagation Neural Network (Constructed in Java) on Classifying a Car Purchase
Steve Ludwig, 12-19-03

Introduction/Motivation
- Studied decision tree learning
- Serves the same purpose as a pattern-classifying BP neural net
- Wanted to compare and contrast the two on identical data
- Built my own 2-layer back-propagation neural network in Java with customizable parameters
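A 2-layer (one hidden layer) back-propagation network of the kind described here can be sketched as follows. This is a minimal illustration, not the project's actual code: the class name, weight initialization, and dimensions are all assumptions.

```java
import java.util.Random;

// Minimal sketch of a 2-layer (one hidden layer) feed-forward network
// trained by stochastic back-propagation with sigmoid units. Class name,
// initialization scheme, and dimensions are illustrative assumptions.
public class TwoLayerNet {
    final int nIn, nHid;
    final double[][] wHid;  // hidden weights: nHid x (nIn + 1), last column is bias
    final double[] wOut;    // output weights: nHid + 1, last entry is bias
    final double rate;      // learning rate (one of the customizable parameters)

    TwoLayerNet(int nIn, int nHid, double rate, long seed) {
        this.nIn = nIn;
        this.nHid = nHid;
        this.rate = rate;
        Random rnd = new Random(seed);
        wHid = new double[nHid][nIn + 1];
        wOut = new double[nHid + 1];
        for (double[] row : wHid)
            for (int j = 0; j < row.length; j++) row[j] = 0.1 * rnd.nextGaussian();
        for (int j = 0; j < wOut.length; j++) wOut[j] = 0.1 * rnd.nextGaussian();
    }

    static double sigmoid(double x) { return 1.0 / (1.0 + Math.exp(-x)); }

    double[] hidden(double[] x) {
        double[] h = new double[nHid];
        for (int j = 0; j < nHid; j++) {
            double s = wHid[j][nIn];                       // bias term
            for (int i = 0; i < nIn; i++) s += wHid[j][i] * x[i];
            h[j] = sigmoid(s);
        }
        return h;
    }

    // Forward pass; threshold the result at 0.5 to get a class label.
    double predict(double[] x) {
        double[] h = hidden(x);
        double s = wOut[nHid];                             // bias term
        for (int j = 0; j < nHid; j++) s += wOut[j] * h[j];
        return sigmoid(s);
    }

    // One stochastic back-propagation update on a single (x, target) pair.
    void train(double[] x, double target) {
        double[] h = hidden(x);
        double out = predict(x);
        double dOut = (target - out) * out * (1 - out);    // output-unit delta
        for (int j = 0; j < nHid; j++) {
            double dHid = dOut * wOut[j] * h[j] * (1 - h[j]);
            for (int i = 0; i < nIn; i++) wHid[j][i] += rate * dHid * x[i];
            wHid[j][nIn] += rate * dHid;
        }
        for (int j = 0; j < nHid; j++) wOut[j] += rate * dOut * h[j];
        wOut[nHid] += rate * dOut;
    }
}
```

Training repeats `train` over the data set for a chosen number of epochs; the learning rate, epoch count, and hidden-layer size correspond to the customizable parameters the project mentions.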

Data
- The decision tree learner uses text-based attributes and values
- It constructs a tree whose internal nodes test attributes; leaf nodes classify an example as positive or negative
- Attribute values had to be converted to numeric form for the BP neural net, e.g. acceptable case = 1, unacceptable case = 0
- Neural net parameters were customizable: tried different learning rates, epoch counts, and permutations of the training set (to avoid overfitting)
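The categorical-to-numeric conversion the slide mentions (e.g. acceptable case = 1, unacceptable case = 0) could look like the sketch below. The attribute names and value lists mirror a UCI-style car evaluation data set but are assumptions, not the project's exact encoding.

```java
import java.util.List;

// Sketch of converting text-based attribute values to numbers for the
// neural net. Each attribute's values are mapped onto evenly spaced
// numbers in [0, 1] so all inputs share a common scale. The attribute
// and value names below are illustrative assumptions.
public class Encoder {
    static double encode(List<String> values, String v) {
        int idx = values.indexOf(v);
        if (idx < 0) throw new IllegalArgumentException("unknown value: " + v);
        return values.size() == 1 ? 0.0 : (double) idx / (values.size() - 1);
    }

    public static void main(String[] args) {
        List<String> buying = List.of("low", "med", "high", "vhigh");
        List<String> label  = List.of("unacc", "acc");
        System.out.println(encode(buying, "med"));  // second of four values -> 1/3
        System.out.println(encode(label, "acc"));   // acceptable case = 1
    }
}
```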

Results
- The neural net and the decision tree achieved nearly identical test-set classification rates:
  - Decision tree: 95.789 %
  - BP neural net: 95.105 %
- The decision tree runs much faster and is always consistent
- The neural net is only consistent from run to run when the training set is not permuted
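The run-to-run inconsistency noted above is what one would expect from per-epoch permutation of the training set: shuffling changes the order in which stochastic updates are applied. A sketch, with a hypothetical helper name; with a fixed seed the shuffle (and hence the run) becomes reproducible.

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.Random;

// Sketch of the per-epoch training-set permutation the slides mention.
// Passing permute = false yields the identity order (consistent runs);
// permute = true shuffles, and only a fixed Random seed makes that
// reproducible. Names here are illustrative assumptions.
public class TrainingLoop {
    static int[] epochOrder(int n, boolean permute, Random rnd) {
        List<Integer> idx = new ArrayList<>();
        for (int i = 0; i < n; i++) idx.add(i);
        if (permute) Collections.shuffle(idx, rnd);
        int[] out = new int[n];
        for (int i = 0; i < n; i++) out[i] = idx.get(i);
        return out;
    }
}
```

Each epoch would then visit the training examples in `epochOrder(...)` sequence before the next reshuffle.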

Conclusions
- The decision tree is faster, has excellent accuracy, and can work directly with text-based attributes
- The BP neural net is more flexible and can be modified to perform better (e.g. more hidden layers), while still achieving a good classification rate