Sanguthevar Rajasekaran University of Connecticut

Slides from: Doug Gray, David Poole


Machine Learning


Machine Learning In practice we may not know the form of the function, and there could also be errors in the data, so we guess the form of the function. Each such candidate function is called a model and is characterized by some parameters. We choose parameter values that minimize the difference between the model outputs and the true function values. There are two kinds of learning: supervised and unsupervised. In supervised learning we are given a set of examples (x, y) and the goal is to infer the predictive conditional distribution P(y | x). In unsupervised learning the goal is to learn the data-generating distribution P(x) after observing many random vectors x drawn from that distribution.
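The parameter-fitting idea above can be sketched concretely. The model form, data, and learning rate below are illustrative assumptions, not from the slides: we guess a linear model y = w·x + b and adjust the parameters w, b by gradient descent to shrink the squared difference between model outputs and observed values.

```python
# Minimal sketch: fit a linear model y = w*x + b by gradient descent,
# minimizing the mean squared difference between model outputs and targets.
# The data and hyperparameters are illustrative assumptions.
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [1.0, 3.0, 5.0, 7.0, 9.0]   # generated by y = 2x + 1

w, b = 0.0, 0.0
lr = 0.02
for _ in range(5000):
    # gradients of the mean squared error with respect to w and b
    dw = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / len(xs)
    db = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / len(xs)
    w -= lr * dw
    b -= lr * db

print(round(w, 2), round(b, 2))  # approaches w = 2, b = 1
```

The fitted parameters recover the function that generated the examples, which is exactly the "minimize the difference between model outputs and true values" criterion stated above.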

Machine Learning A machine learning algorithm is evaluated on the training data as well as on previously unseen data points. Thus we have a training error and a test error. A model is said to underfit if its training error is not low enough. A model is said to overfit if the difference between the training and test error is very large; in this case the model has memorized the particulars of the training data too closely. We can modify the underfitting and overfitting behavior of a learning algorithm by changing the capacity of the model.
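A quick numerical sketch of the capacity idea, under assumptions not in the slides (a noisy sine data set and polynomial models of increasing degree): low capacity leaves the training error high (underfitting), while very high capacity drives the training error toward zero while the test error typically worsens (overfitting).

```python
import numpy as np

# Sketch: training vs. test error as model capacity (polynomial degree)
# grows. Data, noise level, and degrees are illustrative assumptions.
rng = np.random.default_rng(0)
x_train = np.linspace(0, 1, 10)
x_test = np.linspace(0, 1, 100)
f = lambda x: np.sin(2 * np.pi * x)
y_train = f(x_train) + rng.normal(0, 0.2, x_train.size)
y_test = f(x_test)

errs = {}
for degree in (1, 3, 9):
    coeffs = np.polyfit(x_train, y_train, degree)
    train_err = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_err = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    errs[degree] = (train_err, test_err)
    print(degree, round(train_err, 4), round(test_err, 4))
# Degree 1 underfits (high training error). Degree 9 has 10 coefficients
# for 10 training points, so it nearly interpolates them: the training
# error collapses, but the gap to the test error typically widens.
```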

Neural Networks 1 Neural networks are learning paradigms inspired by the human brain. The brain consists of billions of neurons interconnected in a complex network. Even though each individual neuron is limited in power, the collection can produce impressive results.

Neural Networks 2 A neural network is a connected, leveled graph. Each node is a processor, and the (directed) edges correspond to communication links. Such a leveled graph has at least two levels (one for input and another for output); there may be additional levels, called hidden levels.

Neural Networks 3 Each edge in the network has an associated weight, and each node has a threshold. Consider a two-level network, and let N be any node in the second level (i.e., the output level). Let q be the number of incoming edges into N, let x1, x2, …, xq be the corresponding input values, and let w1, w2, …, wq be the weights on these incoming edges. Denote the threshold value of N by T, and consider the case where each output is binary. If w1·x1 + w2·x2 + … + wq·xq > T, then node N outputs one of the two possible values; otherwise, it outputs the other.
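The threshold rule above is easy to state in code. This is a minimal sketch; the particular weights and threshold in the example are hand-chosen assumptions (they make the node compute AND), not values from the slides.

```python
# Sketch of the threshold node N described above: q inputs, weights
# w1..wq, and a threshold T. The node fires (outputs 1) exactly when
# the weighted sum of its inputs exceeds the threshold.
def threshold_node(inputs, weights, threshold):
    """Return 1 if the weighted input sum exceeds the threshold, else 0."""
    s = sum(w * x for w, x in zip(weights, inputs))
    return 1 if s > threshold else 0

# Example: with weights (1, 1) and threshold 1.5, the node computes AND.
print(threshold_node([1, 1], [1, 1], 1.5))  # 1
print(threshold_node([1, 0], [1, 1], 1.5))  # 0
```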

Neural Networks 4

Neural Networks 5 There are patterns that a perceptron cannot learn. For example, there is no perceptron corresponding to the XOR function. (XOR is a binary function defined as follows: if x and y are Boolean, then x XOR y is 1 if and only if exactly one of x and y is 1; x XOR y is 0 otherwise.)
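Although no single threshold node computes XOR, a small two-level arrangement with a hidden level does. The weights and thresholds below are hand-chosen assumptions for illustration, not learned values: XOR is expressed as "(x OR y) AND NOT (x AND y)".

```python
# XOR via a two-level network of threshold nodes. Weights and
# thresholds are hand-chosen (illustrative assumptions), not learned.
def step(inputs, weights, threshold):
    return 1 if sum(w * x for w, x in zip(weights, inputs)) > threshold else 0

def xor(x, y):
    h_or = step([x, y], [1, 1], 0.5)           # hidden node: x OR y
    h_and = step([x, y], [1, 1], 1.5)          # hidden node: x AND y
    return step([h_or, h_and], [1, -1], 0.5)   # output: OR and not AND

for x in (0, 1):
    for y in (0, 1):
        print(x, y, xor(x, y))
```

The hidden level is what makes this possible: each hidden node carves out one linearly separable region, and the output node combines them.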

Neural Networks 6 For appropriate values of the node thresholds and edge weights, a neural network can represent an arbitrarily complex function or concept. A neural network can be used to learn a given concept by training it with a number of examples, where an example is nothing but a pair of input values and the corresponding output value.
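Training from examples can be sketched with the classic perceptron learning rule. This rule, the AND concept used as training data, and the learning rate are assumptions for illustration; the slides do not specify a training algorithm.

```python
# Sketch: the classic perceptron learning rule (an assumption; not
# specified in the slides) trained on examples of the AND concept,
# i.e. (input, output) pairs.
examples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

w = [0.0, 0.0]
t = 0.0           # threshold, adjusted alongside the weights
lr = 0.1
for _ in range(20):                    # a few passes over the examples
    for (x1, x2), target in examples:
        out = 1 if w[0] * x1 + w[1] * x2 > t else 0
        err = target - out
        w[0] += lr * err * x1          # nudge weights toward the target
        w[1] += lr * err * x2
        t -= lr * err                  # move the threshold the other way

preds = [1 if w[0] * a + w[1] * b > t else 0 for (a, b), _ in examples]
print(preds)  # [0, 0, 0, 1]
```

Since AND is linearly separable, this loop converges to weights and a threshold that reproduce every training example; for XOR, as noted above, no such single-node solution exists.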

Neural Networks 7 A General ANN looks like:

Neural Networks 8

Neural Networks 9

Neural Networks 10

Neural Networks 11