1 MACHINE LEARNING TECHNIQUES IN IMAGE PROCESSING By Kaan Tariman M.S. in Computer Science CSCI 8810 Course Project.

2 Outline  Introduction to Machine Learning  The example application  Machine Learning Methods Decision Trees Artificial Neural Networks Instance-Based Learning

3 What is Machine Learning?  Machine Learning (ML) is the construction of computer programs that develop solutions and improve with experience  It solves problems that cannot be solved by enumerative methods or calculus-based techniques  The intuition is to model the human way of solving problems that require experience  When the relationships among all system variables are completely understood, ML is not needed

4 A Generic System [diagram: a system maps input variables to output variables, with hidden variables inside the system]

5 Learning Task  Face recognition problem: whose face is in this picture?  It is hard to build an explicit model describing a face and its components  Humans recognize faces through experience: the more we see, the faster we perceive

6 The example  A vision module for Sony Aibo robots that we developed for the Legged Robot Tournament in RoboCup  The output of the module is the distance and orientation of the target objects: the ball, the players, the goals, and the beacons (used for localization of the robot)

7 Aibo’s View

8 Main ML Methods  Decision Trees  Artificial Neural Networks (ANN)  Instance-Based Learning  Bayesian Methods  Reinforcement Learning  Inductive Logic Programming (ILP)  Genetic Algorithms (GA)  Support Vector Machines (SVM)

9 Decision Trees  Approximate a discrete-valued function by a decision tree  Internal nodes test attributes; leaves hold the values of the discrete function  Ex: a decision tree for “play tennis”

10 Algorithm to derive a tree  Until each leaf node is populated by as homogeneous a sample set as possible: Select a leaf node with an inhomogeneous sample set. Replace that leaf node by a test node that divides the inhomogeneous sample set into minimally inhomogeneous subsets, according to an entropy calculation.
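The greedy loop above can be sketched in code. This is a minimal illustration, not the implementation used for the project; the function names (`entropy`, `best_split`) and the attribute-indexing scheme are assumptions. It shows only the core step: scoring each candidate attribute by the weighted entropy of the subsets it creates and picking the minimum.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def best_split(samples, labels, n_attrs):
    """Pick the attribute whose split yields the lowest weighted entropy
    (i.e. the minimally inhomogeneous subsets from the slide)."""
    best_attr, best_score = None, float("inf")
    for a in range(n_attrs):
        # Partition the samples by the value of attribute a.
        parts = {}
        for s, y in zip(samples, labels):
            parts.setdefault(s[a], []).append(y)
        # Weighted average entropy of the resulting subsets.
        score = sum(len(p) / len(labels) * entropy(p) for p in parts.values())
        if score < best_score:
            best_attr, best_score = a, score
    return best_attr, best_score
```

The full induction algorithm would call `best_split` on each inhomogeneous leaf, replace the leaf with a test node on the chosen attribute, and recurse until every leaf is as homogeneous as possible.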

11 Color Classification  The data set consists of pixel values manually labeled with colors  The tree classifies a pixel to a color according to its Y, U, V values  Adaptable to different conditions

12 How do we construct the data set? 1) Open an image taken by the robot

13 How do we construct the data set? 2) Label the pixels with colors; a [Y, U, V, color] entry is created for each labeled pixel

14 How do we construct the data set? 3) Use the ML method and display results

15 The decision tree output  The data set is divided into a training set and a validation set  After training, the tree is evaluated on the validation set  Training should be done carefully, avoiding bias

16 Artificial Neural Networks (ANN)  Made up of interconnected processing elements which respond in parallel to a set of input signals given to each

17 ANN Algorithm  The training algorithm adjusts the weights to reduce the error between the known output values and the actual values  At first, the outputs are arbitrary  As cases are presented repeatedly, the ANN gives more correct answers  A test set is used to decide when to stop training  The ANN is then validated on unseen data (the validation set)
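The weight-adjustment idea on this slide can be illustrated with the simplest possible network, a single perceptron. This is a hedged sketch, not the network used in the project (which has hidden layers); the function names and the AND-gate data are illustrative assumptions. Weights start at zero, so the first outputs are effectively arbitrary, and each pass nudges the weights in the direction that reduces the output error.

```python
def train_perceptron(data, epochs=20, lr=0.1):
    """data: list of (inputs, target) pairs with 0/1 targets.
    Repeatedly presents the cases and adjusts weights by the error."""
    n = len(data[0][0])
    w = [0.0] * n  # initial weights: outputs are arbitrary at first
    b = 0.0
    for _ in range(epochs):
        for x, t in data:
            y = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            err = t - y  # known output minus actual output
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

def predict(w, b, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
```

A multi-layer ANN replaces this single unit with layers of such units and propagates the error backwards, but the principle is the same: adjust weights to shrink the gap between known and actual outputs.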

18 ANN output for our example

19 Face Recognition with ANN  Problem: orientation of the face  Input nodes are the pixel values of the image (32 x 32)  The output layer has 4 nodes (right, left, up, straight)  6 hidden nodes

20 Face Recognition with ANN  Hidden nodes normally do not admit a direct interpretation, but in this case we can observe some meaningful behavior

21 Instance-Based Learning  A learn-by-memorizing method: k-Nearest Neighbor  Given a data set {(Xi, Yi)}, it estimates values of Y for values of X other than those in the sample  The process is to choose the k values of Xi nearest to X and average their Y values  Here k is a parameter of the estimator; the average can be weighted, e.g. with the closest neighbor having the most impact on the estimate

22 KNN facts  A database of known instances is required, which costs memory  “Lazy learning”: no explicit model of the hypothesis is built  Ex: color classification, where a voting method is applied to output a color class for the pixel
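The voting step for color classification can be sketched as follows. This is a minimal illustration under stated assumptions, not the project's actual classifier: the function name `knn_classify`, the use of plain Euclidean distance in YUV space, and the sample pixel values are all hypothetical.

```python
from collections import Counter

def knn_classify(train, pixel, k=3):
    """train: list of ((Y, U, V), color) pairs; pixel: a (Y, U, V) tuple.
    The k nearest labeled pixels vote; the majority color wins."""
    def sq_dist(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    # "Lazy learning": no model is built, we just scan the stored instances.
    nearest = sorted(train, key=lambda entry: sq_dist(entry[0], pixel))[:k]
    votes = Counter(color for _, color in nearest)
    return votes.most_common(1)[0][0]
```

Note that the whole labeled data set must be kept in memory and scanned at query time, which is the memory and lookup cost mentioned above.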

23 Summary  Machine Learning is an interdisciplinary field involving programs that improve with experience  ML is well suited to image-processing problems such as pattern recognition, object extraction, and color classification  3 methods were considered: Decision Trees Artificial Neural Networks Instance-Based Learning

24 Thank you!