Kansas State University, Department of Computing and Information Sciences
CIS 830: Advanced Topics in Artificial Intelligence
Lecture 8, Friday, February 4, 2000
Lijun Wang, Department of Computing and Information Sciences, KSU

Analytical Learning Discussion (4 of 4): Refinement of Approximate Domain Theories by Knowledge-Based Neural Networks

Presentation Outline

Paper
– "Refinement of Approximate Domain Theories by Knowledge-Based Neural Networks"
– Authors: Geoffrey G. Towell, Jude W. Shavlik, Michiel O. Noordewier
– Appears in the Proceedings of the Eighth National Conference on AI (AAAI-90)

Overview
– Using a Horn-clause domain theory to create an equivalent artificial neural network (ANN)
– The KBANN algorithm
– Empirical testing in molecular biology
– Extensions of KBANN research
– Application to Knowledge Discovery in Databases

Issues
– Combined inductive and analytical learning
– Key strengths: does the domain theory provide better-than-random initial weights? Does it lead to better generalization accuracy for the final hypothesis?
– Key weakness: restricted to non-recursive, propositional domain theories

KBANN Algorithm

KBANN assumes that a domain theory can be represented by an ANN.
– Definition of ANN: an artificial neural network is composed of a number of nodes, or units, connected by links. Each link has a numeric weight associated with it. Learning takes place by updating the weights.
– Given:
  – a set of training examples
  – a domain theory consisting of non-recursive, propositional Horn clauses
– Determine:
  – an artificial neural network that fits the training examples, biased by the domain theory
– The knowledge base is translated into an ANN.
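To make the Given/Determine specification concrete, here is a minimal sketch of one possible encoding (my own toy representation in Python, not the paper's):

```python
# Given: a domain theory of non-recursive, propositional Horn clauses, each
# encoded here as (head, mandatory_antecedents, prohibitory_antecedents);
# e.g. the first triple reads "liftable :- graspable, not heavy".
domain_theory = [
    ("liftable", ["graspable"], ["heavy"]),
    ("graspable", ["has_handle"], []),
]

# Given: training examples, each an assignment to the observable attributes
# plus a target classification.
training_examples = [
    ({"has_handle": 1, "heavy": 0}, 1),
    ({"has_handle": 0, "heavy": 1}, 0),
]

# Determine: a network that fits training_examples, with structure and
# initial weights derived from domain_theory (the translation on the next slide).
```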

KBANN Algorithm (continued)

Translation of rules
– Sets the weights on links and the biases of units so that a unit has significant activation only when the corresponding deduction could be made using the domain knowledge.
– Specifically:
  – for each mandatory antecedent, assign a weight w
  – for each prohibitory antecedent, assign a weight −w
  – bias on the unit: n·w − φ for a conjunction (n = number of mandatory antecedents); w − φ for a disjunction, where φ is a fixed activation margin

Algorithm specification (overview)
– Translate the rules to set the initial network structure
– Add units not specified by the translation
– Add links not specified by the translation
– Perturb the network by adding near-zero random numbers to all link weights and biases
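A minimal Python sketch of this translation scheme (hypothetical code: the names translate_rules and activate, the defaults w = 4 and φ = 2, and the sigmoid-with-bias activation are my choices, not the paper's exact formulation):

```python
import math
from collections import defaultdict

def translate_rules(rules, w=4.0, phi=2.0):
    """Translate a non-recursive propositional Horn-clause theory into units.

    rules: list of (head, mandatory_antecedents, prohibitory_antecedents).
    Returns {unit: (weights, bias)}, where a unit's output is
    sigmoid(sum(weights[a] * value[a]) - bias): near 1 exactly when the
    corresponding deduction goes through, near 0 otherwise.
    """
    by_head = defaultdict(list)
    for head, pos, neg in rules:
        by_head[head].append((pos, neg))

    units = {}
    for head, clauses in by_head.items():
        rule_units = []
        for i, (pos, neg) in enumerate(clauses):
            # Conjunctive unit for one clause: weight +w per mandatory
            # antecedent, -w per prohibitory one, bias n*w - phi.
            name = head if len(clauses) == 1 else f"{head}_{i}"
            weights = {a: w for a in pos}
            weights.update({a: -w for a in neg})
            units[name] = (weights, len(pos) * w - phi)
            rule_units.append(name)
        if len(clauses) > 1:
            # Disjunctive unit for a head with several clauses (the role of
            # units X and Y in the illustrative example): bias w - phi.
            units[head] = ({r: w for r in rule_units}, w - phi)
    return units

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def activate(units, inputs):
    """Forward pass. The theory is non-recursive, so sweeping the units
    len(units) times is enough for every activation to settle."""
    values = {name: 0.0 for name in units}
    values.update(inputs)
    for _ in range(len(units)):
        for name, (weights, bias) in units.items():
            net = sum(wt * values.get(a, 0.0) for a, wt in weights.items())
            values[name] = sigmoid(net - bias)
    return values
```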

KBANN Algorithm: An Illustrative Example (Translation of a Knowledge Base into an ANN)

Domain theory:
A :- B, C.
B :- not F, G.
B :- not H.
C :- I, J.

[Figure, three panels: (a) the domain theory; (b) its hierarchical structure, showing necessary and prohibitory dependences; (c) the ANN representation, with weight w = 3 on necessary links and w = −3 on prohibitory links, plus near-zero refinement links. Units X and Y are introduced to handle the disjunction between the two clauses for B; after perturbation the figure shows weights such as 3.7, −2.3, and 0.7.]
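Running the hypothetical translate_rules/activate sketch from the previous slide on this domain theory (with its default w = 4 rather than the w = 3 of the figure) shows the disjunction handling and the resulting activations:

```python
domain_theory = [
    ("A", ["B", "C"], []),
    ("B", ["G"], ["F"]),
    ("B", [], ["H"]),
    ("C", ["I", "J"], []),
]
units = translate_rules(domain_theory)   # adds B_0, B_1 for B's two clauses
values = activate(units, {"F": 0, "G": 1, "H": 1, "I": 1, "J": 1})
print(round(values["A"], 2))             # ~0.74 (> 0.5): A is deducible via B and C
```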

KBANN Algorithm: Cup Learning Task (from Machine Learning by Tom Mitchell)

Domain theory:
Cup <- Stable, Liftable, OpenVessel
Stable <- BottomIsFlat
Liftable <- Graspable, Light
Graspable <- HasHandle
OpenVessel <- HasConcavity, ConcavityPointsUp

Training examples are described by the attributes BottomIsFlat, ConcavityPointsUp, Expensive, Fragile, HandleOnTop, HandleOnSide, HasConcavity, HasHandle, Light, MadeOfCeramic, MadeOfPaper, and MadeOfStyrofoam; the domain theory is translated into a neural network whose input units are these attributes.
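In the triple encoding used by the earlier translate_rules sketch, this domain theory and a positive example look as follows (again my own illustrative code):

```python
cup_theory = [
    ("Cup", ["Stable", "Liftable", "OpenVessel"], []),
    ("Stable", ["BottomIsFlat"], []),
    ("Liftable", ["Graspable", "Light"], []),
    ("Graspable", ["HasHandle"], []),
    ("OpenVessel", ["HasConcavity", "ConcavityPointsUp"], []),
]
units = translate_rules(cup_theory)   # one conjunctive unit per clause; no disjunctions
example = {"BottomIsFlat": 1, "HasHandle": 1, "Light": 1,
           "HasConcavity": 1, "ConcavityPointsUp": 1}
print(activate(units, example)["Cup"] > 0.5)
# True (Cup activation ~0.58; long rule chains attenuate the sigmoid activations)
```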

Experimenting with KBANN

Molecular genetics experiment using KBANN
– Task: learn to recognize DNA segments called promoter regions, which influence gene activity.
– Domain theory:
  – a promoter involves two subcategories: a contact region and a conformation region
  – the contact region in turn involves two regions (minus_35 and minus_10)
  – rules define the characteristics of each region, for example via nucleotide patterns such as "aaxxt"
– ANN: this rule hierarchy (promoter <- contact, conformation; contact <- minus_35, minus_10) is translated into a network over the DNA sequence.

Experimenting with KBANN (continued)

Molecular genetics problem using KBANN
– Procedure:
  – 53 positive and 53 negative training examples (N = 106)
  – "leave-one-out" method: on each iteration, KBANN was trained using 105 of the 106 examples and tested on the remaining example
– Results: KBANN generalized more accurately than purely inductive learners trained on the same data (see the Summary Points and the paper for the error counts).
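A sketch of the leave-one-out protocol itself (my own illustration using scikit-learn; the MLPClassifier here is a plain backprop stand-in trained on random data of the right shape, whereas KBANN would train the domain-theory-initialized network on the real promoter examples):

```python
import numpy as np
from sklearn.model_selection import LeaveOneOut
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(106, 57)).astype(float)  # stand-in features (real task: 57-nucleotide windows)
y = np.array([1] * 53 + [0] * 53)                     # 53 positive, 53 negative examples

errors = 0
for train_idx, test_idx in LeaveOneOut().split(X):
    # Train on 105 examples, test on the single held-out example.
    net = MLPClassifier(hidden_layer_sizes=(16,), max_iter=500, random_state=0)
    net.fit(X[train_idx], y[train_idx])
    errors += int(net.predict(X[test_idx])[0] != y[test_idx][0])

print(f"leave-one-out errors: {errors}/106")
```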

Related Work

Problems specific to neural networks
– Topology determination (approaches are often restricted to a single layer of hidden units, or to random setting of link weights)
– Integration of existing information into the network (how to use background information, or improve incorrect domain theories, in ANNs)

KBANN's solutions
– Connect the inputs of network units to the attributes tested by the clause antecedents; assign the unit a weight of w for each positive antecedent and −w for each negative antecedent
– Initialize the hypothesis to perfectly fit the domain theory, then inductively refine the initial hypothesis as needed to fit the training data
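A minimal numpy sketch of this initialize-then-refine idea, reduced to a single sigmoid unit for one rule (my own simplification; real KBANN refines the full multi-layer network with backpropagation):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Analytical initialization for the rule "cup :- bottom_is_flat, has_handle"
# over three attributes (the third, "expensive", is irrelevant to the rule):
# weight w = 4 per mandatory antecedent, bias = 2*4 - 2.
w = np.array([4.0, 4.0, 0.0])   # bottom_is_flat, has_handle, expensive
b = 6.0

# Observed data contradicts the theory slightly: some cups have no handle.
X = np.array([[1, 1, 0],
              [1, 1, 1],
              [1, 0, 1],
              [1, 0, 0],
              [0, 1, 1]], dtype=float)
y = np.array([1, 1, 1, 1, 0], dtype=float)

# Inductive refinement: gradient descent on cross-entropy, starting from the
# theory-derived weights rather than random ones.
lr = 0.5
for _ in range(2000):
    p = sigmoid(X @ w - b)
    grad = p - y                      # dLoss/d(net input), per example
    w -= lr * (X.T @ grad) / len(y)
    b += lr * grad.mean()             # net = X @ w - b, so d(net)/d(b) = -1

print((sigmoid(X @ w - b) > 0.5).astype(int))  # now agrees with y: [1 1 1 1 0]
```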

Summary Points

Content critique
– Key contribution: KBANN analytically creates a network equivalent to the given domain theory, then inductively refines the initial hypothesis to better fit the training data; in doing so, it modifies the network weights to overcome inconsistencies between the domain theory and the observed data.
– Strengths:
  – generalizes more accurately, given an approximately correct domain theory
  – outperforms purely inductive methods when data is scarce
  – the domain theory used in KBANN indicates which features are important to an example's classification
  – derived features are specified through deduction, reducing the complexity of the ANN's final decision

Summary Points (continued)

– Weaknesses:
  – restricted to non-recursive, propositional (i.e., variable-free) Horn clauses
  – may be misled given a highly inaccurate domain theory
  – extracting information from an ANN after learning is problematic, because some weight settings have no direct Horn-clause analog
  – a black-box method, which provides good results without explanation

Presentation critique
– Audience: AI (learning, planning), ANN, and applied-logic researchers
– Positive and exemplary points:
  – clear example illustrating the translation of a knowledge base into an ANN
  – good experimental results compared with other inductive learning algorithms
– Negative points and possible improvements:
  – the basic ideas of the ANN translation come across, but a reader may still not be able to carry the translation out in detail

Questions, Comments