Kansas State University, Department of Computing and Information Sciences
CIS 830: Advanced Topics in Artificial Intelligence
Lecture 45: Course Review and Future Research Directions
Friday, May 5, 2000
William H. Hsu, Department of Computing and Information Sciences, KSU
Readings: Chapters 1-10 and 13, Mitchell; Chapters 14-21, Russell and Norvig

Main Themes: Artificial Intelligence and KDD
Analytical Learning: Combining Symbolic and Numerical AI
– Inductive learning
– Role of knowledge and deduction in integrated inductive and analytical learning
Artificial Neural Networks (ANNs) for KDD
– Common neural representations: current limitations
– Incorporating knowledge into ANN learning
Uncertain Reasoning in Decision Support
– Probabilistic knowledge representation
– Bayesian knowledge and data engineering (KDE): elicitation, causality
Data Mining: KDD Applications
– Role of causality and explanations in KDD
– Framework for data mining: wrappers for performance enhancement
Genetic Algorithms (GAs) for KDD
– Evolutionary algorithms (GAs, GP) as optimization wrappers
– Introduction to classifier systems

Class 0: A Brief Overview of Machine Learning
Overview: Topics, Applications, Motivation
Learning = Improving with Experience at Some Task
– Improve over task T,
– with respect to performance measure P,
– based on experience E
Brief Tour of Machine Learning
– A case study
– A taxonomy of learning
– Intelligent systems engineering: specification of learning problems
Issues in Machine Learning
– Design choices
– The performance element: intelligent systems
Some Applications of Learning
– Database mining; reasoning (inference / decision support); acting
– Industrial usage of intelligent systems
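As a concrete instance of the (T, P, E) framing above, here is a minimal sketch (my own illustration, not from the lecture): the task T is learning the boolean AND function, the performance measure P is classification accuracy, and the experience E is repeated passes over a set of labeled examples, consumed by a simple perceptron learner.

```python
# Illustrative (T, P, E) example: a perceptron improves its accuracy (P)
# on a boolean classification task (T) using labeled examples (E).

def predict(w, x):
    # Linear threshold unit; w[0] is the bias weight
    return 1 if w[0] + w[1] * x[0] + w[2] * x[1] > 0 else 0

def accuracy(w, examples):
    # Performance measure P: fraction of examples classified correctly
    return sum(predict(w, x) == y for x, y in examples) / len(examples)

def train(examples, epochs=10, lr=0.1):
    # Experience E: repeated passes over the data with the perceptron rule
    w = [0.0, 0.0, 0.0]
    for _ in range(epochs):
        for x, y in examples:
            err = y - predict(w, x)
            w[0] += lr * err
            w[1] += lr * err * x[0]
            w[2] += lr * err * x[1]
    return w

data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]  # task T: AND
w = train(data)
```

Since AND is linearly separable, the perceptron convergence theorem guarantees this learner reaches perfect accuracy on the training examples.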

Class 1: Integrating Analytical and Inductive Learning
Learning Specification (Inductive, Analytical)
– Instances X, target function (concept) c: X → H, hypothesis space H
– Training examples D: positive and negative examples of target function c
– Analytical learning: also given a domain theory T for explaining examples
Domain Theories
– Expressed in a formal language: propositional logic, predicate logic
– Set of assertions (e.g., well-formed formulae) for reasoning about the domain; expresses constraints over relations (predicates) within the model
– Example: Ancestor(x, y) ← Parent(x, z) ∧ Ancestor(z, y)
Determine
– Hypothesis h ∈ H such that h(x) = c(x) for all x ∈ D
– Such h are consistent with both the training data and the domain theory T
Integration Approaches
– Explanation (proof and derivation)-based learning: EBL
– Pseudo-experience: incorporating knowledge of environment, actuators
– Top-down decomposition: programmatic (procedural) knowledge, advice
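The Ancestor rule and the consistency condition above can be sketched in a few lines. This is my own toy illustration (the Parent facts and hypothesis are invented): the domain theory is applied by forward chaining to a fixed point, and a hypothesis is then checked for consistency with labeled examples in the slide's sense, h(x) = c(x) for all x in D.

```python
# Forward chaining over the rule Ancestor(x, y) <- Parent(x, z) ^ Ancestor(z, y),
# with the base case Ancestor(x, y) <- Parent(x, y).

def ancestors(parent_facts):
    ancestor = set(parent_facts)            # base case
    changed = True
    while changed:                          # chain forward to a fixed point
        changed = False
        for (x, z) in parent_facts:
            for (z2, y) in list(ancestor):
                if z == z2 and (x, y) not in ancestor:
                    ancestor.add((x, y))
                    changed = True
    return ancestor

parents = {("ann", "bob"), ("bob", "cal")}
theory = ancestors(parents)

def consistent(h, examples):
    # Consistency in the slide's sense: h(x) = c(x) for every labeled example
    return all(h(x) == c_x for x, c_x in examples)

# Hypothesis: a pair is an Ancestor pair iff it is derivable from the theory
h = lambda pair: pair in theory
D = [(("ann", "cal"), True), (("cal", "ann"), False)]
```

Here the deduced fact Ancestor(ann, cal) follows from the two Parent facts, so the hypothesis is consistent with both D and the domain theory.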

Classes 2-3: Explanation-Based Neural Networks
Paper
– Topic: Explanation-Based and Inductive Learning in ANNs
– Title: Integrating Inductive Neural Network Learning and EBL
– Authors: Thrun and Mitchell
– Presenter: William Hsu
Key Strengths
– Idea: (state, action)-to-state mappings serve as steps in a generalizable proof (explanation) for an observed episode
– Generalizable approach (significant for RL and other learning-to-predict inducers)
Key Weaknesses
– Other numerical learning models (HMMs, DBNs) may be better suited to EBG
– Tradeoff: the domain theory of EBNN lacks the semantic clarity of symbolic EBL
Future Research Issues
– How to get the best of both worlds (a clear DT plus the ability to generate explanations)?
– Applications: explanation in commercial, military, and legal decision support
– See work by: Thrun, Mitchell, Shavlik, Towell, Pearl, Heckerman

Classes 4-5: Phantom Induction
Paper
– Topic: Distal Supervised Learning and Phantom Induction
– Title: Iterated Phantom Induction: A Little Knowledge Can Go a Long Way
– Authors: Brodie and DeJong
– Presenter: Steve Gustafson
Key Strengths
– Idea: apply knowledge to generate (pseudo-experiential) training data
– Speedup: the learning curve is significantly shortened relative to RL by applying a “small amount” of knowledge
Key Weaknesses
– Not yet shown how to produce plausible, comprehensible explanations
– How much knowledge is “a small amount”? (How should it be measured?)
Future Research Issues
– Control and planning domains similar (but not identical) to robot games
– Applications: adaptive (e.g., ANN, BBN, MDP, GA) agent control and planning
– See work by: Brodie, DeJong, Rumelhart, McClelland, Sutton, Barto

Classes 6-7: Top-Down Hybrid Learning
Paper
– Topic: Learning with Prior Knowledge
– Title: A Divide-and-Conquer Approach to Learning from Prior Knowledge
– Authors: Chown and Dietterich
– Presenter: Aiming Wu
Key Strengths
– Idea: apply programmatic (procedural) knowledge to select training data
– Uses simulation to boost inductive learning performance (cf. model checking)
– Divide-and-conquer approach (multiple experts)
Key Weaknesses
– Does not clearly illustrate the form and structure of the programmatic knowledge
– Does not systematize and formalize the model checking / simulation approach
Future Research Issues
– Model checking and simulation-driven hybrid learning
– Applications: “consensus under uncertainty”, simulation-based optimization
– See work by: Dietterich, Frawley, Mitchell, Darwiche, Pearl

Classes 8-9: Learning Using Prior Knowledge
Paper
– Topic: Refinement of Approximate Domain-Theoretic Knowledge
– Title: Refinement of Approximate Domain Theories by Knowledge-Based Neural Networks
– Authors: Towell, Shavlik, and Noordewier
– Presenter: Li-Jun Wang
Key Strengths
– Idea: build relational explanations and compile them into an ANN representation
– Applies structural, functional, and constraint-based knowledge
– Uses the ANN to further refine the domain theory
Key Weaknesses
– The refined domain theory cannot be extracted back out of the ANN
– Explanations are also no longer clear after the “compilation” (transformation) process
Future Research Issues
– How to retain the semantic clarity of explanations, the domain theory, and the knowledge representation
– Applications: intelligent filters (e.g., fraud detection), decision support
– See work by: Shavlik, Towell, Maclin, Sun, Schwalb, Heckerman

Class 10: Introduction to Artificial Neural Networks
Architectures
– Nonlinear transfer functions
– Multi-layer networks of nonlinear units (sigmoid, hyperbolic tangent)
– Hidden layer representations
Backpropagation of Error
– The backpropagation algorithm: relation to the error gradient for nonlinear units; derivation of the training rule for feedforward multi-layer networks
– Training issues: local optima, overfitting
References: Chapter 4, Mitchell; Chapter 4, Bishop; Rumelhart et al.
Research Issues: How to…
– Learn from observation, rewards and penalties, and advice
– Distribute rewards and penalties through the learning model, over time
– Generate pseudo-experiential training instances in pattern recognition
– Partition learning problems on the fly, via (mixture) parameter estimation
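The backpropagation training rule summarized above can be sketched end to end for a tiny feedforward network. This is a minimal illustration of my own (a 2-2-1 sigmoid network on XOR, with arbitrary learning rate and epoch count), not the lecture's derivation; it shows the output and hidden delta terms and the resulting weight updates.

```python
# Minimal backpropagation sketch: a 2-2-1 sigmoid network trained on XOR.
import math, random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def init_net(seed=0):
    rng = random.Random(seed)
    # net[0]: two hidden units, each [w_x0, w_x1, bias]
    # net[1]: one output unit, [w_h0, w_h1, bias]
    return [[[rng.uniform(-1, 1) for _ in range(3)] for _ in range(2)],
            [[rng.uniform(-1, 1) for _ in range(3)]]]

def forward(net, x):
    h = [sigmoid(w[0] * x[0] + w[1] * x[1] + w[2]) for w in net[0]]
    o = sigmoid(net[1][0][0] * h[0] + net[1][0][1] * h[1] + net[1][0][2])
    return h, o

def train(net, data, lr=0.5, epochs=5000):
    for _ in range(epochs):
        for x, t in data:
            h, o = forward(net, x)
            delta_o = (o - t) * o * (1 - o)     # output-unit error term
            delta_h = [delta_o * net[1][0][j] * h[j] * (1 - h[j])
                       for j in range(2)]       # backpropagated hidden terms
            for j in range(2):                  # output-layer updates
                net[1][0][j] -= lr * delta_o * h[j]
            net[1][0][2] -= lr * delta_o
            for j in range(2):                  # hidden-layer updates
                for i in range(2):
                    net[0][j][i] -= lr * delta_h[j] * x[i]
                net[0][j][2] -= lr * delta_h[j]

def mse(net, data):
    return sum((forward(net, x)[1] - t) ** 2 for x, t in data) / len(data)

xor = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]
net = init_net()
before = mse(net, xor)
train(net, xor)
after = mse(net, xor)
```

Note the training-issue bullet in action: with an unlucky initialization this gradient descent can settle in a local optimum, which is why the sketch only claims the error decreases rather than vanishes.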

Classes 11-12: Reinforcement Learning and Advice
Paper
– Topic: Knowledge and Reinforcement Learning in Intelligent Agents
– Title: Incorporating Advice into Agents that Learn from Reinforcements
– Authors: Maclin and Shavlik
– Presenter: Kiranmai Nandivada
Key Strengths
– Idea: compile advice into an ANN representation for RL
– Advice is expressed in terms of constraint-based knowledge
– Like KBANN, achieves knowledge refinement through ANN training
Key Weaknesses
– Like KBANN, loses the semantic clarity of the advice, policy, and explanations
– How to evaluate “refinement” effectively? Quantitatively? Logically?
Future Research Issues
– How to retain the semantic clarity of explanations, the DT, and the knowledge representation
– Applications: intelligent agents, web mining (spiders, search engines), games
– See work by: Shavlik, Maclin, Stone, Veloso, Sun, Sutton, Pearl, Kuipers

Classes 13-14: Reinforcement Learning Over Time
Paper
– Topic: Temporal-Difference Reinforcement Learning
– Title: TD Models: Modeling the World at a Mixture of Time Scales
– Author: Sutton
– Presenter: Vrushali Koranne
Key Strengths
– Idea: combine state-action evaluation function (Q) estimates over multiple time steps of lookahead
– Effective temporal credit assignment (TCA)
– Biologically plausible (simulates TCA aspects of the dopaminergic system)
Key Weaknesses
– The TCA methodology is effective but semantically hard to comprehend
– Slow convergence: can knowledge help? How will we judge?
Future Research Issues
– How to retain clarity, and improve the convergence speed, of multi-time-scale RL models
– Applications: control systems, robotics, game playing
– See work by: Sutton, Barto, Mitchell, Kaelbling, Smyth, Shafer, Goldberg
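The single-time-scale special case that Sutton's TD models generalize is the familiar one-step TD (Q-learning) update. As a hedged illustration of my own (a toy 5-state chain with invented parameters, swept exhaustively instead of sampled in episodes), the update propagates the terminal reward backward one step at a time, which is exactly the temporal credit assignment mentioned above.

```python
# One-step TD (Q-learning) on a toy chain: states 0..4, move left or right,
# state 4 is terminal with reward 1. Exhaustive sweeps make this deterministic.

GAMMA, ALPHA = 0.9, 0.5
Q = {(s, a): 0.0 for s in range(4) for a in (-1, +1)}  # no actions at terminal

def step(s, a):
    s2 = max(0, min(4, s + a))
    r = 1.0 if s2 == 4 else 0.0
    return s2, r

for _ in range(100):
    for s in range(4):
        for a in (-1, +1):
            s2, r = step(s, a)
            future = 0.0 if s2 == 4 else max(Q[(s2, b)] for b in (-1, +1))
            Q[(s, a)] += ALPHA * (r + GAMMA * future - Q[(s, a)])  # TD error
```

After convergence the values discount geometrically with distance from the reward (Q(3, right) → 1, Q(2, right) → 0.9, and so on), so the greedy policy moves right everywhere.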

Classes 15-16: Generative Neural Models
Paper
– Topic: Pattern Recognition using Unsupervised ANNs
– Title: The Wake-Sleep Algorithm for Unsupervised Neural Networks
– Authors: Hinton, Dayan, Frey, and Neal
– Presenter: Prasanna Jayaraman
Key Strengths
– Idea: use a two-phase algorithm to generate training instances (“dream” stage) and maximize the conditional probability of the data given the model (“wake” stage)
– Compare: the expectation-maximization (EM) algorithm
– Good for image recognition
Key Weaknesses
– Not all data admits this approach (small samples, ill-defined features)
– Not immediately clear how to use it in problem-solving performance elements
Future Research Issues
– Studying the information-theoretic properties of the Helmholtz machine
– Applications: image / speech / signal recognition, document categorization
– See work by: Hinton, Dayan, Frey, Neal, Kirkpatrick, Hajek, Ghahramani

Classes 17-18: Modularity in Neural Systems
Paper
– Topic: Combining Models using Modular ANNs
– Title: Modular and Hierarchical Learning Systems
– Authors: Jordan and Jacobs
– Presenter: Afrand Agah
Key Strengths
– Idea: use interleaved EM update steps to update the expert and gating components
– Effect: forces specialization among ANN components (GLIMs); boosts performance over single experts; very fast convergence in some cases
– Explores modularity in neural systems (artificial and biological)
Key Weaknesses
– Often cannot achieve higher accuracy than ML, MAP, or Bayes optimal estimation
– Does not provide experts that specialize in spatial or temporal pattern recognition
Future Research Issues
– Constructing and selecting mixtures of other ANN components (not just GLIMs)
– Applications: pattern recognition, time series prediction
– See work by: Jordan, Jacobs, Nowlan, Hinton, Barto, Jaakkola, Hsu

Class 19: Introduction to Probabilistic Reasoning
Architectures
– Bayesian (belief) networks: tree-structured, polytrees, general
– Decision networks
– Temporal variants (beyond the scope of this course)
Parameter Estimation
– Maximum likelihood (MLE), maximum a posteriori (MAP)
– Bayes optimal classification, Bayesian learning
References: Chapter 6, Mitchell; Chapters 14-15 and 19, Russell and Norvig
Research Issues: How to…
– Learn from observation, rewards and penalties, and advice
– Distribute rewards and penalties through the learning model, over time
– Generate pseudo-experiential training instances in pattern recognition
– Partition learning problems on the fly, via (mixture) parameter estimation
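The MLE-versus-MAP bullet can be made concrete with the simplest possible case. This is a standard textbook illustration rather than anything from the slides: estimating a Bernoulli parameter, where a Beta(a, b) prior contributes "virtual" counts that pull the MAP estimate toward the prior mean.

```python
# MLE vs. MAP estimation of a Bernoulli (coin-flip) parameter.

def mle(successes, trials):
    # Maximum likelihood: the relative frequency
    return successes / trials

def map_estimate(successes, trials, a=2, b=2):
    # Posterior mode under a Beta(a, b) prior; a = b = 2 is a weak
    # prior centered on 1/2 (one virtual success and one virtual failure)
    return (successes + a - 1) / (trials + a + b - 2)

p_mle = mle(7, 10)           # 7/10
p_map = map_estimate(7, 10)  # 8/12, pulled toward the prior mean of 1/2
```

With more data the virtual counts are swamped and the two estimates converge, which is the usual intuition for when the prior matters.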

Classes 20-21: Approaches to Uncertain Reasoning
Paper
– Topic: The Case for Probability
– Title: In Defense of Probability
– Author: Cheeseman
– Presenter: Pallavi Paranjape
Key Strengths
– Idea: probability is a mathematically sound way to represent uncertainty
– Views of probability considered: objectivist, frequentist, logicist, subjectivist
– Argues for a meta-subjectivist, belief-measure concept of probability
Key Weaknesses
– Highly dogmatic view without concrete justification for all assertions
– Does not quantitatively or empirically compare Bayesian and non-Bayesian methods
Future Research Issues
– Integrating symbolic and numerical (statistical) models of uncertainty
– Applications: uncertain reasoning, pattern recognition, learning
– See work by: Cheeseman, Cox, Good, Pearl, Zadeh, Dempster, Shafer

Classes 22-23: Learning Bayesian Network Structure
Paper
– Topic: Learning Bayesian Networks from Data
– Title: Learning Bayesian Network Structure from Massive Datasets
– Authors: Friedman, Pe'er, and Nachman
– Presenter: Jincheng Gao
Key Strengths
– Idea: graph constraints and scoring functions can be used to select candidate parents when constructing a directed graphical model of probability (a BBN)
– Tabu search, greedy score-based methods (K2), etc. are also considered
Key Weaknesses
– Optimal Bayesian network structure learning is still intractable on conventional (single-instruction, sequential) architectures
– More empirical comparison among alternative methods is warranted
Future Research Issues
– Scaling up to massive real-world data sets (e.g., medical, agricultural, DSS)
– Applications: diagnosis, troubleshooting, user modeling, intelligent HCI
– See work by: Friedman, Goldszmidt, Heckerman, Cooper, Beinlich, Koller
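The "scoring function for candidate parents" idea can be illustrated in miniature. This toy sketch is my own (the data and the add-one-smoothed log-likelihood score are invented for the example; real structure learners use penalized scores such as BIC or BDe): on data where a child variable X mostly copies a candidate parent Y, the score with Y as parent beats the parent-free score, so a greedy learner would add the edge Y → X.

```python
# Toy score-based parent selection for one edge of a Bayesian network.
import math

# Records are (parent_value Y, child_value X); X copies Y except for 2 records.
data = [(y, y) for y in (0, 1, 0, 1, 0, 1)] + [(0, 1), (1, 0)]

def loglik_no_parent(records):
    # Log-likelihood of X under a single marginal CPT (add-one smoothing)
    n = len(records)
    score = 0.0
    for _, x in records:
        count = sum(1 for _, x2 in records if x2 == x)
        score += math.log((count + 1) / (n + 2))
    return score

def loglik_with_parent(records):
    # Log-likelihood of X under a CPT conditioned on Y (add-one smoothing)
    score = 0.0
    for y, x in records:
        rows = [(y2, x2) for y2, x2 in records if y2 == y]
        count = sum(1 for _, x2 in rows if x2 == x)
        score += math.log((count + 1) / (len(rows) + 2))
    return score
```

Unpenalized likelihood always favors more parents, which is precisely why practical scores add a complexity penalty and why the search over structures remains the hard part.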

Classes 24-25: Bayesian Networks for User Modeling
Paper
– Topic: Decision Support Systems and Bayesian User Modeling
– Title: The Lumiere Project: Bayesian User Modeling for Inferring the Goals and Needs of Software Users
– Authors: Horvitz, Breese, Heckerman, Hovel, Rommelse
– Presenter: Yuhui (Cathy) Liu
Key Strengths
– Idea: a BBN model is developed from user logs and used to infer the mode of usage
– Can infer the goals and skill level of the user
Key Weaknesses
– Needs high accuracy in inferring goals in order to deliver meaningful content
– May be better to use a next-generation search engine (more interactivity, less passive monitoring)
Future Research Issues
– Designing better interactive user modeling
– Applications: clickstream monitoring, e-commerce, web search, help
– See work by: Horvitz, Breese, Heckerman, Lee, Huang

Classes 26-27: Causal Reasoning
Paper
– Topic: KDD and Causal Reasoning
– Title: Symbolic Causal Networks for Reasoning about Actions and Plans
– Authors: Darwiche and Pearl
– Presenter: Yue Jiao
Key Strengths
– Idea: use a BBN to represent symbolic constraint knowledge
– Can be used to generate mechanistic explanations, to model actions, and to model sequences of actions (plans)
Key Weaknesses
– Integrative methods (numerical and symbolic BBNs) still need exploration
– Unclear how to incorporate methods for learning to plan
Future Research Issues
– Reasoning about systems
– Applications: uncertain reasoning, pattern recognition, learning
– See work by: Horvitz, Breese, Heckerman, Lee, Huang

Classes 28-29: Knowledge Discovery from Scientific Data
Paper
– Topic: KDD for Scientific Data Analysis
– Title: KDD for Science Data Analysis: Issues and Examples
– Authors: Fayyad, Haussler, and Stolorz
– Presenter: Arulkumar Elumalai
Key Strengths
– Idea: investigate how and whether KDD techniques (OLAP, learning) scale up to huge data sets
– Answer: “it depends” – on computational complexity and many other factors
Key Weaknesses
– No clear theory yet of how to assess “how much data is really needed”
– No technical treatment or characterization of data cleaning
Future Research Issues
– Data cleaning (aka data cleansing), pre- and post-processing (OLAP)
– Applications: intelligent databases, visualization, high-performance CSE
– See work by: Fayyad, Smyth, Uthurusamy, Haussler, Foster

Classes 30-31: Relevance Determination
Paper
– Topic: Relevance Determination in KDD
– Title: Irrelevant Features and the Subset Selection Problem
– Authors: John, Kohavi, and Pfleger
– Presenter: DingBing Yang
Key Strengths
– Idea: cast the problem of choosing relevant attributes (given a “top-level” learning problem specification) as search
– An effective state-space search (A/A*-based) approach is demonstrated
Key Weaknesses
– Good enough heuristics may not be available
– They can either be developed (via information theory) or replaced by MCMC methods
Future Research Issues
– Selecting relevant data channels from continuous sources (e.g., sensors)
– Applications: bioinformatics (genomics, proteomics, etc.), prognostics
– See work by: Kohavi, John, Rendell, Donoho, Hsu, Provost
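Subset selection as search can be sketched with the simplest search strategy. This is my own illustration, not the paper's A*-style algorithm: greedy forward selection, where each state is a feature subset, the successor operator adds one feature, and the evaluator (an invented project-and-vote accuracy) plays the role of the heuristic.

```python
# Feature subset selection as greedy state-space search (forward selection).

data = [((0, 0, 1), 0), ((0, 1, 0), 0), ((1, 0, 0), 1),
        ((1, 1, 1), 1), ((1, 0, 1), 1), ((0, 1, 1), 0)]
# The label equals feature 0; features 1 and 2 are irrelevant.

def score(subset):
    # Accuracy of predicting the majority label within each projected pattern
    groups = {}
    for x, y in data:
        groups.setdefault(tuple(x[i] for i in subset), []).append(y)
    correct = 0
    for ys in groups.values():
        majority = max(set(ys), key=ys.count)
        correct += sum(1 for y in ys if y == majority)
    return correct / len(data)

selected, best = [], score([])
while True:                     # greedy hill-climbing over subsets
    candidates = [(score(selected + [f]), f)
                  for f in range(3) if f not in selected]
    top, f = max(candidates)
    if top <= best:             # stop when no single feature helps
        break
    selected, best = selected + [f], top
```

On this data the search adds the one relevant feature and halts, illustrating both the payoff (irrelevant features pruned) and the weakness flagged above: a greedy evaluator can miss features that are only jointly relevant.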

Classes 32-33: Learning for Text Document Categorization
Paper
– Topic: Text Documents and Information Retrieval (IR)
– Title: Hierarchically Classifying Documents using Very Few Words
– Authors: Koller and Sahami
– Presenter: Yan Song
Key Strengths
– Idea: use rank-frequency scoring methods to find “keywords that make a difference”
– Breaks documents into a meaningful hierarchy
Key Weaknesses
– Sometimes need to derive semantically meaningful cluster labels
– How to integrate this method with dynamic cluster segmentation and labeling?
Future Research Issues
– Bayesian architectures using “non-Bayesian” learning algorithms (e.g., GAs)
– Applications: digital libraries (hierarchical, distributed dynamic indexing), intelligent search engines, intelligent displays (and help indices)
– See work by: Koller, Sahami, Roth, Charniak, Brill, Yarowsky
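The "keywords that make a difference" idea can be sketched with a crude frequency-gap score. This is a toy of my own (two invented document classes, scored by the gap between a word's relative frequencies in each class); the paper itself uses an information-theoretic selection within a class hierarchy, which this stands in for only loosely.

```python
# Toy discriminative-keyword scoring between two document classes.

docs = {
    "sports": ["ball game team ball win", "team score ball coach"],
    "finance": ["stock market fund", "market price stock trade"],
}

def freqs(texts):
    # Relative frequency of each word within one class
    words = " ".join(texts).split()
    return {w: words.count(w) / len(words) for w in set(words)}

f_a, f_b = freqs(docs["sports"]), freqs(docs["finance"])
vocab = set(f_a) | set(f_b)

# Rank words by how differently they are used across the two classes
scored = sorted(vocab,
                key=lambda w: abs(f_a.get(w, 0) - f_b.get(w, 0)),
                reverse=True)
top = scored[:2]
```

Words frequent in one class and absent from the other float to the top, while words used evenly in both score near zero and can be dropped, which is the "very few words" payoff.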

Classes 34-35: Web Mining
Paper
– Topic: KDD and the Web
– Title: Learning to Extract Symbolic Knowledge from the World Wide Web
– Authors: Craven, DiPasquo, Freitag, McCallum, Mitchell, Nigam, and Slattery
– Presenter: Ping Zou
Key Strengths
– Idea: build a probabilistic model of web documents using “keywords that matter”
– Use the probabilistic model to represent knowledge for indexing into a web database
Key Weaknesses
– How to account for concept drift?
– How to explain and express constraints (e.g., “proper nouns that are person names don’t matter”)? Not considered here
Future Research Issues
– Using natural language processing (NLP) and image / audio / signal processing
– Applications: searchable hypermedia, digital libraries, spiders, other agents
– See work by: McCallum, Mitchell, Roth, Sahami, Pratt, Lee

Class 36: Introduction to Evolutionary Computation
Architectures
– Genetic algorithms (GAs), genetic programming (GP), genetic wrappers
– Simple vs. parameterless GAs
Issues
– Loss of diversity: its consequence is collapse of the Pareto front; solutions include niching (sharing, preselection, crowding)
– Parameterless GAs
– Other issues (not covered): genetic drift, population sizing, etc.
References: Chapter 9, Mitchell; Chapters 1-6, Goldberg; Chapters 1-5, Koza
Research Issues: How to…
– Design GAs based on the credit assignment system (in the performance element)
– Build hybrid analytical / inductive learning GP systems
– Use GAs to perform relevance determination in KDD
– Control diversity in GA solutions for hard optimization problems
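A simple GA of the kind named above fits in a few lines. This sketch is my own (OneMax objective, binary tournament selection, one-point crossover, bit-flip mutation, and elitism, with illustrative parameter settings rather than anything from the lecture):

```python
# A simple GA on OneMax: maximize the number of 1 bits in a string.
import random

rng = random.Random(42)
L, POP, GENS = 20, 30, 40

def fitness(bits):
    return sum(bits)                       # OneMax: count the 1 bits

def tournament(pop):
    a, b = rng.sample(pop, 2)              # binary tournament selection
    return a if fitness(a) >= fitness(b) else b

pop = [[rng.randint(0, 1) for _ in range(L)] for _ in range(POP)]
best_initial = max(map(fitness, pop))

for _ in range(GENS):
    elite = max(pop, key=fitness)          # elitism: carry the best forward
    nxt = [elite]
    while len(nxt) < POP:
        p1, p2 = tournament(pop), tournament(pop)
        cut = rng.randrange(1, L)          # one-point crossover
        child = p1[:cut] + p2[cut:]
        child = [1 - b if rng.random() < 0.02 else b
                 for b in child]           # per-bit mutation
        nxt.append(child)
    pop = nxt

best_final = max(map(fitness, pop))
```

Because the elite individual is copied unchanged each generation, the best fitness never decreases; the diversity-loss issue on the slide shows up here too, since strong selection pressure without niching quickly makes the population nearly uniform.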

Classes 37-38: Genetic Algorithms and Classifier Systems
Paper
– Topic: Classifier Systems and Inductive Learning
– Title: Generalization in the XCS Classifier System
– Author: Wilson
– Presenter: Elizabeth Loza-Garay
Key Strengths
– Idea: incorporate the performance element (classifier system) into the GA design
– Solid theoretical foundation: advanced building block (aka schema) theory
– Can be used to engineer a more efficient GA model and tune parameters
Key Weaknesses
– Needs to progress from toy problems (e.g., MUX learning) to real-world ones
– Needs investigation of how GA principles (e.g., building block mixing) scale up
Future Research Issues
– Building block scalability in classifier systems
– Applications: reinforcement learning, mobile robotics, other animats, a-life
– See work by: Wilson, Goldberg, Holland, Booker

Classes 39-40: Knowledge-Based Genetic Programming
Paper
– Topic: Genetic Programming and Multistrategy Learning
– Title: Genetic Programming and Deductive-Inductive Learning: A Multistrategy Approach
– Authors: Aler, Borrajo, and Isasi
– Presenter: Yuhong Cheng
Key Strengths
– Idea: use a knowledge-based system to calibrate the starting state of an MCMC optimization system (here, GP)
– Can incorporate knowledge (as in CIS 830, Part 1 of 5)
Key Weaknesses
– The generalizability of the HAMLET population seeding method is not well established
– “General-purpose” problem-solving systems can become Rube Goldberg-ian
Future Research Issues
– Using multistrategy GP systems to provide knowledge-based decision support
– Applications: logistics (military, industrial, commercial), other problem solving
– See work by: Aler, Borrajo, Isasi, Carbonell, Minton, Koza, Veloso

Classes 41-42: Genetic Wrappers for Inductive Learning
Paper
– Topic: Genetic Wrappers for KDD Performance Enhancement
– Title: Simultaneous Feature Extraction and Selection Using a Masking Genetic Algorithm
– Authors: Raymer, Punch, Goodman, Sanschagrin, Kuhn
– Presenter: Karthik K. Krishnakumar
Key Strengths
– Idea: use a GA to empirically (statistically) validate an inducer
– Can be used to select and synthesize attributes (aka features)
– Can also be used to tune other GA parameters (hence “wrapper”)
Key Weaknesses
– Systematic experimental studies of genetic wrappers have not yet been done
– Wrappers do not yet take the performance element into explicit account
Future Research Issues
– Improving supervised learning inducers (e.g., in MLC++)
– Applications: better combiners; feature subset selection and construction
– See work by: Raymer, Punch, Cherkauer, Shavlik, Freitas, Hsu, Cantu-Paz

Classes 43-44: Genetic Algorithms for Optimization
Paper
– Topic: Genetic Optimization and Decision Support
– Title: A Niched Pareto Genetic Algorithm for Multiobjective Optimization
– Authors: Horn, Nafpliotis, and Goldberg
– Presenter: Li Lian
Key Strengths
– Idea: control the representation of neighborhoods on the Pareto optimal front by niching
– Gives abstract and concrete case studies of niching (sharing) effects
Key Weaknesses
– Needs systematic exploration and characterization of the “sweet spot”
– Shows static comparisons, not the small-multiple visualizations that led to them
Future Research Issues
– Biologically (ecologically) plausible models
– Applications: engineering (ag / bio, civil, computational, environmental, industrial, mechanical, nuclear) optimization; computational life sciences
– See work by: Goldberg, Horn, Schwefel, Punch, Minsker, Kargupta
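The Pareto-dominance test that underlies niched Pareto selection is easy to state in code. This is a minimal sketch of my own (toy points, both objectives minimized), not the paper's tournament scheme: a point is on the Pareto front exactly when no other point dominates it.

```python
# Pareto dominance and nondominated-front extraction (minimizing objectives).

def dominates(p, q):
    # p dominates q: no worse in every objective, strictly better in at least one
    return (all(a <= b for a, b in zip(p, q)) and
            any(a < b for a, b in zip(p, q)))

def pareto_front(points):
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]

pts = [(1, 5), (2, 2), (5, 1), (3, 3), (4, 4)]
front = pareto_front(pts)
```

In a niched Pareto GA, a sharing (niching) mechanism is layered on top of this test so that selection preserves points spread along the whole front instead of collapsing onto one region, which is the diversity issue the slide highlights.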

Class 45: Meta-Summary
Data Mining / KDD Problems
– Business decision support: classification, recommender systems
– Control and policy optimization
Data Mining / KDD Solutions: Machine Learning and Inference Techniques
– Models: version spaces, decision trees, perceptron, Winnow; ANN, BBN, SOM; Q functions; GA building blocks (schemata), GP building blocks
– Algorithms: candidate elimination, ID3, delta rule, MLE, simple (naïve) Bayes; K2, EM, backprop, SOM convergence, LVQ, ADP, simulated annealing; Q-learning, TD(λ); simple GA, GP