ILMDA: An Intelligent Learning Materials Delivery Agent and Simulation. Leen-Kiat Soh, Todd Blank, L. D. Miller, Suzette Person. Department of Computer Science and Engineering, University of Nebraska, Lincoln, NE.


Introduction Traditional Instruction

Introduction Intelligent Tutoring Systems – Interact with students – Model student behavior – Decide which materials to deliver – All ITSs are adaptive, but only some learn

Related Work Intelligent Tutoring Systems – PACT, ANDES, AutoTutor, SAM These lack machine learning capabilities – They generally do not adapt to new circumstances – Do not self-evaluate and self-configure their own strategies – Do not monitor the usage history of content presented to students

Project Framework Learning material components – A tutorial – A set of related examples – A set of exercise problems

Project Framework Underlying agent assumptions – A student’s behavior is a good indicator of how well the student understands the topic in question – It is possible to determine the extent to which a student understands the topic by presenting different examples

Methodology ILMDA System – Graphical user interface front-end – MySQL database backend – ILMDA reasoning in-between

Methodology Overall methodology

Methodology Flow of operations Under the hood – Case-based reasoning – Machine Learning – Fuzzy Logic Retrieval – Outcome Function

Learner Model Student Profiling – Student background Relatively static First and last name, major, GPA, interests, etc. – Student activity Real-time behavior and patterns Average number of mouse clicks, time spent in tutorial, number of quits after tutorial, number of successes, etc.
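The two-part learner model above (static background plus real-time activity) can be sketched as a single record. The field names and the incremental-average update rule are assumptions for illustration; the slides name the quantities but not the update mechanics.

```python
from dataclasses import dataclass

@dataclass
class StudentProfile:
    # Relatively static background
    name: str
    major: str
    gpa: float
    interests: list
    # Real-time activity, updated as the student works (hypothetical fields)
    avg_mouse_clicks: float = 0.0
    avg_tutorial_time_ms: float = 0.0
    tutorial_quits: int = 0
    successes: int = 0

    def record_session(self, clicks, time_ms, n_sessions):
        """Fold one session into the running averages using the standard
        incremental-mean update (an assumed rule, not from the slides)."""
        self.avg_mouse_clicks += (clicks - self.avg_mouse_clicks) / n_sessions
        self.avg_tutorial_time_ms += (time_ms - self.avg_tutorial_time_ms) / n_sessions
```

Separating the static and dynamic halves mirrors the slide's split: the background fields are set once, while the activity fields change with every tutorial session.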

Case-based reasoning Each case contains a problem description and solution parameters The casebase is maintained separately from the examples and problems The agent chooses, for each student, the example or problem from the case with the most similar parameters
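The retrieval step above can be sketched as weighted nearest-neighbour matching over a casebase. This is a minimal sketch: the descriptor names ("gpa", "avg_time") and the weighted L1 distance are illustrative assumptions, not the actual ILMDA similarity function.

```python
def retrieve_case(query, casebase, weights):
    """Return the stored case closest to the query under a weighted
    L1 distance over shared numeric descriptors (hypothetical scheme)."""
    def distance(case):
        return sum(w * abs(query[k] - case["description"][k])
                   for k, w in weights.items())
    return min(casebase, key=distance)

# Toy casebase: the descriptors and weights are illustrative only.
casebase = [
    {"id": "ex1", "description": {"gpa": 2.5, "avg_time": 90.0}},
    {"id": "ex2", "description": {"gpa": 3.8, "avg_time": 40.0}},
]
best = retrieve_case({"gpa": 3.7, "avg_time": 45.0}, casebase,
                     weights={"gpa": 1.0, "avg_time": 0.1})
```

Keeping the casebase separate from the learning materials, as the slide notes, means `retrieve_case` only ever sees descriptions, never the content itself.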

Solution Parameters – TimesViewed: the number of times the case has been viewed – DiffLevel: the difficulty level of the case, between 0 and 10 – MinUseTime: the shortest time, in milliseconds, a single student has viewed the case – MaxUseTime: the longest time, in milliseconds, a single student has viewed the case – AveUseTime: the average time, in milliseconds, students have spent viewing the case – Bloom: Bloom’s Taxonomy number – AveClick: the average number of clicks the interface has recorded for this case – Length: the number of characters in the course content for this case – Content: the stored list of interests for this case

Adaptation Heuristics Adapt the solution parameters of the old case – Based on the difference between the problem descriptions of the old and new cases – Each heuristic is weighted and responsible for one solution parameter – Heuristics are implemented in a rulebase, which adds flexibility to the design
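The rulebase structure described above can be sketched as a list of (parameter, weight, rule) triples, each rule nudging exactly one solution parameter. The example heuristic and the field names are hypothetical, assumed only for illustration.

```python
def adapt(old_case, new_description, rulebase):
    """Adapt the retrieved case's solution: each weighted heuristic in the
    rulebase adjusts exactly one solution parameter based on the difference
    between the old and new problem descriptions (illustrative scheme)."""
    solution = dict(old_case["solution"])
    for param, weight, rule in rulebase:
        solution[param] += weight * rule(old_case["description"], new_description)
    return solution

# One hypothetical heuristic: raise DiffLevel for stronger students.
rulebase = [("DiffLevel", 0.5, lambda old, new: new["gpa"] - old["gpa"])]
old_case = {"description": {"gpa": 3.0}, "solution": {"DiffLevel": 4.0}}
adapted = adapt(old_case, {"gpa": 4.0}, rulebase)
```

Because heuristics live in a plain data structure rather than in code, new rules can be added or reweighted without touching the adaptation loop, which is the flexibility the slide refers to.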

Simulated Annealing Used when the adaptation process selects an old case that has repeatedly led to unsuccessful outcomes Rather than removing the old case, simulated annealing is used to refresh its solution parameters
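A refresh along these lines can be sketched as a short annealing loop over the solution parameters: perturb them with temperature-scaled noise, keep improvements, and occasionally accept worse values early on. The perturbation model, scoring interface, and schedule are all assumptions, not the paper's actual procedure.

```python
import math
import random

def refresh_case(solution, score_fn, temperature=1.0, cooling=0.9,
                 steps=20, seed=0):
    """Simulated-annealing refresh of a failing case: perturb each solution
    parameter with Gaussian noise, accepting worse candidates with
    probability exp(delta / T) so the case can escape a bad region."""
    rng = random.Random(seed)
    current = dict(solution)
    best = dict(current)
    for _ in range(steps):
        candidate = {k: v + rng.gauss(0.0, temperature)
                     for k, v in current.items()}
        delta = score_fn(candidate) - score_fn(current)
        if delta > 0 or rng.random() < math.exp(delta / temperature):
            current = candidate
            if score_fn(current) > score_fn(best):
                best = dict(current)
        temperature *= cooling  # cool the schedule each step
    return best
```

The point of refreshing rather than deleting is that the case's problem description (and its link to the learning material) is preserved; only the solution parameters are re-explored.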

Implementation End-to-end ILMDA – Applet-based GUI front-end – CBR-powered agent – Backend database system ILMDA simulator

Simulator Consists of two distinct modules – Student Generator Creates virtual students Nine different student types based on aptitude and speed – Outcome Generator Simulates student interactions and outcomes

Student Generator Creates virtual students – Generates all student background values such as names, GPAs, interests, etc. – Generates the activity profile, such as average time spent per session and average number of mouse clicks, using a Gaussian distribution
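The Gaussian draw described above can be sketched as follows, with the nine student types coming from crossing three aptitude levels with three speed levels. Every numeric mean and standard deviation here is an illustrative assumption; the slides do not give the actual parameter values.

```python
import random

APTITUDES = {"high": 8, "medium": 12, "low": 18}       # mean clicks (assumed)
SPEEDS = {"fast": 30_000, "medium": 60_000, "slow": 120_000}  # mean ms (assumed)

def generate_student(rng, aptitude, speed):
    """Draw one virtual student's activity profile from Gaussians whose
    means depend on the student type; clamped at zero so averages stay valid."""
    return {
        "type": (aptitude, speed),
        "avg_session_time_ms": max(0.0, rng.gauss(SPEEDS[speed],
                                                  SPEEDS[speed] * 0.1)),
        "avg_mouse_clicks": max(0.0, rng.gauss(APTITUDES[aptitude], 2.0)),
    }

# Nine types: three aptitudes crossed with three speeds.
rng = random.Random(42)
cohort = [generate_student(rng, a, s) for a in APTITUDES for s in SPEEDS]
```

Seeding the generator makes simulation runs repeatable, which matters when comparing the learning and no-learning phases of the experiment.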

Outcome Generator Simulates student interactions and outcomes – Determines the time spent and the number of clicks for one learning material – Also determines whether a virtual student quits the learning material or answers it successfully

Simulation 900 students, 100 of each type – Step 1: 1000 iterations with no learning – Step 2: 100 iterations with learning – Step 3: 1000 iterations again with no learning Results – Between Steps 1 and 3, average problem scores increased from to – Between Steps 1 and 3, the number of examples given increased twofold

Future Work Deploy the ILMDA system in the introductory CS core course – Fall 2004 (done) – Spring 2005 (done) – Fall 2005 Add fault determination capability – Determine whether the students, the agent reasoning, or the content is at fault

Questions

Responses I Bloom’s Taxonomy (Cognitive) – Knowledge: Recall of data. – Comprehension: Understand the meaning, translation, interpolation, and interpretation of instructions and problems. State a problem in one’s own words. – Application: Use a concept in a new situation, or unprompted use of an abstraction. Apply what was learned in the classroom to novel situations in the workplace. – Analysis: Separate material or concepts into component parts so that their organizational structure may be understood. Distinguish between facts and inferences. – Synthesis: Build a structure or pattern from diverse elements. Put parts together to form a whole, with emphasis on creating a new meaning or structure. – Evaluation: Make judgments about the value of ideas or materials.

Responses II Outcome function (example or problem) – Ranges from 0 to 1 – Quitting at the tutorial or example results in an outcome of 0 – Otherwise, compare the student’s average clicks and times with those for the example or problem
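The outcome function described above can be sketched as follows. The slides only say the comparison is between the student's clicks/times and the case averages; the ratio-based similarity used here is an assumed stand-in for the actual formula, and all inputs are assumed positive.

```python
def outcome(quit_early, student_clicks, student_time_ms,
            case_avg_clicks, case_avg_time_ms):
    """Outcome score in [0, 1]: quitting at the tutorial or example yields 0;
    otherwise average two ratio similarities, one for clicks and one for
    time, each of which is 1.0 when the student matches the case average."""
    if quit_early:
        return 0.0
    click_sim = (min(student_clicks, case_avg_clicks) /
                 max(student_clicks, case_avg_clicks))
    time_sim = (min(student_time_ms, case_avg_time_ms) /
                max(student_time_ms, case_avg_time_ms))
    return (click_sim + time_sim) / 2.0
```

Bounding the score in [0, 1] lets the same function serve both as a case-quality signal for the CBR loop and as the success measure reported in the simulation.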