Chapter 8: Prediction Algorithms for Smart Environments


Chapter 8: Prediction Algorithms for Smart Environments (MavHome, University of Texas at Arlington)

Smart Environments
- Design and implementation require breadth and the integration of many disciplines
- Machine learning, human-machine interfaces, decision making
- Wireless networks, mobile communication, databases, sensor networks, pervasive computing

Benefits of Automation
- Convenience: turn off the coffee maker, warm up the car
- Conservation: manage heating and cooling, lawn watering
- Do actual work: order groceries, vacuum the carpet
- See "Bob's Day," pp. 175-176

Role of Prediction
- Goals of a smart environment: maximize comfort, minimize costs, adapt to the inhabitants
- To attain these goals: use tools from artificial intelligence, namely prediction and automatic decision making

Prediction
- Learn about devices by observation: at a given temperature, how long does it take to warm the house?
- Also learn about resource utilization: to cool the house, turn on the ceiling fan and/or close the blinds
- Predict the inhabitant's behavior
- Hardware: video, power meters, motion detectors, load sensors, device controllers, vital-sign monitors
- Software: prediction algorithms

Prediction Outcomes
- Determine the relevant features
- Make maximum use of the available information
- Minimal prediction errors
- Minimal delays (quick predictions)
- The prediction feeds the decision-making algorithm

Prediction Algorithms
- Note: smart environment development is not all hardware
- Prediction task: the process of forming a hypothesis about the future value of a target variable for a given data point
- Prediction algorithm: learns a function that maps known information collected from past and current observations to a future point in time

Prediction Algorithms
- Based on the sequential ordering of events, which is the input to the algorithm (possibly along with timing information)
- Historical information + current state => prediction
- Given the event sequence {x1, x2, x3, ..., xj}, what is event xj+1?
- Approaches: pattern matching over sequences, Markov decision processes, plan recognition

Sequence Matching - IPAM
- Just one example algorithm
- Collects sequential pairs and calculates the probability of transitioning from one event to the next
- Example: the sequence {a, b, c, b, c, b, a, a} yields the pairs (a,b), (b,c), (c,b), (b,c), (c,b), (b,a), (a,a)
- So p(a,b) = 1/7, p(b,c) = 2/7, p(a,c) = 0, etc. (see the pair-counting sketch below)
- The probabilities change over time and are kept in a table
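The pair-counting step can be illustrated with a minimal sketch (an illustration, not the algorithm's published implementation); the example sequence and probabilities match the slide above.

```python
from collections import Counter

def pair_probabilities(events):
    """Relative frequency of each adjacent (current, next) event pair."""
    pairs = list(zip(events, events[1:]))
    counts = Counter(pairs)
    total = len(pairs)
    return {pair: count / total for pair, count in counts.items()}

probs = pair_probabilities(list("abcbcbaa"))
print(probs[("a", "b")])             # 1/7
print(probs[("b", "c")])             # 2/7
print(probs.get(("a", "c"), 0.0))    # 0
```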

IPAM (continued)
- When a new event xj+1 is observed, p(xj, xj+1) is increased by a factor of (1 - α), for some constant α
- All other probabilities p(xj, z) are reduced by a factor of α
- This weights recent events more heavily (see the update sketch below)
- To predict, rank candidate events by the probability p(xj+1 | xj)
- Sequence-matching algorithms have been applied to UNIX command prediction
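A minimal sketch of the table update, assuming the standard IPAM-style rule in which the observed transition is boosted and every other transition out of the same event is decayed by α; the function and variable names are illustrative, not taken from the text.

```python
def ipam_update(table, prev_event, new_event, alpha=0.8):
    """After observing (prev_event, new_event), decay all transitions out of
    prev_event by alpha and boost the observed one by (1 - alpha)."""
    row = table.setdefault(prev_event, {})
    for event in row:
        row[event] *= alpha                                   # older evidence fades
    row[new_event] = row.get(new_event, 0.0) + (1 - alpha)    # recent evidence dominates

def predict_next(table, current_event):
    """Rank candidate next events by p(next | current_event)."""
    row = table.get(current_event, {})
    return sorted(row, key=row.get, reverse=True)

table = {}
for prev, nxt in zip("abcbcba", "bcbcbaa"):   # the example sequence from the earlier slide
    ipam_update(table, prev, nxt)
print(predict_next(table, "b"))               # most likely successors of 'b'
```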

Markov Decision Process (MDP)
- At each step, the agent perceives the environment state and selects an action
- A probability model plus a possible reward
- Only the last few states are used, unlike pattern matching
- Hidden Markov Models (HMM): observable vs. hidden states (Figure 8.3, pg. 179)
- Hidden states: the current task (sitting in a chair sleeping or reading) and health (feeling good or tired)
- Also probabilistic (a toy filtering sketch follows below)
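A toy filtering step for a hidden Markov model, in the spirit of Figure 8.3: the "task" is hidden and only a sensor reading is observed. Every number below is invented for illustration; this sketches the mechanics, not the book's model.

```python
import numpy as np

states = ["sleeping", "reading"]               # hidden task states
observations = ["light_off", "light_on"]       # what the sensors actually report

T = np.array([[0.8, 0.2],                      # P(next task | current task)
              [0.3, 0.7]])
E = np.array([[0.9, 0.1],                      # P(observation | task)
              [0.2, 0.8]])
belief = np.array([0.5, 0.5])                  # prior belief over the hidden task

def filter_step(belief, obs_index):
    """One forward step: predict the next state, then reweight by the observation."""
    predicted = belief @ T
    updated = predicted * E[:, obs_index]
    return updated / updated.sum()

belief = filter_step(belief, observations.index("light_on"))
print(dict(zip(states, belief.round(3))))      # the light being on favors "reading"
```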

Plan Recognition Prediction
- Given a known goal, recognize the possible plans to achieve it
- Example goal: cool the house. Plan 1: turn on the air conditioner. Plan 2: turn on the air conditioner and the ceiling fans
- Based on a belief network: a DAG whose nodes are random variables and whose edges indicate influence (Figure 8.4, pg. 180)
- Includes evidence to support plan generation (see the sketch below)
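A hand-rolled sketch of the idea, treating plan recognition as Bayesian inference over plans given observed evidence. Every probability below is invented purely for illustration, and the evidence variables are assumed independent given the plan; a real belief network would encode these dependencies explicitly.

```python
# Candidate plans for the goal "cool the house", with made-up priors and
# made-up probabilities of observing each piece of evidence under each plan.
plans = {
    "AC only":   {"prior": 0.6, "p_evidence": {"fan_on": 0.1, "blinds_closed": 0.3}},
    "AC + fans": {"prior": 0.4, "p_evidence": {"fan_on": 0.9, "blinds_closed": 0.3}},
}

def plan_posterior(evidence):
    """P(plan | evidence) proportional to P(plan | goal) * product of P(e | plan)."""
    scores = {}
    for name, plan in plans.items():
        likelihood = 1.0
        for e in evidence:
            likelihood *= plan["p_evidence"].get(e, 0.5)
        scores[name] = plan["prior"] * likelihood
    total = sum(scores.values())
    return {name: score / total for name, score in scores.items()}

print(plan_posterior(["fan_on"]))   # seeing the fan on shifts belief toward "AC + fans"
```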

Other Prediction Approaches
- Decision trees
- Neural networks
- Bayesian classifiers
- Nearest-neighbor algorithms
- Support vector machines

MavHome - Smart Home - UTA
- Designed as an intelligent agent
- Goal: maximize the comfort of the inhabitant and minimize the cost of running the home
- Predict, reason, adapt
- Figure 8.5, pg. 181: the intelligent agent
- Figure 8.6, pg. 182: the architecture, with 4 layers: decision, information, communication, physical

MavHome - 4 Layers
- Decision: selects actions for the agent to execute
- Information: collects information and generates inferences for decision making
- Communication: routes information and requests between agents
- Physical: contains the hardware (appliances, network, sensors, etc.)
- Bottom-up process (pg. 182); a schematic sketch follows below
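The bottom-up flow can be pictured as a single perceive-infer-decide cycle. The function and object names below are hypothetical, used only to show how a percept climbs the four layers.

```python
def bottom_up_cycle(sensors, router, predictor, policy):
    """One pass from raw hardware readings up to an action selection."""
    percept = sensors.read()                  # physical layer: raw device and sensor state
    message = router.route(percept)           # communication layer: move data between agents
    inference = predictor.infer(message)      # information layer: predictions and inferences
    return policy.select_action(inference)    # decision layer: choose the action to execute
```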

Identifying Events
- Need to identify repetitive tasks as candidates for automation
- Need to predict the next action
- MavHome bases prediction solely on previous interactions with devices plus the current state
- From prediction to decision: an algorithm selects the action to execute

Algorithms
- Repetition is modeled as a stationary stochastic process
- ED (Episode Discovery): identifies sequences of regular, repeatable actions that can be used for prediction, via data mining (a simplified mining sketch follows below)
- Active LeZi: uses sequence matching to predict the next action
- MDL (Minimum Description Length) principle: patterns are valued by how well a pointer to their description compresses the database
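A much-simplified episode-mining sketch: the real ED algorithm evaluates candidate episodes with the MDL principle, while this illustration just keeps contiguous subsequences that repeat. The event names and thresholds are invented.

```python
from collections import Counter

def mine_episodes(events, max_len=3, min_support=2):
    """Count every contiguous subsequence up to max_len and keep the ones that repeat."""
    counts = Counter()
    for length in range(2, max_len + 1):
        for i in range(len(events) - length + 1):
            counts[tuple(events[i:i + length])] += 1
    return {episode: count for episode, count in counts.items() if count >= min_support}

history = ["coffee", "lamp", "blinds", "coffee", "lamp", "tv"]
print(mine_episodes(history))   # {('coffee', 'lamp'): 2}
```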

Experiment with Algorithms
- 30 days of data, with noise
- ED discovered 6 significant episodes (pg. 184)
- How is this knowledge used? It provides an understanding of the nature of the home, the patterns are used in decision making, and it can improve prediction accuracy

Active LeZi
- Based on Ziv-Lempel compression (LZ78); good compression implies good prediction
- LZ78 is enhanced to improve prediction
- Calculates the probability of each action occurring next in the sequence and predicts the one with the highest probability (a much-simplified sketch follows below)
- Accuracy ≈ 48%, versus about 2% for random choice
- MavHome: see the figures on pg. 189+
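A much-simplified sketch of the two ingredients, assuming nothing beyond the slide: an LZ78 phrase parse of the history, and a prediction made by counting what followed the most recent context. Real Active LeZi blends predictions across all context lengths in its parse tree; this illustration uses a single fixed-length context.

```python
from collections import Counter

def lz78_phrases(sequence):
    """LZ78 parse: split the history into its incremental phrase dictionary."""
    seen, current, phrases = set(), "", []
    for symbol in sequence:
        current += symbol
        if current not in seen:
            seen.add(current)
            phrases.append(current)
            current = ""
    return phrases

def predict_from_context(sequence, context_len=2):
    """Predict the next symbol by counting what followed the latest context earlier on."""
    context = sequence[-context_len:]
    followers = Counter(
        sequence[i + context_len]
        for i in range(len(sequence) - context_len)
        if sequence[i:i + context_len] == context
    )
    return followers.most_common(1)[0][0] if followers else None

history = "abababcab"
print(lz78_phrases(history))          # ['a', 'b', 'ab', 'abc']
print(predict_from_context(history))  # 'a' followed "ab" most often in the history
```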

Conclusion
- Comfort: minimize the number of manual interactions with the environment
- This chapter gave an overview of prediction for smart environments