
# Intelligent Environments

Computer Science and Engineering, University of Texas at Arlington



## Slide 2: Prediction for Intelligent Environments

- Motivation
- Techniques
- Issues

## Slide 3: Motivation

An intelligent environment acquires and applies knowledge about you and your surroundings in order to improve your experience.

- "acquires" → prediction
- "applies" → decision making

## Slide 4: What to Predict

- Inhabitant behavior
  - Location
  - Task
  - Action
- Environment behavior
  - Modeling devices
  - Interactions

## Slide 5: Example

Where will Bob go next?

Location_t+1 = f(…)

Independent variables:
- Location_t, Location_t-1, …
- Time, date, day of the week
- Sensor data
- Context (Bob's task)

## Slide 6: Example (cont.)

| Time | Date  | Day     | Location_t  | Location_t+1 |
|------|-------|---------|-------------|--------------|
| 0630 | 02/25 | Monday  | Bedroom     | Bathroom     |
| 0700 | 02/25 | Monday  | Bathroom    | Kitchen      |
| 0730 | 02/25 | Monday  | Kitchen     | Garage       |
| 1730 | 02/25 | Monday  | Garage      | Kitchen      |
| 1800 | 02/25 | Monday  | Kitchen     | Bedroom      |
| 1810 | 02/25 | Monday  | Bedroom     | Living room  |
| 2200 | 02/25 | Monday  | Living room | Bathroom     |
| 2210 | 02/25 | Monday  | Bathroom    | Bedroom      |
| 0630 | 02/26 | Tuesday | Bedroom     | Bathroom     |

## Slide 7: Example (cont.)

Learned pattern:

If Day = Monday…Friday
and Time > 0600 and Time < 0700
and Location_t = Bedroom
Then Location_t+1 = Bathroom
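The learned pattern above can be sketched as a small prediction function; the function name and the `None` fallback are illustrative, not from the slides:

```python
# Hypothetical sketch of the learned rule: predict Bob's next
# location from the day, the time, and his current location.
def predict_next_location(day, time, location):
    """Return the predicted next location, or None if the rule does not fire."""
    weekdays = {"Monday", "Tuesday", "Wednesday", "Thursday", "Friday"}
    if day in weekdays and 600 < time < 700 and location == "Bedroom":
        return "Bathroom"
    return None
```

For example, `predict_next_location("Monday", 630, "Bedroom")` returns `"Bathroom"`, matching the first row of the slide-6 table.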

## Slide 8: Prediction Techniques

- Regression
- Neural network
- Nearest neighbor
- Bayesian classifier
- Decision tree induction
- Others

## Slide 9: Linear Regression

| x | y |
|---|---|
| 1 | 3 |
| 2 | 5 |
| 3 | 7 |
| 4 | 9 |
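Fitting a line to the table above by ordinary least squares recovers y = 2x + 1; this sketch uses plain Python rather than any particular library:

```python
# Ordinary least-squares fit of y = b0 + b1 * x to the slide's data.
def linear_fit(xs, ys):
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Slope: covariance of x and y over the variance of x.
    b1 = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
          / sum((x - mean_x) ** 2 for x in xs))
    b0 = mean_y - b1 * mean_x  # intercept
    return b0, b1

b0, b1 = linear_fit([1, 2, 3, 4], [3, 5, 7, 9])  # b0 = 1.0, b1 = 2.0
```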

## Slide 10: Multiple Regression

- n independent variables
- Find the coefficients b_i
- System of n equations and n unknowns
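A standard way to write the multiple-regression model and the least-squares solution for its coefficients b_i (the slide's own notation may have differed):

```latex
% Multiple linear regression model with n independent variables
\hat{y} = b_0 + b_1 x_1 + b_2 x_2 + \cdots + b_n x_n
% Least-squares coefficients via the normal equations
\mathbf{b} = (X^{\top} X)^{-1} X^{\top} \mathbf{y}
```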

## Slide 11: Regression

Pros:
- Fast, analytical solution
- Confidence intervals: y = a ± b with C% confidence
- Piecewise linear and nonlinear regression

Cons:
- Must choose the model beforehand (linear, quadratic, …)
- Numeric variables only

## Slide 12: Neural Networks

## Slide 13: Neural Networks

- ~10^10 neurons, ~10^5 synapses per neuron
- Synapses propagate electrochemical signals
- Number, placement, and strength of connections change over time (learning?)
- Massively parallel

## Slide 14: Computer vs. Human Brain

|                     | Computer                        | Human Brain                   |
|---------------------|---------------------------------|-------------------------------|
| Computational units | 1 CPU, 10^8 gates               | 10^11 neurons                 |
| Storage units       | 10^10 bits RAM, 10^12 bits disk | 10^11 neurons, 10^14 synapses |
| Cycle time          | 10^-9 sec                       | 10^-3 sec                     |
| Bandwidth           | 10^9 bits/sec                   | 10^14 bits/sec                |
| Neuron updates/sec  | 10^6                            | 10^14                         |

## Slide 15: Computer vs. Human Brain

"The Age of Spiritual Machines," Kurzweil.

## Slide 16: Artificial Neuron

## Slide 17: Artificial Neuron

Activation functions
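The slide's plot of activation functions did not survive transcription; these are common choices, sketched in code (step and sigmoid are the units used on the following slides, ReLU is a further assumption):

```python
import math

# Threshold (step) activation: fires iff the weighted input is positive.
def step(z):
    return 1 if z > 0 else 0

# Sigmoid activation: smooth, differentiable squashing to (0, 1).
def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Rectified linear activation: passes positive inputs, clips negatives.
def relu(z):
    return max(0.0, z)
```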

## Slide 18: Perceptron

## Slide 19: Perceptron Learning

## Slide 20: Perceptron

Learns only linearly-separable functions
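A minimal perceptron-learning sketch on a linearly separable function (logical AND); the 0.1 learning rate and 20-epoch budget are illustrative choices, not from the slides:

```python
# Perceptron learning rule: nudge the weights toward each
# misclassified example until the training set is learned.
def train_perceptron(data, epochs=20, lr=0.1):
    w = [0.0, 0.0]   # input weights
    b = 0.0          # bias weight
    for _ in range(epochs):
        for x, target in data:
            out = 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0
            err = target - out
            w[0] += lr * err * x[0]
            w[1] += lr * err * x[1]
            b += lr * err
    return w, b

# Logical AND is linearly separable, so the perceptron can learn it.
and_data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(and_data)
```

By contrast, XOR is not linearly separable, and no choice of weights would classify it correctly.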

## Slide 21: Sigmoid Unit

## Slide 22: Multilayer Network of Sigmoid Units

## Slide 23: Error Back-Propagation

- Errors at the output layer are propagated back to the hidden layers
- Error is proportional to link weights and activation
- Gradient descent in weight space
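The three bullets above can be traced in code with a toy 2-2-1 sigmoid network; all weights, the input, and the 0.5 learning rate are made-up values for illustration:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def forward(w_hid, w_out, x):
    # Hidden activations, then the single sigmoid output.
    h = [sigmoid(sum(wi * xi for wi, xi in zip(w, x))) for w in w_hid]
    o = sigmoid(sum(wi * hi for wi, hi in zip(w_out, h)))
    return h, o

def backprop_step(w_hid, w_out, x, target, lr=0.5):
    h, o = forward(w_hid, w_out, x)
    # Output-layer error term for squared error E = (target - o)^2 / 2.
    delta_o = (o - target) * o * (1 - o)
    # Hidden errors: the output error propagated back through the
    # link weights, scaled by each unit's sigmoid derivative.
    delta_h = [delta_o * w_out[j] * h[j] * (1 - h[j]) for j in range(len(h))]
    # Gradient descent in weight space.
    new_w_out = [w_out[j] - lr * delta_o * h[j] for j in range(len(h))]
    new_w_hid = [[w_hid[j][i] - lr * delta_h[j] * x[i] for i in range(len(x))]
                 for j in range(len(h))]
    return new_w_hid, new_w_out

w_hid = [[0.1, 0.2], [0.3, 0.4]]
w_out = [0.5, 0.6]
x, target = (1.0, 0.5), 1.0
err_before = 0.5 * (target - forward(w_hid, w_out, x)[1]) ** 2
w_hid, w_out = backprop_step(w_hid, w_out, x, target)
err_after = 0.5 * (target - forward(w_hid, w_out, x)[1]) ** 2
```

One gradient step reduces the squared error on this example (`err_after < err_before`); real training repeats this over many examples and epochs.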

## Slide 24: NN for Face Recognition

90% accurate at learning head pose for 20 different people.

## Slide 25: Neural Networks

Pros:
- General-purpose learner
- Fast prediction

Cons:
- Best for numeric inputs
- Slow training
- Local optima

## Slide 26: Nearest Neighbor

- Just store the training data (x_i, f(x_i))
- Given query x_q, estimate using the nearest neighbor x_k: f(x_q) = f(x_k)
- k-nearest neighbor: given query x_q, estimate using the majority (or mean) of the k nearest neighbors
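A sketch of the k-nearest-neighbor variant: store the training pairs, then classify a query by majority vote among the k closest points (Euclidean distance; the toy data is invented):

```python
from collections import Counter

# Classify x_q by majority vote among its k nearest training points.
def knn_predict(train, x_q, k=3):
    # Sort the stored (point, label) pairs by squared distance to the query.
    neighbors = sorted(
        train,
        key=lambda pair: sum((a - b) ** 2 for a, b in zip(pair[0], x_q)),
    )[:k]
    votes = Counter(label for _, label in neighbors)
    return votes.most_common(1)[0][0]

train = [((0, 0), "A"), ((0, 1), "A"), ((1, 0), "A"),
         ((5, 5), "B"), ((5, 6), "B"), ((6, 5), "B")]
```

Here `knn_predict(train, (0.2, 0.2))` returns `"A"` and `knn_predict(train, (5.5, 5.5))` returns `"B"`; with k = 1 this reduces to the plain nearest-neighbor rule on the slide.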

## Slide 27: Nearest Neighbor

## Slide 28: Nearest Neighbor

Pros:
- Fast training
- Handles complex target functions
- No loss of information

Cons:
- Slow at query time
- Easily fooled by irrelevant attributes

## Slide 29: Bayes Classifier

- Recall the Bob example
- D = training data
- h = a candidate rule (hypothesis)

## Slide 30: Naive Bayes Classifier

- Naive Bayes assumption: attribute values are conditionally independent given the class
- Naive Bayes classifier
- y represents Bob's location
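A counting-based naive Bayes sketch for the Bob example, predicting y (the next location) from the attributes (day, current location); the four training rows are a subset of the slide-6 table, and the function names are illustrative:

```python
from collections import Counter, defaultdict

# Estimate P(y) and the per-attribute P(x_i | y) from frequency counts.
def train_nb(examples):
    class_counts = Counter(y for _, y in examples)
    attr_counts = defaultdict(Counter)  # (attribute index, class) -> value counts
    for x, y in examples:
        for i, v in enumerate(x):
            attr_counts[(i, y)][v] += 1
    return class_counts, attr_counts

# Predict the class maximizing P(y) * product of P(x_i | y)
# (the naive Bayes conditional-independence assumption).
def predict_nb(model, x):
    class_counts, attr_counts = model
    total = sum(class_counts.values())
    best, best_score = None, -1.0
    for y, cy in class_counts.items():
        score = cy / total
        for i, v in enumerate(x):
            # Zero if this value was never seen with this class.
            score *= attr_counts[(i, y)][v] / cy
        if score > best_score:
            best, best_score = y, score
    return best

# Attributes: (day, current location); class: next location.
data = [(("Monday", "Bedroom"), "Bathroom"),
        (("Monday", "Bathroom"), "Kitchen"),
        (("Monday", "Kitchen"), "Garage"),
        (("Tuesday", "Bedroom"), "Bathroom")]
model = train_nb(data)
```

On this data, `predict_nb(model, ("Monday", "Bedroom"))` returns `"Bathroom"`. In practice the raw counts are usually smoothed to avoid zero probabilities.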

## Slide 31: Bayes Classifier

Pros:
- Optimal
- Discrete or numeric attribute values
- Naive Bayes is easy to compute

Cons:
- Full Bayes classifier is computationally intractable
- Naive Bayes assumption is usually violated

## Slide 32: Decision Tree Induction

- Day?
  - M…F → Time > 0600?
    - yes → Time < 0700?
      - yes → Location_t?
        - Bedroom → Bathroom
        - …
      - no → …
    - no → …
  - Sat → …
  - Sun → …

## Slide 33: Decision Tree Induction

Algorithm (main loop):
1. A = best attribute for the next node
2. Assign A as the attribute for the node
3. For each value of A, create a descendant node
4. Sort training examples to the descendants
5. If the training examples are perfectly classified, then stop; else iterate over the descendants

## Slide 34: Decision Tree Induction

Best attribute:
- Based on the information-theoretic concept of entropy
- Choose the attribute that most reduces entropy (~uncertainty) from the parent to the descendant nodes

Example: a parent node with 50 Bathroom and 50 Kitchen examples.
- Attribute A1 (values v1, v2) splits it into (Bathroom 0, Kitchen 50) and (Bathroom 50, Kitchen 0): both children are pure
- Attribute A2 (values v1, v2) splits it into (Bathroom 25, Kitchen 25) and (Bathroom 25, Kitchen 25): no uncertainty is removed
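The A1/A2 comparison above can be checked numerically: A1 yields pure children (full information gain of 1 bit), while A2 leaves the class distribution unchanged (zero gain). The function names here are illustrative:

```python
import math

# Entropy of a class distribution, in bits.
def entropy(counts):
    total = sum(counts)
    return -sum(c / total * math.log2(c / total) for c in counts if c > 0)

# Information gain: parent entropy minus the size-weighted
# average entropy of the child nodes after the split.
def info_gain(parent_counts, child_splits):
    total = sum(parent_counts)
    remainder = sum(sum(split) / total * entropy(split)
                    for split in child_splits)
    return entropy(parent_counts) - remainder

gain_a1 = info_gain([50, 50], [[0, 50], [50, 0]])    # 1.0 bit
gain_a2 = info_gain([50, 50], [[25, 25], [25, 25]])  # 0.0 bits
```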

## Slide 35: Decision Tree Induction

Pros:
- Understandable rules
- Fast learning and prediction

Cons:
- Replication problem
- Limited rule representation

## Slide 36: Other Prediction Methods

- Hidden Markov models
- Radial basis functions
- Support vector machines
- Genetic algorithms
- Relational learning

## Slide 37: Prediction Issues

- Representation of data and patterns
- Relevance of data
- Sensor fusion
- Amount of data

## Slide 38: Prediction Issues

- Evaluation
  - Accuracy
  - False positives vs. false negatives
- Concept drift
- Time-series prediction
- Distributed learning
