Smart Home Technologies


Presentation Transcript


  1. Smart Home Technologies: Data Mining and Prediction

  2. Objectives of Data Mining and Prediction • Large amounts of sensor data have to be “interpreted” to acquire knowledge about tasks that occur in the environment • Patterns in the data can be used to predict future events • Knowledge of tasks facilitates the automation of task components to improve the inhabitants’ experience

  3. Data Mining and Prediction • Data Mining attempts to extract patterns from the available data • Associative patterns What data attributes occur together? • Classification What indicates a given category? • Temporal patterns What sequences of events occur frequently?

  4. Example Patterns • Associative pattern When Bob is in the living room he likes to watch TV and eat popcorn with the light turned off. • Classification Action movie fans like to watch Terminator, drink beer, and have pizza. • Sequential patterns After coming out of the bedroom in the morning, Bob turns off the bedroom lights, then goes to the kitchen where he makes coffee, and then leaves the house.

  5. Data Mining and Prediction • Prediction attempts to form patterns that permit predicting the next event(s) from the available input data. • Deterministic predictions If Bob leaves the bedroom before 7:00 am on a workday, then he will make coffee in the kitchen. • Probabilistic sequence models If Bob turns on the TV in the evening, then 80% of the time he will go to the kitchen to make popcorn.

  6. Objective of Prediction in Intelligent Environments • Anticipate inhabitant actions • Detect unusual occurrences (anomalies) • Predict the right course of actions • Provide information for decision making • Automate repetitive tasks e.g.: prepare coffee in the morning, turn on lights • Eliminate unnecessary steps, improve sequences e.g.: use the weather forecast and external sensors to determine whether it is likely to rain and decide whether to water the lawn.

  7. What to Predict • Behavior of the Inhabitants • Location • Tasks / goals • Actions • Behavior of the Environment • Device behavior (e.g. heating, AC) • Interactions

  8. Example: Location Prediction • Where will Bob go next? • Location_{t+1} = f(x) • Input data x: • Location_t, Location_{t-1}, … • Time, date, day of the week • Sensor data

  9. Example: Location Prediction

  10. Example: Location Prediction • Learned pattern • If Day = Monday…Friday & Time > 0600 & Time < 0700 & Location_t = Bedroom Then Location_{t+1} = Bathroom

  11. Prediction Techniques • Classification-Based Approaches • Nearest Neighbor • Neural Networks • Bayesian Classifiers • Decision Trees • Sequential Behavior Modeling • Hidden Markov Models • Temporal Belief Networks

  12. Classification-Based Prediction • Problem • Input: State of the environment • Attributes of the current state inhabitant location, device status, etc. • Attributes of previous states • Output: Concept description • Concept indicates next event • Prediction has to be applicable to future examples

  13. Instance-Based Prediction: Nearest Neighbor • Use previous instances as a model for future instances • The prediction for the current instance is the classification of the most similar previously observed instance. • Instances with correct classifications (predictions) (x_i, f(x_i)) are stored • Given a new instance x_q, the prediction is that of the most similar stored instance x_k: f(x_q) = f(x_k)

  14. Example: Location Prediction

  15. Nearest Neighbor Example: Inhabitant Location • Training Instances (with concept): ((Bedroom, 6:30), Bathroom), ((Bathroom, 7:00), Kitchen), ((Kitchen, 7:30), Garage), ((Garage, 17:30), Kitchen), … • Similarity Metric: d((location1, time1), (location2, time2)) = 1000·(location1 ≠ location2) + |time1 − time2|, i.e. the location term contributes 1000 only when the locations differ • Query Instance: x_q = (Bedroom, 6:20) • Nearest Neighbor: x_k = (Bedroom, 6:30) with d(x_k, x_q) = 10 • Prediction f(x_k): Bathroom
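
A minimal Python sketch of this 1-NN predictor, assuming times are encoded as minutes since midnight and using the metric from the slide:

```python
# 1-nearest-neighbor location prediction (sketch).
training = [
    (("Bedroom", 6 * 60 + 30), "Bathroom"),
    (("Bathroom", 7 * 60), "Kitchen"),
    (("Kitchen", 7 * 60 + 30), "Garage"),
    (("Garage", 17 * 60 + 30), "Kitchen"),
]

def distance(a, b):
    # 1000 * (location1 != location2) + |time1 - time2|
    (loc_a, t_a), (loc_b, t_b) = a, b
    return 1000 * (loc_a != loc_b) + abs(t_a - t_b)

def predict(query):
    # f(x_q) = f(x_k): return the concept of the most similar instance.
    _, concept = min(training, key=lambda inst: distance(inst[0], query))
    return concept

print(predict(("Bedroom", 6 * 60 + 20)))  # Bathroom (nearest instance at d = 10)
```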

  16. Nearest Neighbor • Training instances and similarity metric form regions where a concept (prediction) applies. • Uncertain information and incorrect training instances lead to incorrect classifications

  17. k-Nearest Neighbor • Instead of using only the most similar instance, combine the k most similar instances • Given query x_q, estimate the concept (prediction) by the majority vote of the k nearest neighbors • Or, estimate the concept by picking the one with the highest sum of inverse distances: f(x_q) = argmax_v Σ_{i=1..k} δ(v, f(x_i)) / d(x_q, x_i)
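
A distance-weighted variant (sketch; reuses training and distance from the 1-NN example above, and the choice k = 3 is an assumption):

```python
from collections import defaultdict

def predict_weighted(query, k=3):
    # Score each concept by the sum of inverse distances of its supporters
    # among the k nearest neighbors, then return the best-scoring concept.
    neighbors = sorted(training, key=lambda inst: distance(inst[0], query))[:k]
    scores = defaultdict(float)
    for x, concept in neighbors:
        scores[concept] += 1.0 / (1.0 + distance(x, query))  # 1 + d avoids division by zero
    return max(scores, key=scores.get)
```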

  18. k-Nearest Neighbor Example • TV viewing preferences • Distance Function? • What are the important attributes? • How can they be compared?

  19. k-Nearest Neighbor Example • Distance function example: • Most important matching attribute: Show name • Second most important attribute: Time • Third most important attribute: Genre • Fourth most important attribute: Channel • Does he/she like to watch Nova?

  20. Nearest Neighbor • Advantages • Fast training (just store instances) • Complex target functions • No loss of information • Problems • Slow at query time (have to evaluate all instances) • Sensitive to correct choice of similarity metric • Easily fooled by irrelevant attributes

  21. Decision Trees • Use training instances to build a sequence of evaluations that determines the correct category (prediction): If Bob is in the Bedroom, then if the time is between 6:00 and 7:00, then Bob will go to the Bathroom, else … • The sequence of evaluations is represented as a tree whose leaves are labeled with the categories

  22. Decision Tree Induction • Algorithm (main loop) • A = best attribute for next node • Assign A as attribute for node • For each value of A, create descendant node • Sort training examples to descendants • If training examples perfectly classified, then Stop, else iterate over descendants
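
A compact sketch of this loop, assuming training examples are (attribute-dict, label) pairs and using the entropy criterion of the next slide to choose the best attribute:

```python
import math
from collections import Counter

def entropy(examples):
    # H(S) = -sum_i p_i * log2(p_i) over the label distribution of S.
    counts = Counter(label for _, label in examples)
    total = len(examples)
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

def best_attribute(examples, attributes):
    # Choose the attribute whose split leaves the least weighted entropy.
    def remainder(attr):
        values = {ex[attr] for ex, _ in examples}
        subsets = ([(ex, l) for ex, l in examples if ex[attr] == v] for v in values)
        return sum(len(s) / len(examples) * entropy(s) for s in subsets)
    return min(attributes, key=remainder)

def induce_tree(examples, attributes):
    labels = {label for _, label in examples}
    if len(labels) == 1:   # training examples perfectly classified: stop
        return labels.pop()
    if not attributes:     # no attributes left: fall back to the majority label
        return Counter(l for _, l in examples).most_common(1)[0][0]
    attr = best_attribute(examples, attributes)
    children = {}
    for value in {ex[attr] for ex, _ in examples}:
        subset = [(ex, l) for ex, l in examples if ex[attr] == value]
        children[value] = induce_tree(subset, [a for a in attributes if a != attr])
    return (attr, children)
```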

  23. Decision Tree Induction • Best attribute based on the information-theoretic concept of entropy • Choose the attribute that reduces the entropy (~uncertainty) most • [Figure: two candidate splits of 50 Bathroom / 50 Kitchen examples. Attribute A1 yields two branches with Bathroom (25) / Kitchen (25) each, so the uncertainty remains; attribute A2 yields Bathroom (50) / Kitchen (0) and Bathroom (0) / Kitchen (50), separating the categories perfectly.]
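
Reusing entropy from the sketch above, the slide's split works out as follows:

```python
# Parent node: 50 Bathroom / 50 Kitchen examples -> 1 bit of uncertainty.
parent = [({}, "Bathroom")] * 50 + [({}, "Kitchen")] * 50
print(entropy(parent))                                            # 1.0

# A1: each branch still holds a 25/25 mix, so no entropy is removed (gain 0).
print(entropy([({}, "Bathroom")] * 25 + [({}, "Kitchen")] * 25))  # 1.0

# A2: each branch is pure, so entropy drops to 0 (gain 1 bit) -> choose A2.
print(entropy([({}, "Bathroom")] * 50))                           # 0.0 (may print as -0.0)
```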

  24. Decision Tree Example: Inhabitant Location • [Figure: decision tree. The root tests Day (Sun / M…F / Sat); the weekday branch tests Time > 6:00 and then Time < 7:00 before branching on Location_t; leaves include Bedroom, Bathroom, and Living Room.]

  25. Example: Location Prediction

  26. Decision Trees • Advantages • Understandable rules • Fast learning and prediction • Lower memory requirements • Problems • Replication problem (each category requires multiple branches) • Limited rule representation (attributes are assumed to be locally independent) • Numeric attributes can lead to large branching factors

  27. Artificial Neural Networks • Use a numeric function to calculate the correct category. The function is learned from repeated presentation of the set of training instances, where each attribute value is translated into a number. • Neural networks are motivated by the functioning of neurons in the brain. • Functions are computed in a distributed fashion by a large number of simple computational units

  28. Neural Networks

  29. Computer vs. Human Brain

  30. Artificial Neurons • Artificial neurons are a much simplified computational model of neurons • Output: o = g(Σ_j w_j x_j), where g is the activation function • A function is learned by adjusting the weights w_j

  31. Artificial Neuron • Activation functions (e.g. the threshold function used by perceptrons, or the sigmoid of slide 35)

  32. Perceptrons • Perceptrons use a single unit with a threshold function to distinguish two categories

  33. Perceptron Learning • Weights are updated based on the training instances (x(i), f(x(i))) presented: w_j ← w_j + η (f(x) − o) x_j • The update adjusts the weights to move the output o closer to the desired target concept. • The learning rate η determines how fast to adjust the weights (too low requires many training steps; too high prevents learning).
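
A sketch of this rule in Python, assuming a constant bias input x[0] = 1 and illustrative data (logical AND, which is linearly separable):

```python
def step(z):
    # Threshold activation: fire iff the weighted sum reaches the threshold.
    return 1 if z >= 0 else 0

def train_perceptron(data, eta=0.1, epochs=50):
    w = [0.0] * len(data[0][0])
    for _ in range(epochs):
        for x, target in data:
            output = step(sum(wj * xj for wj, xj in zip(w, x)))
            # w_j <- w_j + eta * (target - output) * x_j
            w = [wj + eta * (target - output) * xj for wj, xj in zip(w, x)]
    return w

# x = (bias, a, b); the learned weights separate AND's single positive example.
and_data = [((1, 0, 0), 0), ((1, 0, 1), 0), ((1, 1, 0), 0), ((1, 1, 1), 1)]
weights = train_perceptron(and_data)
```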

  34. Limitation of Perceptrons • Learns only linearly separable functions • E.g. XOR cannot be learned

  35. Feedforward Networks with Sigmoid Units • Networks of units with sigmoid activation functions can learn arbitrary functions

  36. Feedforward Networks with Sigmoid Units • General networks permit arbitrary state-based categories (predictions) to be learned

  37. Learning in Multi-Layer Networks: Error Back-Propagation • As in Perceptrons, differences between the output of the network and the target concept are propagated back to the input weights. • Output errors for hidden units are computed based on the propagated errors for the inputs of the output units. • Weight updates correspond to gradient descent on the output error function in weight space.
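
A minimal back-propagation sketch with one hidden layer of sigmoid units (the network size, learning rate, and XOR training data are assumptions; convergence depends on the random initialization):

```python
import math, random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

random.seed(1)
# 2 inputs + bias -> 3 hidden sigmoid units -> 1 sigmoid output unit
w_hid = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(3)]
w_out = [random.uniform(-1, 1) for _ in range(4)]
xor_data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]
eta = 0.5

for _ in range(10000):
    for (x1, x2), target in xor_data:
        x = (x1, x2, 1.0)   # bias input
        h = [sigmoid(sum(w * xi for w, xi in zip(row, x))) for row in w_hid]
        hb = h + [1.0]      # hidden activations plus bias
        o = sigmoid(sum(w * hi for w, hi in zip(w_out, hb)))
        # Output error, propagated back to compute hidden-unit errors.
        d_o = (target - o) * o * (1 - o)
        d_h = [hj * (1 - hj) * w_out[j] * d_o for j, hj in enumerate(h)]
        # Gradient-descent weight updates on the squared output error.
        w_out = [w + eta * d_o * hi for w, hi in zip(w_out, hb)]
        w_hid = [[w + eta * d_h[j] * xi for w, xi in zip(row, x)]
                 for j, row in enumerate(w_hid)]
```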

  38. Neural Network Examples • Prediction • Predict steering commands in cars • Modeling of device behavior • Face and object recognition • Pose estimation • Decision and Control • Heating and AC control • Light control • Automated vehicles

  39. Neural Network Example: Prediction of Lighting • University of Colorado Adaptive Home [DLRM94] • Neural network learns to predict the light level after a set of lights are changed • Input: • The current light device levels (7 inputs) • The current light sensor levels (4 inputs) • The new light device levels (7 inputs) • Output: • The new light sensor levels (4 outputs) [DLRM94] Dodier, R. H., Lukianow, D., Ries, J., & Mozer, M. C. (1994). A comparison of neural net and conventional techniques for lighting control. Applied Mathematics and Computer Science, 4, 447-462.

  40. Neural Networks • Advantages • General purpose learner (can learn arbitrary categories) • Fast prediction • Problems • All inputs have to be translated into numeric inputs • Slow training • Learning might result in a local optimum

  41. Bayes Classifier • Use Bayesian probabilities to determine the most likely next event for a given instance, given all the training data. • Conditional probabilities are determined from the training data.

  42. Naive Bayes Classifier • The Bayes classifier requires estimating P(x|f) for all x and f by counting occurrences in the training data. • Generally too complex for large systems • The Naive Bayes classifier assumes that the attributes are statistically independent given the class: P(x|f) = Π_j P(x_j|f)
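
A counting-based sketch, assuming discrete attribute tuples x and Laplace smoothing so unseen attribute values do not zero out the product:

```python
from collections import Counter, defaultdict

def train_nb(examples):
    # Estimate P(f) and P(x_j | f) by counting training occurrences.
    class_counts = Counter(f for _, f in examples)
    cond = defaultdict(Counter)   # (attribute index j, class f) -> value counts
    vocab = defaultdict(set)      # attribute index j -> set of observed values
    for x, f in examples:
        for j, value in enumerate(x):
            cond[(j, f)][value] += 1
            vocab[j].add(value)
    return class_counts, cond, vocab

def predict_nb(x, class_counts, cond, vocab):
    total = sum(class_counts.values())
    def score(f):
        # P(f) * prod_j P(x_j | f), with +1 (Laplace) smoothing.
        p = class_counts[f] / total
        for j, value in enumerate(x):
            p *= (cond[(j, f)][value] + 1) / (class_counts[f] + len(vocab[j]))
        return p
    return max(class_counts, key=score)
```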

  43. Bayes Classifier • Advantages • Yields optimal prediction (given the assumptions) • Can handle discrete or numeric attribute values • Naive Bayes classifier easy to compute • Problems • Optimal Bayes classifier computationally intractable • Naive Bayes assumption usually violated

  44. Bayesian Networks • Bayesian networks explicitly represent the dependence and independence of various attributes. • Attributes are modeled as nodes in a network and links represent conditional probabilities. • Network forms a causal model of the attributes • Prediction can be included as an additional node. • Probabilities in Bayesian networks can be calculated efficiently using analytical or statistical inference techniques.

  45. Bayesian Network Example: Location Prediction • [Figure: network with nodes Day, Time, Get ready, Room, and Prediction.] • All state attributes are represented as nodes. • Nodes can include attributes that are not observable (e.g. Get ready). • The network yields predictions such as P(Bathroom | Room, Get ready).
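
How such a network is queried, shown as a toy enumeration over the unobservable Get-ready node (the structure follows the slide, but every probability value here is hypothetical):

```python
# P(Prediction = Bathroom | Room = Bedroom)
#   = sum over g of P(GetReady = g) * P(Bathroom | Bedroom, GetReady = g)
p_get_ready = {True: 0.7, False: 0.3}    # hypothetical prior on Get ready
p_bathroom = {("Bedroom", True): 0.9,    # hypothetical conditional table
              ("Bedroom", False): 0.2}

answer = sum(p_get_ready[g] * p_bathroom[("Bedroom", g)] for g in (True, False))
print(answer)  # 0.69
```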

  46. Bayesian Networks • Advantages • Efficient inference mechanism • Readable structure • For many problems relatively easy to design by hand • Mechanisms for learning network structure exist • Problems • Building network automatically is complex • Does not handle sequence information

  47. Sequential Behavior Prediction • Problem • Input: A sequence of states or events • States can be represented by their attributes inhabitant location, device status, etc. • Events can be raw observations Sensor readings, inhabitant input, etc. • Output: Predicted next event • Model of behavior has to be built based on past instances and be usable for future predictions.

  48. Sequence Prediction Techniques • String matching algorithms • Deterministic best match • Probabilistic matching • Markov Models • Markov Chains • Hidden Markov Models • Dynamic Belief Networks

  49. String-Based Prediction • Use the string of previous events or states to find a part that matches the current history. • The prediction is either the event that followed the best (longest) matching string, or the most likely event to follow strings partially matching the history. • Issues: • How to determine the quality of a match? • How can such a predictor be represented efficiently if the previous event string is long?

  50. Example System: IPAM [DH98] • Predict UNIX commands issued by a user • Calculate p(x_t | x_{t-1}) based on frequency • Update the current p(Predicted | x_{t-1}) by α • Update the current p(Observed | x_{t-1}) by 1 − α • Weights more recent events more heavily • Data • 77 users, 2-6 months, >168,000 commands • Accuracy less than 40% for one guess, but better than a Naïve Bayes classifier [DH98] B. D. Davison and H. Hirsh. Probabilistic Online Action Prediction. Intelligent Environments: Papers from the AAAI 1998 Spring Symposium, Technical Report SS-98-02, pp. 148-154. AAAI Press.
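
A sketch of this update scheme (the decay value α = 0.8 and the command stream are assumptions):

```python
from collections import defaultdict

alpha = 0.8
prob = defaultdict(lambda: defaultdict(float))  # prob[prev][next] -> estimate

def update(prev, observed):
    # Decay all estimates conditioned on prev, then boost the observed event,
    # so recent behavior is weighted more heavily than old behavior.
    for nxt in prob[prev]:
        prob[prev][nxt] *= alpha
    prob[prev][observed] += 1 - alpha

def predict(prev):
    table = prob[prev]
    return max(table, key=table.get) if table else None

for prev, nxt in [("ls", "cd"), ("cd", "ls"), ("ls", "cd"), ("ls", "vi")]:
    print(predict(prev), "then observed", nxt)
    update(prev, nxt)
```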
