
Learning to Detect Events with Markov-Modulated Poisson Processes

Presentation Transcript


  1. Ihler, Hutchins and Smyth (2007) Learning to Detect Events with Markov-Modulated Poisson Processes

  2. Outline • Problem: Finding unusual activity (events) in rhythms of natural human activity • Method: • Unsupervised learning • Time-varying Poisson process modulated by a hidden Markov process (events) • Bayesian framework for parameter learning

  3. Why is it hard? • Chicken-and-egg problem: where do we start? • Previous approach (baseline): a simple threshold model, which has severe limitations • Need to quantify the notion of unusual activity • How unusual is a measurement? • How persistent is a deviating measurement?

  4. The Data Sets • 2 data sets used • Building data • Counts of people entering and exiting a building • 15 weeks of data • 30 minute time bins • 29 known events in the 15 weeks • Freeway Traffic data • Vehicle counts on a freeway on-ramp • 6 months of data • 5 minute time bins • 78 known events in the 6 months

  5. Building Data • Example day

  6. Building Data • Example week

  7. Freeway Traffic Data • Example day

  8. Freeway Traffic Data • Example week

  9. A naïve Poisson model • Is the data actually Poisson? • In a Poisson distribution the mean equals the variance • Is this the case in our data?
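
A quick way to check this, sketched below under assumed column names (`count`, `time_of_week_bin`) rather than the paper's actual data format: group the counts by time-of-week bin and compare the per-bin mean and variance; a variance well above the mean indicates over-dispersion relative to a plain Poisson model.

```python
import pandas as pd

def mean_variance_check(df, count_col="count", bin_col="time_of_week_bin"):
    """Compare per-bin mean and variance of the observed counts."""
    grouped = df.groupby(bin_col)[count_col]
    stats = pd.DataFrame({"mean": grouped.mean(), "variance": grouped.var()})
    # For a true Poisson the ratio is ~1; values well above 1 mean over-dispersion.
    stats["dispersion_ratio"] = stats["variance"] / stats["mean"]
    return stats
```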

  10. A Baseline Model • Use a simple threshold approach • We say there is an event if • P(N;λ) < ε
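
A minimal sketch of this baseline (the rate estimates and the threshold value below are placeholders, not values from the paper): flag a time bin as an event whenever the Poisson probability of the observed count falls below ε.

```python
from scipy.stats import poisson

def threshold_detector(counts, rates, epsilon=1e-3):
    """Flag bins where P(N(t); lambda(t)) < epsilon.

    counts: observed counts N(t); rates: baseline Poisson rates lambda(t),
    e.g. historical averages for each time-of-week bin.
    """
    p = poisson.pmf(counts, rates)   # probability of the observed count
    return p < epsilon               # True where an "event" is declared
```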

  11. Problems with this Approach • Hard to detect sustained small variation • Hard to capture event duration • Chicken and egg problem

  12. The model (1) • The observed count N(t) is modelled as the sum of a normal (baseline) count N0(t) and an event count NE(t) • Assuming the processes are additive ...which is a fair assumption
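
A toy simulation of this additive structure (all rates and the event window below are made up for illustration): baseline counts N0(t) come from a time-varying Poisson rate, event counts NE(t) are added on top while an event is active, and only the sum N(t) is observed.

```python
import numpy as np

rng = np.random.default_rng(0)
T = 48                                                     # one day of 30-minute bins
lam = 5 + 15 * np.exp(-((np.arange(T) - 18) ** 2) / 20.0)  # toy daily rate profile
event_on = np.zeros(T, dtype=bool)
event_on[30:36] = True                                     # a hypothetical 3-hour event

N0 = rng.poisson(lam)                                      # normal (baseline) counts
NE = np.where(event_on, rng.poisson(8.0, size=T), 0)       # extra counts during the event
N = N0 + NE                                                # what we actually observe
```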

  13. The model (2)

  14. What is a Markov Process? • Illustrated with a two-state chain (A = Rainy, B = Sunny) and the transition probabilities between the states (the 0.1 and 0.5 in the diagram)

  15. Modelling Events with a Markov Process • We define a three-state Markov chain • z(t) is the state at time t; the 3 possible states are • 0 if there is no event • +1 if there is a positive event • -1 if there is a negative event • With a transition matrix Mz governing the state changes (a toy sketch of such a chain follows below)
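
The transition probabilities below are illustrative, not the learned values from the paper; they simply encode that events are rare and tend to persist once started.

```python
import numpy as np

states = [0, +1, -1]                 # no event, positive event, negative event
M = np.array([
    [0.98, 0.01, 0.01],              # from "no event"
    [0.10, 0.90, 0.00],              # from "positive event"
    [0.10, 0.00, 0.90],              # from "negative event"
])

def sample_chain(M, T, rng=np.random.default_rng(0)):
    """Draw a state sequence z(1..T) from the chain with transition matrix M."""
    z = np.zeros(T, dtype=int)       # state indices; start in "no event"
    for t in range(1, T):
        z[t] = rng.choice(len(states), p=M[z[t - 1]])
    return np.array(states)[z]
```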

  16. Details of the Markov Process • We give each row in the transition matrix a Dirichlet prior • Given z(t), we can model NE(t) as a Poisson with rate γ(t). We give this a Gamma prior Γ(γ;aE,bE), which is independent of t • We can then marginalize γ(t) out; the resulting marginal distribution of NE(t) is negative binomial
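
A sketch of that marginalization, assuming the shape/rate parameterization Γ(γ; aE, bE): integrating the Poisson rate γ(t) against its Gamma prior gives a negative binomial marginal for NE(t), which scipy parameterizes as shown below.

```python
from scipy.stats import nbinom

def event_count_marginal_pmf(k, aE, bE):
    """P(NE(t) = k) after integrating out gamma(t) ~ Gamma(aE, bE)
    from NE(t) | gamma(t) ~ Poisson(gamma(t))."""
    # Poisson-Gamma mixture = negative binomial with n = aE, p = bE / (bE + 1)
    return nbinom.pmf(k, aE, bE / (bE + 1.0))
```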

  17. Graphical Model of the Dependencies

  18. Learning the parameters • If we are given the hidden variables N0(t), NE(t) and z(t), we can: • compute MAP estimates • draw posterior samples of the parameters λ(t) and Mz • So, we can use MCMC; iterate between sampling from the hidden variables (given the parameters), and the parameters (given the variables)
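
The outer loop might look like the sketch below; the two conditional samplers are passed in as callables (they correspond to the steps on the next two slides), so this is only the skeleton of the iteration, not the paper's implementation.

```python
def gibbs_sampler(data, init_params, sample_hidden, sample_parameters, n_iter=1000):
    """Alternate between sampling hidden variables and parameters (Gibbs/MCMC)."""
    params = init_params
    draws = []
    for _ in range(n_iter):
        hidden = sample_hidden(data, params)       # z(t), N0(t), NE(t) given parameters
        params = sample_parameters(data, hidden)   # lambda(t), M_z, ... given hidden vars
        draws.append((hidden, params))
    return draws
```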

  19. Sampling the hidden variables, given the parameters Rough outline: • First, use the forward-backward algorithm [Baum et al. 1970] to sample z(t) • Then, given z(t), determine N0(t) and NE(t) by sampling
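
A generic forward filtering / backward sampling routine for the first step (a sketch that assumes per-bin observation likelihoods lik[t, s] = P(N(t) | z(t) = s) have already been computed; not code from the paper):

```python
import numpy as np

def ffbs(M, lik, pi0, rng=np.random.default_rng(0)):
    """Sample a state path z(1..T) given transitions M, likelihoods lik, initial dist pi0."""
    T, S = lik.shape
    alpha = np.zeros((T, S))
    alpha[0] = pi0 * lik[0]
    alpha[0] /= alpha[0].sum()
    for t in range(1, T):                       # forward filtering
        alpha[t] = (alpha[t - 1] @ M) * lik[t]
        alpha[t] /= alpha[t].sum()
    z = np.zeros(T, dtype=int)
    z[-1] = rng.choice(S, p=alpha[-1])          # backward sampling
    for t in range(T - 2, -1, -1):
        w = alpha[t] * M[:, z[t + 1]]
        z[t] = rng.choice(S, p=w / w.sum())
    return z
```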

  20. Sampling the parameters, given the hidden variables • The conjugate prior distributions give us a straightforward way to compute the posteriors • Use the sufficient statistics of the data as (updating) parameters for the posterior:
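
For example (a hedged sketch of the standard conjugate updates, with hypothetical function names): a Gamma prior on a Poisson rate is updated by adding the total count and the number of observations, and a Dirichlet prior on a transition-matrix row by adding the observed transition counts.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_rate(a, b, total_count, n_obs):
    """Posterior draw of a Poisson rate: Gamma(a + sum of counts, b + number of bins)."""
    return rng.gamma(a + total_count, 1.0 / (b + n_obs))   # numpy uses shape/scale

def sample_transition_row(alpha, transition_counts):
    """Posterior draw of one row of the transition matrix: Dirichlet(alpha + counts)."""
    return rng.dirichlet(alpha + transition_counts)
```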

  21. Prior distributions of zij and γ(t) • Markov-modulated Poisson processes are sensitive to selection of priors for zij and γ(t) • For the domains of these models, we often have strong ideas on e.g. what constitutes a “rare” event • Use these ideas to build strong priors in the model in order to avoid overfitting, and to adjust threshold levels of event detection

  22. Calculating Results • We are looking to detect unusual events; we can use our model to do this by calculating the posterior probability of an event at each time, P(z(t) ≠ 0 | data) • We can then compare our predictions with the known event occurrences
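
One simple way to compute this from the MCMC output (a sketch, not the paper's code): average, over the retained samples, an indicator of whether an event state was active in each time bin.

```python
import numpy as np

def event_posterior(z_samples):
    """z_samples: array of shape (n_samples, T) with values in {0, +1, -1}."""
    z_samples = np.asarray(z_samples)
    return (z_samples != 0).mean(axis=0)   # per-bin posterior probability of an event
```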

  23. Example Posterior Predictions (1)

  24. Example Posterior Predictions (2)

  25. Example Posterior Predictions (3)

  26. Comparison of Predicted Events with Known Events

  27. Other Possible Inferences • The model can be modified to test the degree of heterogeneity of the time process. We can ask questions like • are all week days essentially the same? • are all afternoons essentially the same? • We can estimate event attendance

  28. Conclusion • Model is much more effective than the threshold approach • Good detection rate • Difficult to assess the false positive rate • Possibilities for extension

  29. Questions
