
A Sparsification Approach for Temporal Graphical Model Decomposition




Presentation Transcript


  1. A Sparsification Approach for Temporal Graphical Model Decomposition. Ning Ruan, Kent State University. Joint work with Ruoming Jin (KSU), Victor Lee (KSU), and Kun Huang (OSU)

  2. Motivation: Financial Markets

  3. Motivation: Biological Systems (figure: fluorescence counts, protein-protein interactions, microarray time series profiles)

  4. Vector Autoregression • Univariate autoregression is self-regression of a single time series • VAR is the multivariate extension of autoregression (figure: time axis t = 0, 1, 2, 3, 4, …, T)
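The VAR model on this slide can be sketched in a few lines: a first-order VAR writes each observation as a linear function of the previous one, x_t = Φ x_{t-1} + noise, and Φ can be estimated by least squares. This is a minimal illustrative sketch (the helper names `simulate_var` and `fit_var` are not from the talk):

```python
import numpy as np

def simulate_var(phi, x0, steps, rng, noise=0.01):
    """Generate a VAR(1) series x_t = phi @ x_{t-1} + noise."""
    xs = [x0]
    for _ in range(steps):
        xs.append(phi @ xs[-1] + noise * rng.standard_normal(len(x0)))
    return np.array(xs)

def fit_var(series):
    """Least-squares estimate of phi from stacked lagged regressions."""
    X, Y = series[:-1], series[1:]          # lagged predictors / targets
    B, *_ = np.linalg.lstsq(X, Y, rcond=None)   # solves X @ B = Y
    return B.T                              # row i predicts series i

rng = np.random.default_rng(0)
phi_true = np.array([[0.5, 0.2], [0.0, 0.7]])
series = simulate_var(phi_true, np.ones(2), 500, rng)
phi_hat = fit_var(series)
```

With enough samples and small noise, `phi_hat` recovers the true coefficient matrix closely.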

  5. Granger Causality • Goal: reveal the causal relationship between two univariate time series. • Y is Granger-causal for X at time t if X_{t-1} and Y_{t-1} together are a better predictor of X_t than X_{t-1} alone. • i.e., compare the magnitude of the errors ε(t) vs. ε′(t)
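The slide's error comparison can be made concrete: regress X_t on X_{t-1} alone, then on X_{t-1} and Y_{t-1}, and compare mean squared errors. A hedged sketch with synthetic data in which Y truly drives X (the helper `lagged_mse` is illustrative, not from the talk):

```python
import numpy as np

rng = np.random.default_rng(1)
T = 1000
y = rng.standard_normal(T)
x = np.zeros(T)
for t in range(1, T):
    # Y's lag genuinely influences X, so Y should Granger-cause X.
    x[t] = 0.3 * x[t - 1] + 0.8 * y[t - 1] + 0.1 * rng.standard_normal()

def lagged_mse(target, predictors):
    """MSE of a least-squares regression of target on the given lags."""
    A = np.column_stack(predictors)
    coef, *_ = np.linalg.lstsq(A, target, rcond=None)
    return float(np.mean((target - A @ coef) ** 2))

err_restricted = lagged_mse(x[1:], [x[:-1]])        # epsilon: X's own lag only
err_full = lagged_mse(x[1:], [x[:-1], y[:-1]])      # epsilon': add Y's lag
```

Here `err_full` is far smaller than `err_restricted`, which is exactly the evidence the Granger test looks for.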

  6. Temporal Graphical Modeling • Recover the causal structure among a group of relevant time series (figure: temporal graphical model with directed edges such as Φ12 among series X1–X8)

  7. The Problem • Given a temporal graphical model, can we decompose it to get a simpler global view of the interactions among the relevant time series? How should we interpret these causal relationships?

  8. Extra Benefit (figure: coefficient matrix over X1–X8 reordered into block structure) • Clustering based on similarity • Consider time series clustering from a new perspective!

  9. Clustered Regression Coefficient Matrix • Vector autoregression model: Φ(u) is an N×N coefficient matrix • Clustered regression coefficient matrix: • if Φ(u)_ij ≠ 0, then time series i and j are in the same cluster • if time series i and j are not in the same cluster, then Φ(u)_ij = 0 (figure: block-diagonal submatrices)
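The clustering constraint above amounts to forcing Φ(u) to be block-diagonal under some ordering: entries linking series in different clusters are zeroed. A small sketch of that masking step, with an illustrative cluster assignment:

```python
import numpy as np

def cluster_mask(labels):
    """Boolean N x N mask, True where series i and j share a cluster."""
    labels = np.asarray(labels)
    return labels[:, None] == labels[None, :]

labels = [0, 0, 1, 1]                    # two illustrative clusters
phi = np.arange(16, dtype=float).reshape(4, 4)
# Enforce the slide's rule: Phi_ij = 0 across clusters.
phi_clustered = np.where(cluster_mask(labels), phi, 0.0)
```

Within-cluster coefficients survive; every cross-cluster coefficient is set to zero.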

  10. Temporal Graphical Model Decomposition • Goal: preserve prediction accuracy while reducing representation cost • Given a temporal graphical model, the cost for model decomposition is the prediction error plus an L2 penalty • Problem: this cost tends to group all time series into one cluster

  11. Refined Cost for Decomposition • Balance the sizes of clusters • C is an N×K membership matrix • Overall cost is the sum of three parts: prediction error, L2 penalty, and size constraint • Optimal Decomposition Problem: find a cluster membership matrix C and its regression coefficient matrix Φ such that the cost for decomposition is minimal
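The three-part cost can be sketched numerically. This is an illustrative stand-in, not the paper's exact objective: the balance term here is simply the sum of squared cluster sizes (minimized when clusters are equal), and `lam`, `mu` are made-up weights.

```python
import numpy as np

def decomposition_cost(X, phi, C, lam=0.1, mu=0.01):
    """X: T x N series, phi: N x N coefficients, C: N x K membership."""
    pred = X[:-1] @ phi.T                   # one-step VAR(1) predictions
    err = np.sum((X[1:] - pred) ** 2)       # prediction error
    l2 = lam * np.sum(phi ** 2)             # L2 penalty on coefficients
    sizes = C.sum(axis=0)                   # cluster sizes
    balance = mu * np.sum(sizes ** 2)       # penalizes unbalanced clusters
    return err + l2 + balance

X = np.ones((3, 4))
phi = np.eye(4)
C_bal = np.array([[1, 0], [1, 0], [0, 1], [0, 1]], dtype=float)   # sizes 2, 2
C_unbal = np.array([[1, 0], [1, 0], [1, 0], [0, 1]], dtype=float) # sizes 3, 1
cost_bal = decomposition_cost(X, phi, C_bal)
cost_unbal = decomposition_cost(X, phi, C_unbal)
```

With everything else equal, the balanced membership matrix yields the lower cost, which is the point of adding the size constraint.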

  12. Hardness of Decomposition Problem • Combined integer (membership matrix) and numerical (regression coefficient matrix) optimization problem • Large number of unknown variables • NxK variables in membership matrix • NxN variables in regression coefficient matrix

  13. Basic Idea for Iterative Optimization Algorithm • Relax the binary membership matrix C to a probabilistic membership matrix P • Optimize the membership matrix while fixing the regression coefficient matrix • Optimize the regression coefficient matrix while fixing the membership matrix • Apply the two optimization steps iteratively to reach a locally optimal solution
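The alternating structure above (fix one block of variables, solve for the other, repeat until a local optimum) can be illustrated on a toy biconvex problem. This sketch minimizes ||Y − PQ||² over P and Q alternately; it mirrors the two-step structure, not the paper's actual subproblems:

```python
import numpy as np

rng = np.random.default_rng(2)
Y = rng.standard_normal((8, 2)) @ rng.standard_normal((2, 6))  # rank-2 target

P = rng.standard_normal((8, 2))
Q = rng.standard_normal((2, 6))
losses = []
for _ in range(50):
    # Step 1: optimize Q with P fixed (exact least-squares solve).
    Q, *_ = np.linalg.lstsq(P, Y, rcond=None)
    # Step 2: optimize P with Q fixed.
    P = np.linalg.lstsq(Q.T, Y.T, rcond=None)[0].T
    losses.append(float(np.sum((Y - P @ Q) ** 2)))
```

Because each step exactly minimizes over one block, the loss never increases, and on this exactly rank-2 target it converges to (numerically) zero.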

  14. Overview of Iterative Optimization Algorithm (flow: time series data → temporal graphical model → Step 1: optimize cluster membership matrix via a Quasi-Newton method → Step 2: optimize regression coefficient matrix via generalized ridge regression)

  15. Step 1: Optimize Membership Matrix • Apply the Lagrange multiplier method • Use a Quasi-Newton method: approximate the Hessian matrix by iterative updates

  16. Step 2: Optimize Regression Coefficient Matrix • Decompose the cost function into N subfunctions (remaining terms are constant) • Generalized ridge regression • y_k is a vector related to P and X (length L) • X_k is a matrix related to P and X (size L×N) • For k = 1 this reduces to traditional ridge regression
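The ridge subproblem on this slide has a standard closed form: coefficients solve (XᵀX + λI)⁻¹ Xᵀy. A minimal sketch with synthetic data (variable names follow the slide loosely; the data are made up):

```python
import numpy as np

def ridge(Xk, yk, lam):
    """Closed-form ridge solution (X^T X + lam I)^{-1} X^T y."""
    n = Xk.shape[1]
    return np.linalg.solve(Xk.T @ Xk + lam * np.eye(n), Xk.T @ yk)

rng = np.random.default_rng(3)
Xk = rng.standard_normal((100, 5))           # stands in for the L x N matrix
w_true = np.array([1.0, -2.0, 0.5, 0.0, 3.0])
yk = Xk @ w_true + 0.01 * rng.standard_normal(100)
w_hat = ridge(Xk, yk, lam=1e-3)
```

With a small penalty and low noise the estimate is close to the generating coefficients; larger λ shrinks them toward zero.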

  17. Complexity Analysis (figure: per-iteration costs of updating the Hessian matrix and computing the coefficient matrix, in terms of N, N×K, and N×K+N) • Step 1 is the computational bottleneck of the entire algorithm

  18. Basic Idea for Scalable Approach • Utilize the variable dependence relationships to optimize each variable (or a small number of variables) independently, assuming the other relationships are fixed • Convert the problem to a Maximum Weight Independent Set (MWIS) problem
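For intuition about the target problem, here is a simple greedy heuristic for maximum-weight independent set: repeatedly take the heaviest unblocked vertex and block its neighbors. This is only a generic MWIS sketch; the paper's actual reduction and solver are not reproduced here.

```python
def greedy_mwis(weights, edges):
    """Greedy MWIS: pick vertices by descending weight, skip neighbors."""
    adj = {v: set() for v in range(len(weights))}
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    chosen, blocked = set(), set()
    for v in sorted(range(len(weights)), key=lambda v: -weights[v]):
        if v not in blocked:
            chosen.add(v)
            blocked |= adj[v]
    return chosen

# Path graph 0 - 1 - 2 with weights 3, 5, 4.
chosen = greedy_mwis([3, 5, 4], [(0, 1), (1, 2)])
```

On this path the greedy pick is {1} (weight 5), although {0, 2} (weight 7) is optimal, so greedy is a heuristic, not an exact solver; exact MWIS is NP-hard in general.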

  19. Experiments: Synthetic Data • Synthetic data generator • Generate a community-based graph as the underlying temporal graphical model [Girvan and Newman 05] • Assign random weights to the graphical model and generate time series data using recursive matrix multiplication [Arnold et al. 07] • Decomposition accuracy • Find a matching between clustering results and ground-truth clusters such that the number of intersected variables is maximal • Decomposition accuracy is the number of intersected variables divided by the total number of variables
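The accuracy measure described above can be sketched directly: try every matching of predicted clusters to ground-truth clusters, keep the one with maximal overlap, and divide by N. Brute force over permutations is fine for small K (the example labels are made up):

```python
from itertools import permutations

def decomposition_accuracy(pred, truth, k):
    """Best-matching overlap between label lists, divided by N."""
    best = 0
    for perm in permutations(range(k)):     # relabel pred cluster p -> perm[p]
        overlap = sum(1 for p, t in zip(pred, truth) if perm[p] == t)
        best = max(best, overlap)
    return best / len(truth)

pred = [0, 0, 1, 1, 1, 2]
truth = [1, 1, 0, 0, 2, 2]
acc = decomposition_accuracy(pred, truth, 3)   # best matching covers 5 of 6
```

For larger K, the same matching is usually computed with the Hungarian algorithm instead of brute force.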

  20. Experiments: Synthetic Data (cont.) • Applied algorithms • Iterative optimization algorithm based on Quasi-Newton method (newton) • Iterative optimization algorithm based on MWIS method (mwis) • Benchmark 1: Pearson correlation test to generate temporal graphical model, and Ncut [Shi00] for clustering (Cor_Ncut) • Benchmark 2: directed spectral clustering [Zhou05] on ground-truth temporal graphical model (Dcut)

  21. Experimental Results: Synthetic • On average, newton is better than Cor_Ncut and Dcut by 27% and 32%, respectively • On average, mwis is better than Cor_Ncut and Dcut by 24% and 29%, respectively

  22. Experimental Results: Synthetic mwis is better than Cor_Ncut by an average of 30% mwis is better than Dcut by an average of 52%

  23. Experiment: Real Data • Data • Annual GDP growth rate (downloaded from http://www.ers.usda.gov/Data/Macroeconomics) • 192 countries • 4 Time periods • 1969-1979 • 1980-1989 • 1990-1999 • 1998-2007 • Hierarchically bipartition into 6 or 7 clusters

  24. Experimental Result: Real Data

  25. Summary • We formulate a novel objective function for the decomposition problem in temporal graphical modeling. • We introduce an iterative optimization approach utilizing a Quasi-Newton method and generalized ridge regression. • We employ a maximum weight independent set based approach to speed up the Quasi-Newton method. • The experimental results demonstrate the effectiveness and efficiency of our approaches.

  26. Thank you
