
Talking Points


Presentation Transcript


  1. Talking Points Joseph Ramsey

  2. LiNGAM • Most of the algorithms included in Tetrad (other than KPC) assume causal graphs are to be inferred from conditional independence tests, usually tests that assume linearity and Gaussianity. • LiNGAM uses a different approach. • It assumes linearity and non-Gaussianity. • It runs Independent Component Analysis (ICA) to estimate the coefficient matrix. • It rearranges the coefficient matrix to get a causal order. • It prunes weak coefficients by setting them to zero.

  3. ICA • Although complicated in detail, the basic idea is very simple. • a11 X1 + ... + a1n Xn = e1 • ... • an1 X1 + ... + ann Xn = en • Assume e1, ..., en are i.i.d. • Try to maximize the non-Gaussianity of • w1 X1 + ... + wn Xn = ? • There are n ways to do it, up to symmetry! (Cf. the Central Limit Theorem; Hyvärinen et al., 2002) • You can use the coefficients for e1, or for e2, or for... • All other linear combinations of e1, ..., en are more Gaussian.
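To make "maximize non-Gaussianity" concrete, here is a minimal, self-contained Java sketch (an illustration, not Tetrad code) that scores a candidate weight vector w by the excess kurtosis of w1 X1 + ... + wn Xn. Kurtosis is one common ICA contrast function; the mixing coefficients and the candidate w below are made-up examples.

    import java.util.Random;

    public class NonGaussianityDemo {
        // Excess kurtosis: E[(y - mean)^4] / E[(y - mean)^2]^2 - 3.
        // It is 0 for Gaussian data, so |excess kurtosis| serves as a
        // simple non-Gaussianity score.
        static double excessKurtosis(double[] y) {
            double mean = 0, m2 = 0, m4 = 0;
            for (double v : y) mean += v;
            mean /= y.length;
            for (double v : y) {
                double d = v - mean;
                m2 += d * d;
                m4 += d * d * d * d;
            }
            m2 /= y.length;
            m4 /= y.length;
            return m4 / (m2 * m2) - 3.0;
        }

        public static void main(String[] args) {
            Random r = new Random(0);
            int n = 10000;
            double[] x1 = new double[n], x2 = new double[n], proj = new double[n];
            // Two independent non-Gaussian (uniform) errors, mixed linearly:
            // X1 = e1, X2 = 0.8 X1 + e2.
            for (int i = 0; i < n; i++) {
                double e1 = r.nextDouble() - 0.5, e2 = r.nextDouble() - 0.5;
                x1[i] = e1;
                x2[i] = 0.8 * x1[i] + e2;
            }
            // Score one candidate projection w = (-0.8, 1.0), which recovers
            // e2 exactly; an ICA search would optimize w to maximize the score.
            for (int i = 0; i < n; i++) proj[i] = -0.8 * x1[i] + 1.0 * x2[i];
            System.out.println("non-Gaussianity score: " + excessKurtosis(proj));
        }
    }

Uniform errors have excess kurtosis of about -1.2, so this w scores well; any w that mixes e1 and e2 gives a combination closer to Gaussian and a score closer to 0.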

  4. ICA • This system of equations is usually written Wx = e • But also x = Bx + e, where B is the coefficient matrix • So Wx = (I – B)x = e • e is the vector of independent components (the errors) • x is the vector of variables • We just showed that under strong conditions we can estimate W. • So we can estimate B! (But with unknown row order.) • This uses assumptions of linearity and non-Gaussianity (of all but at most one error term) alone. • More sophisticated analyses allow errors to be non-i.i.d.
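As a toy illustration of the algebra above (the numbers in W are invented, and real ICA output would first need its rows permuted and rescaled so the diagonal is 1):

    public class RecoverB {
        public static void main(String[] args) {
            // Hypothetical ICA estimate of W = I - B for two variables,
            // already row-permuted and rescaled so the diagonal is 1.
            double[][] W = {
                {  1.0, 0.0 },
                { -0.8, 1.0 }
            };
            int p = W.length;
            double[][] B = new double[p][p];
            for (int i = 0; i < p; i++)
                for (int j = 0; j < p; j++)
                    B[i][j] = (i == j ? 1.0 : 0.0) - W[i][j];
            // B[1][0] = 0.8 reads as an edge X1 -> X2 with coefficient 0.8.
            System.out.println("B[1][0] = " + B[1][0]);
        }
    }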

  5. LiNGAM • LiNGAM runs ICA to estimate the coefficient matrix B. • The order of the errors is not fixed by ICA, so some rearranging of the B matrix needs to be done. • Rows of the B matrix are swapped so that it is lower triangular. • B[i][j] should be non-zero (representing an edge Xj → Xi) just in case i > j, i.e., just in case j comes earlier in the causal order. • Typically, a cutoff is used to decide whether a matrix element is zero. • The rearranged matrix corresponds to the idea of a causal order (see the sketch below).
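A minimal sketch of the rearranging and pruning steps, assuming a small invented B and a brute-force search over variable orders (LiNGAM's actual implementation uses a more efficient permutation search):

    import java.util.*;

    public class LowerTriangularize {
        // Check whether, under variable order 'perm', every entry of B on or
        // above the diagonal is negligibly small (i.e., B is lower triangular).
        static boolean isLowerTriangular(double[][] B, int[] perm, double cutoff) {
            for (int i = 0; i < perm.length; i++)
                for (int j = i; j < perm.length; j++)
                    if (Math.abs(B[perm[i]][perm[j]]) > cutoff) return false;
            return true;
        }

        public static void main(String[] args) {
            double cutoff = 0.05; // entries below this are treated as zero
            // Hypothetical estimated B; the true order here is X2, X0, X1.
            double[][] B = {
                { 0.00, 0.01, 0.70 },
                { 0.90, 0.02, 0.01 },
                { 0.01, 0.00, 0.00 }
            };
            // Brute force over all orders (fine for small p).
            for (int[] perm : permutations(3))
                if (isLowerTriangular(B, perm, cutoff))
                    System.out.println("causal order: " + Arrays.toString(perm));
            // Pruning: set weak coefficients to exactly zero.
            for (double[] row : B)
                for (int j = 0; j < row.length; j++)
                    if (Math.abs(row[j]) < cutoff) row[j] = 0.0;
        }

        static List<int[]> permutations(int n) {
            List<int[]> out = new ArrayList<>();
            permute(new int[n], new boolean[n], 0, out);
            return out;
        }

        static void permute(int[] cur, boolean[] used, int depth, List<int[]> out) {
            if (depth == cur.length) { out.add(cur.clone()); return; }
            for (int i = 0; i < cur.length; i++) {
                if (used[i]) continue;
                used[i] = true; cur[depth] = i;
                permute(cur, used, depth + 1, out);
                used[i] = false;
            }
        }
    }

Here the only order that passes is (X2, X0, X1), and the surviving non-zero entries B[0][2] and B[1][0] read as the edges X2 → X0 and X0 → X1.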

  6. LiNGAM • Once you know which nodes are adjacent in the graph and what the causal order is, you can infer a complete DAG. • Review: • Use data from a linear non-Gaussian model (all but at most one error term non-Gaussian). • Infer a complete DAG (more than a pattern!)

  7. Generalized SEM • In order to try LiNGAM, we first need to simulate some linear non-Gaussian data, for which we will use the Generalized SEM Model. • The Generalized SEM is a generalization of the linear SEM model. • It allows for arbitrary connection functions. • It allows for arbitrary distributions. • Simulation from cyclic models is supported.

  8. Hands On • Create a DAG. • Parameterize it as a Generalized SEM. • Open the Generalized SEM and select Apply Templates from the Tools menu. • Apply the default template to variables, which will make them all linear functions. • For errors, select a non-Gaussian distribution, such as U(0, 1). • Save.
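Outside the GUI, the same kind of linear non-Gaussian data can be simulated in a few lines of plain Java. This sketch hard-codes a hypothetical X → Y → Z graph with invented coefficients and uniform errors; it does not use Tetrad's API.

    import java.util.Random;

    public class SimulateNonGaussian {
        public static void main(String[] args) {
            Random r = new Random(0);
            int n = 1000;
            double[][] data = new double[n][3];
            for (int i = 0; i < n; i++) {
                // Uniform errors, shifted to mean zero: the non-Gaussianity
                // LiNGAM needs. Coefficients are made-up examples.
                double eX = r.nextDouble() - 0.5;
                double eY = r.nextDouble() - 0.5;
                double eZ = r.nextDouble() - 0.5;
                double x = eX;               // X := eX
                double y = 0.7 * x + eY;     // Y := 0.7 X + eY
                double z = 0.5 * y + eZ;     // Z := 0.5 Y + eZ
                data[i][0] = x; data[i][1] = y; data[i][2] = z;
            }
            System.out.println("simulated " + n + " rows from X -> Y -> Z");
        }
    }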

  9. Hands On • Attach a Generalized SEM IM. • Attach a data set; simulate 1000 points. • Attach a Search box and run LiNGAM. • Attach another search box to the data and run PC. • Compare the PC result to the LiNGAM result.

  10. Special Variants of Algorithms • PC Pattern • PC Pattern enforces the requirement that the output of the algorithm will be a pattern. • PCD • PCD adds corrective code to PC for the case where some variables stand in deterministic relationships. • This results in fewer edges being removed from the graph. • For example, if X _||_ Y | Z but Z determines Y, the edge X---Y is not removed.
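For discrete data, one simple way such a determinism check could look (an illustrative stand-in, not PCD's actual code):

    import java.util.HashMap;
    import java.util.Map;

    public class DeterminismCheck {
        // Returns true if y is a deterministic function of z in the sample:
        // every value of z always co-occurs with the same value of y.
        static boolean determines(int[] z, int[] y) {
            Map<Integer, Integer> seen = new HashMap<>();
            for (int i = 0; i < z.length; i++) {
                Integer prev = seen.putIfAbsent(z[i], y[i]);
                if (prev != null && prev != y[i]) return false;
            }
            return true;
        }

        public static void main(String[] args) {
            int[] z = { 0, 1, 2, 0, 1, 2 };
            int[] y = { 5, 7, 9, 5, 7, 9 };  // y = 2z + 5, so z determines y
            System.out.println(determines(z, y)); // true
        }
    }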

  11. Special Variants of Algorithms • CPC • The PC algorithm may jump too quickly to the conclusion that colliders and noncolliders should be oriented: • X->Y<-Z, X---Y---Z • The CPC algorithm uses a much more conservative test for colliders and noncolliders, double- and triple-checking that they should be oriented against separating sets drawn from different adjacents of X and of Z. • The result is a graph with fewer but more accurate orientations.
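A compact sketch of that conservative triple test, assuming an independence oracle and invented adjacency sets (this mirrors the idea, not Tetrad's implementation):

    import java.util.*;

    public class CpcTripleCheck {
        interface IndTest { boolean indep(int a, int b, Set<Integer> cond); }

        // Classify the unshielded triple X - Y - Z. Instead of trusting the
        // first separating set found (as PC does), check whether Y appears
        // in EVERY separating set drawn from the neighbors of X and of Z.
        static String classify(int x, int y, int z, Set<Integer> adjX,
                               Set<Integer> adjZ, IndTest test) {
            boolean inAll = true, inNone = true;
            for (Set<Integer> s : candidateSets(adjX, adjZ, x, z)) {
                if (!test.indep(x, z, s)) continue;   // not a separating set
                if (s.contains(y)) inNone = false; else inAll = false;
            }
            if (inNone) return "collider (orient X -> Y <- Z)";
            if (inAll)  return "noncollider";
            return "ambiguous (leave unoriented)";
        }

        // All subsets of adj(X) \ {Z} and of adj(Z) \ {X}.
        static List<Set<Integer>> candidateSets(Set<Integer> adjX,
                                                Set<Integer> adjZ, int x, int z) {
            List<Set<Integer>> out = new ArrayList<>();
            for (Set<Integer> base : List.of(minus(adjX, z), minus(adjZ, x))) {
                List<Integer> items = new ArrayList<>(base);
                for (int mask = 0; mask < (1 << items.size()); mask++) {
                    Set<Integer> s = new HashSet<>();
                    for (int i = 0; i < items.size(); i++)
                        if ((mask & (1 << i)) != 0) s.add(items.get(i));
                    out.add(s);
                }
            }
            return out;
        }

        static Set<Integer> minus(Set<Integer> s, int v) {
            Set<Integer> t = new HashSet<>(s);
            t.remove(v);
            return t;
        }

        public static void main(String[] args) {
            // Toy independence oracle for the true graph X -> Y <- Z
            // (variables 0, 1, 2): X and Z are independent given any set
            // that does not contain the collider Y = 1.
            IndTest oracle = (a, b, cond) -> !cond.contains(1);
            System.out.println(classify(0, 1, 2, Set.of(1), Set.of(1), oracle));
        }
    }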

  12. Hands On • Simulate data from a “complicated” DAG using a SEM IM. • Choose the Search from Simulated Data item from the Templates menu. • Make a random 20-node, 20-edge DAG. • Parameterize it as a linear SEM, accepting the defaults. • Run CPC. • Attach another search box to the data. • Run PC. • Lay out the PC graph using Fruchterman-Reingold. • Copy the layout to the CPC graph. • Open PC and CPC simultaneously and note the differences.

  13. Special Variants of Algorithms • CFCI • Same idea as for CPC but for FCI instead. • KPC • The PC algorithm typically uses independence tests that assume linearity. • The KPC algorithm makes two changes: • It uses a non-parametric independence test. • It adds some steps to orient edges that are unoriented in the PC pattern.

  14. Special Variants of Algorithms • PcLiNGAM • If more than one variable is Gaussian and the others are non-Gaussian, this algorithm applies. • It runs PC, then orients the unoriented edges (where possible) using non-Gaussianity. • LiNG • Extends LiNGAM to orient cycles using non-Gaussianity.

  15. Special Variants of Algorithms • JCPC • Uses a Markov-blanket-style test to add or remove individual edges, with CPC-style orientation. • Allows individual adjacencies in the initial estimate (from the PC adjacency search) to be revised.

  16. Time Series Simulation (Hands On) • Tetrad includes support for doing time series simulations. • First, one creates a time series graph. • Then one parameterizes the time series graph as a SEM. • Then one instantiates the SEM. • Then one simulates data from the SEM Instantiated Model.
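Outside the GUI, what gets simulated is just a lagged structural model. Here is a hand-rolled two-variable, lag-1 sketch in plain Java (the lag structure and coefficients are invented; this is not Tetrad's time series machinery):

    import java.util.Random;

    public class TimeSeriesSim {
        public static void main(String[] args) {
            Random r = new Random(0);
            int T = 1000;
            double[] x = new double[T], y = new double[T];
            for (int t = 1; t < T; t++) {
                // Each variable depends on lagged values plus fresh noise.
                x[t] = 0.6 * x[t - 1] + r.nextGaussian();
                y[t] = 0.5 * y[t - 1] + 0.4 * x[t - 1] + r.nextGaussian();
            }
            System.out.println("simulated " + T + " time steps of (X, Y)");
        }
    }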

  17. Time Series Simulation • One can, e.g., calculate a vector auto-regression for it. (One can do this as well from time series data loaded in.) • Attach a data manipulation box to the data. • Select vector auto-regression. • Attach a search and run GES. • Should give the graph among concurrent variables. • One can create staggered time series data and run GES. • Attach a data manipulation box. • Select create time series data. • Attach a search box and run GES. • Should give the time lag graph with some extra edges in the highest lag.
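A minimal sketch of what the "create time series data" manipulation produces, i.e., rows augmented with lagged copies of the variables (my own illustration, not Tetrad's code):

    public class LagEmbed {
        // Turn a T x p time series into rows (v_t, v_{t-1}, ..., v_{t-maxLag}),
        // i.e., the staggered data a lag search such as GES would run on.
        static double[][] lagData(double[][] series, int maxLag) {
            int T = series.length, p = series[0].length;
            double[][] out = new double[T - maxLag][p * (maxLag + 1)];
            for (int t = maxLag; t < T; t++)
                for (int lag = 0; lag <= maxLag; lag++)
                    System.arraycopy(series[t - lag], 0, out[t - maxLag], lag * p, p);
            return out;
        }

        public static void main(String[] args) {
            double[][] series = { {1, 10}, {2, 20}, {3, 30}, {4, 40} };
            double[][] lagged = lagData(series, 1);
            // First row: (x_1, y_1, x_0, y_0) = (2.0, 20.0, 1.0, 10.0)
            System.out.println(java.util.Arrays.toString(lagged[0]));
        }
    }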

  18. Command Line Tetrad • We don’t have an extensive command line interface programmed, but what we do have has proven useful to many people. • We have a command line interface for a number of the basic search algorithms in Tetrad. • We also have a command line interface for the IMaGES algorithm. • An upcoming version of Tetrad will include a more extensive command line interface.

  19. How to get it • Go to the Tetrad downloads directory, • http://www.phil.cmu.edu/projects/tetrad_download/download/ • Look for files beginning with the prefix “tetradcmd-”. • Pick the one with the latest version.

  20. How to run a search at the command line... • Example: • java -jar tetradcmd-4.3.3-1.jar -data munin1.txt -datatype discrete -algorithm fci -depth 3 -significance 0.0

  21. Command line options • -data: Gives the data file • -datatype: continuous or discrete (mixed not supported) • -algorithm: pc, cpc, fci, cfci, ccd, ges • -depth: Default is -1 (unlimited) • -significance: Default is 0.05 • ... Some others.
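For instance, combining these options for a continuous-data run with CPC at the default settings might look like this (mydata.txt is a placeholder file name, and the jar version should match whatever you downloaded):

    java -jar tetradcmd-4.3.3-1.jar -data mydata.txt -datatype continuous -algorithm cpc -depth -1 -significance 0.05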

  22. IMaGES command line • IMaGES (which I’ll talk about) is a more specialized algorithm and uses its own command line interface. • Email me if you’d like to use it.

  23. Tetrad Source • We regularly get requests for the Tetrad source code. • The secret is, it’s online and freely available; you just have to know where to look! • Again, look in the Tetrad downloads directory. • Look for the latest “dist” (distribution) file and unzip it.

  24. Source • All of the code will be in the distribution, except for private project code. • This can be useful if you want to modify or extend algorithms, or if you want to set up specific kinds of testing, or if the command line tools provided are insufficient for your needs.

  25. Java • The source code is in Java, which can be interfaced with several other platforms (Matlab, R, Mathematica) with a bit of work. • It can also be invoked from the command line programmatically from various languages. • Also, since it’s in Java, it’s cross-platform, so it will probably run on your machine.
