
Incremental Linear Discriminant Analysis (LDA)


Presentation Transcript


  1. Incremental Linear Discriminant Analysis (LDA) Presenter: Seung Hwan Bae

  2. Contents • Introduction • Sufficient Spanning Set • Incremental LDA • Updating the total scatter matrix • Updating the between scatter matrix • Updating the projection matrix • Conclusion. Main reference: T.-K. Kim, B. Stenger, J. Kittler, and R. Cipolla. Incremental linear discriminant analysis using sufficient spanning sets and its applications. IJCV, 91(2):216–232, 2011.

  3. Reconstructive vs. Discriminative • Reconstructive methods • Good approximation of the data → good reconstruction • Encompass the variability of the training data • Task-independent • Unsupervised • Enable incremental updating • E.g., PCA • Finds linear representations that best describe the input data • Looks for a low-dimensional representation minimizing the squared reconstruction error (SRE) • Tries to model each image as well as possible.

  4. Reconstructive vs. Discriminative • Discriminative methods • Separate the data → good classification • Spatially/computationally efficient • Task-dependent • Supervised; focus on specific prior knowledge • Effective representation • E.g., LDA • Maximizes inter-class scatter while minimizing intra-class scatter (Belhumeur et al., PAMI '97) • Looks for differences between images of different classes • Hyperplanes separate the training data with no (or little) error → no reconstruction (low-dimensional projections). Because the representation is focused on specific discriminative features, it cannot be adapted to new information!

  5. Motivation of Incremental Learning • Mimic the human visual system • Build the representation incrementally • Adapt to a changing world • Not all images are given in advance • Enable online learning • Reduce the amount of storage needed • Keep only the image representation • Discard the training images • Reduce calculation time • Update the representation • No recalculation of the model • Aspects of incremental learning • Adding new instances • Adding new classes

  6. Linear Discriminant Analysis (Recall) • Subspace learning using LDA • We create projection vectors W by maximizing the Fisher criterion J(W) = |W^T S_B W| / |W^T S_W W|, which leads to the eigenvalue problem of S_W^{-1} S_B • S_W = Σ_k Σ_{x∈C_k} (x − m_k)(x − m_k)^T : the within-class scatter matrix • S_B = Σ_k n_k (m_k − μ)(m_k − μ)^T : the between-class scatter matrix • S_T = S_W + S_B : the total scatter matrix
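
For reference, a minimal NumPy sketch of the batch LDA recalled above (the function name batch_lda and the use of a pseudo-inverse for numerical stability are our own choices, not from the slides):

    import numpy as np

    def batch_lda(X, y, n_components):
        # Eigendecompose pinv(S_W) @ S_B, as in the Fisher criterion above.
        d = X.shape[1]
        mu = X.mean(axis=0)
        S_W, S_B = np.zeros((d, d)), np.zeros((d, d))
        for c in np.unique(y):
            Xc = X[y == c]
            mc = Xc.mean(axis=0)
            S_W += (Xc - mc).T @ (Xc - mc)           # within-class scatter
            diff = (mc - mu)[:, None]
            S_B += len(Xc) * (diff @ diff.T)         # between-class scatter
        evals, evecs = np.linalg.eig(np.linalg.pinv(S_W) @ S_B)
        order = np.argsort(-evals.real)              # leading eigenvectors
        return evecs.real[:, order[:n_components]]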

  7. ILDA: Main Components • Given two sets of data, each represented by an eigenspace model, the principal components of the total scatter matrix of the union set are computed by merging the two eigenspace models. • Similarly, the principal components of the combined between scatter matrix are updated by merging the respective two eigenspace models. • The final step computes the discriminative components using the updated principal components of the previous steps. • Each step reduces to a small eigenvalue decomposition, so the space and time complexity scales with the retained subspace dimensions rather than with the total number of samples.

  8. Sufficient Spanning Set • A reduced set of basis vectors spanning the space of most of the data variation of a scatter matrix (or eigenvector matrix), such that the reconstruction of the data matrix by the sufficient spanning set approximates the original data matrix. • Let S ≈ P Λ P^T, where P and Λ are the eigenvector and eigenvalue matrices corresponding to most of the energy. Then Φ = P R, where R is an arbitrary rotation matrix, can be a sufficient spanning set, since Φ Φ^T S Φ Φ^T = P R R^T P^T S P R R^T P^T ≈ S.
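
In practice, a sufficient spanning set is obtained by orthonormalizing a stack of candidate vectors and discarding (near-)zero directions. A minimal sketch of this operator, here called orthonormalize (our name for the h(·) function used later; the QR-based test and tolerance are our assumptions):

    import numpy as np

    def orthonormalize(B, tol=1e-10):
        # h(.): QR-orthonormalize the columns of B and drop directions whose
        # diagonal entry of R is (near) zero, i.e. remove zero vectors.
        Q, R = np.linalg.qr(B)
        return Q[:, np.abs(np.diag(R)) > tol]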

  9. Overall Procedure of ILDA (SSS: sufficient spanning set) • Input: new data sets and the previously evaluated eigen-components • 1) Update the total eigen-components using the SSS (merge the total scatter models) • 2) Update the between-class eigen-components using the SSS (merge the between scatter models) • 3) Update the discriminative components using the SSS • Output: the updated LDA projection matrix

  10. Sufficient Spanning Set • The union of the principal components P_1, P_2 (or Q_1, Q_2) of the two data sets and the mean difference vector μ_1 − μ_2 can span the respective total and between-class scatter data spaces (left). • The projection and orthogonalization of the original components yields the principal components of the projected data up to rotation (right). Figure 1. Concepts of the sufficient spanning set of the total scatter matrix (left) and the projected matrix (right)

  11. Updating the total scatter matrix • Let {μ_d, N_d, P_d, Λ_d}, d = 1, 2, denote the eigenspace models of the two sets, where μ_d and N_d are the mean vector and the total number of samples in data set d, and P_d and Λ_d are the eigenvector and eigenvalue matrices. • The combined total scatter matrix is S_{T,3} = S_{T,1} + S_{T,2} + (N_1 N_2 / N_3)(μ_1 − μ_2)(μ_1 − μ_2)^T, where N_3 = N_1 + N_2.

  12. Updating the total scatter matrix • Using the sufficient spanning set Φ = h([P_1, P_2, μ_1 − μ_2]) and a rotation matrix R, we express the eigenvector matrix as P_3 = Φ R, where h(·) is an orthonormalization function (computed by QR decomposition) followed by removal of zero vectors. • We then change the decomposition problem of S_{T,3} into the smaller problem Φ^T S_{T,3} Φ = R Λ_3 R^T. • By eigendecomposing this small matrix, the eigenvalues Λ_3 and the eigenvectors P_3 = Φ R are computed. After removing nonsignificant components, the minimal sufficient spanning set of the combined eigenvectors is obtained (sketched below).
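
A sketch of the merge described on slides 11–12, assuming each total-scatter eigenspace model is stored as a tuple (μ, N, P, lam) with lam the vector of retained eigenvalues, and reusing the orthonormalize helper above (the tuple layout and the significance threshold are our assumptions):

    import numpy as np

    def merge_total_scatter(mu1, N1, P1, lam1, mu2, N2, P2, lam2, thr=1e-6):
        N3 = N1 + N2
        mu3 = (N1 * mu1 + N2 * mu2) / N3
        dmu = (mu1 - mu2)[:, None]
        # Sufficient spanning set Phi = h([P1, P2, mu1 - mu2]).
        Phi = orthonormalize(np.hstack([P1, P2, dmu]))
        # Project S_T3 = S_T1 + S_T2 + (N1 N2 / N3) dmu dmu^T onto span(Phi).
        Z1, Z2, z = Phi.T @ P1, Phi.T @ P2, Phi.T @ dmu
        small = (Z1 * lam1) @ Z1.T + (Z2 * lam2) @ Z2.T \
                + (N1 * N2 / N3) * (z @ z.T)
        lam3, R = np.linalg.eigh(small)      # small eigenproblem: R Lam3 R^T
        sig = lam3 > thr * lam3.max()        # drop nonsignificant components
        return mu3, N3, Phi @ R[:, sig], lam3[sig]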

  13. Updating the between scatter matrix • Let {μ_d, Q_d, Δ_d, n_dk, α_dk}, d = 1, 2, denote the eigenspace models of the two sets, where n_dk is the number of samples in class k of set d, C_d is the number of classes in set d, and α_dk are the coefficient vectors of the k-th class mean vector m_dk of set d with respect to the subspace spanned by Q_d, i.e. m_dk ≈ μ_d + Q_d α_dk. • The combined between scatter matrix is S_{B,3} = S_{B,1} + S_{B,2} + (N_1 N_2 / N_3)(μ_1 − μ_2)(μ_1 − μ_2)^T − Σ_{k∈s} (n_1k n_2k / n_3k)(m_1k − m_2k)(m_1k − m_2k)^T, where s is the set of classes common to both sets and n_3k = n_1k + n_2k.

  14. Updating the between scatter matrix • Similarly to the total scatter update, the combined between scatter matrix is efficiently computed using the sufficient spanning set Φ = h([Q_1, Q_2, μ_1 − μ_2]): the smaller problem Φ^T S_{B,3} Φ = R Δ_3 R^T is eigendecomposed, and the combined eigenvectors Q_3 = Φ R are obtained after removing the components having zero eigenvalues in Δ_3 (see the sketch below).
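
A sketch of this merge, simplified to carry explicit per-class counts and means in a dictionary cls_d: class → (n_dk, m_dk) instead of the paper's coefficient vectors α_dk (this layout, and returning the merged dictionary for later updates, are our assumptions):

    import numpy as np

    def merge_between_scatter(mu1, N1, Q1, d1, cls1, mu2, N2, Q2, d2, cls2):
        N3 = N1 + N2
        mu3 = (N1 * mu1 + N2 * mu2) / N3
        dmu = (mu1 - mu2)[:, None]
        Phi = orthonormalize(np.hstack([Q1, Q2, dmu]))   # SSS for S_B3
        Z1, Z2, z = Phi.T @ Q1, Phi.T @ Q2, Phi.T @ dmu
        # Phi^T S_B3 Phi, with S_B3 = S_B1 + S_B2 + mean-shift term
        # minus the correction for classes present in both sets.
        small = (Z1 * d1) @ Z1.T + (Z2 * d2) @ Z2.T \
                + (N1 * N2 / N3) * (z @ z.T)
        for k in set(cls1) & set(cls2):
            (n1k, m1k), (n2k, m2k) = cls1[k], cls2[k]
            dk = Phi.T @ (m1k - m2k)[:, None]
            small -= (n1k * n2k / (n1k + n2k)) * (dk @ dk.T)
        d3, R = np.linalg.eigh(small)
        keep = d3 > 1e-10 * d3.max()      # remove zero-eigenvalue components
        cls3 = dict(cls1)                 # merge per-class counts and means
        for k, (n2k, m2k) in cls2.items():
            n1k, m1k = cls3.get(k, (0, 0.0))
            cls3[k] = (n1k + n2k, (n1k * m1k + n2k * m2k) / (n1k + n2k))
        return mu3, N3, Phi @ R[:, keep], d3[keep], cls3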

  15. Updating the projection matrix • We update the projection matrix with the updated eigenspace models of the total and between-class scatter matrices. • Let Z = P_3 Λ_3^{−1/2}; then the total scatter becomes the identity in the transformed space, as Z^T S_{T,3} Z = I. • The discriminative components Ψ are the principal eigenvectors of Z^T S_{B,3} Z, computed in the reduced space. The updated projection matrix is given by W = Z Ψ (sketched below).
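
A sketch of this discriminant step, following the whitening construction above and assuming the eigenvalues lam3 have already been filtered to be strictly positive (function and variable names are ours):

    import numpy as np

    def update_projection(P3, lam3, Q3, d3, n_components):
        Z = P3 / np.sqrt(lam3)         # Z = P3 Lam3^(-1/2): whitens S_T3
        ZQ = Z.T @ Q3                  # using S_B3 ~ Q3 diag(d3) Q3^T
        M = (ZQ * d3) @ ZQ.T           # Z^T S_B3 Z in the reduced space
        evals, Psi = np.linalg.eigh(M)
        order = np.argsort(-evals)     # leading discriminative directions
        return Z @ Psi[:, order[:n_components]]   # W = Z Psi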

  16. Pseudo code of ILDA
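
The slide's pseudocode image is not in the transcript; as a stand-in, here is how one ILDA update could chain the sketches above (building the new batch's eigenmodels is an ordinary batch scatter/eigendecomposition step, omitted here; the model layout is our assumption):

    def ilda_update(model, new_total, new_between, n_components):
        # model: {'total': (mu, N, P, lam), 'between': (mu, N, Q, d, cls), 'W': ...}
        # 1) merge the total scatter eigenmodels via the SSS
        model['total'] = merge_total_scatter(*model['total'], *new_total)
        # 2) merge the between scatter eigenmodels via the SSS
        model['between'] = merge_between_scatter(*model['between'], *new_between)
        # 3) recompute the discriminative components
        _, _, P3, lam3 = model['total']
        _, _, Q3, d3, _ = model['between']
        model['W'] = update_projection(P3, lam3, Q3, d3, n_components)
        return model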

  17. Experimental Results • Cross-correlation of the LDA components computed by batch LDA and incremental LDA • Basis update using batch LDA and incremental LDA

  18. Experimental Results • Face image data sets: MPEG, XM2VTS, Altkom, BANCA • The solution of incremental LDA closely agrees with the batch solution while requiring much lower computation time: (a) retrieval inaccuracy, (b) computational cost.

  19. Conclusion • ILDA can adapt a discriminative LDA projection matrix to new data sets. • It greatly reduces the time and space complexity compared to batch LDA.
