
Non-negative Matrix Factorization with Sparseness Constraints



Presentation Transcript


  1. Non-negative Matrix Factorization with Sparseness Constraints Patrik O. Hoyer, Journal of Machine Learning Research, 2004. Presented by Jain-De Lee

  2. Outline • Introduction • Adding Sparseness Constraints to NMF • Experiments with Sparseness Constraints • Conclusions

  3. Introduction • Non-negative matrix factorization (NMF) • A useful representation typically makes latent structure in the data explicit • Reduces the dimensionality of the data • The non-negativity constraints make the representation purely additive
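To make the factorization concrete, here is a minimal sketch of the standard NMF multiplicative updates (Lee and Seung) that this paper builds on. The matrix names V, W, H follow the paper; the function name, rank, iteration count, and random initialization are illustrative choices, not the paper's.

```python
import numpy as np

def nmf_multiplicative(V, rank, n_iter=200, eps=1e-9):
    """Plain NMF: find non-negative W, H minimizing ||V - WH||_F^2
    using the standard multiplicative updates (no sparseness control)."""
    n, m = V.shape
    rng = np.random.default_rng(0)
    W = rng.random((n, rank))
    H = rng.random((rank, m))
    for _ in range(n_iter):
        # Multiplicative updates keep every entry non-negative automatically
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H
```

Because both factors stay non-negative throughout, each data vector is reconstructed as a purely additive combination of the columns of W.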

  4. Introduction • Sparse representation • Representation encodes much of the data using few ‘active’ components • The sparseness given by NMF is somewhat of a side-effect rather than a goal • Include the option to control sparseness explicitly

  5. Adding Sparseness Constraints to NMF • The concept of sparse coding • Only a few units are effectively used to represent typical data vectors • (Figure: illustration of various degrees of sparseness)

  6. Adding Sparseness Constraints to NMF • Sparseness measure • Based on the relationship between the L1 norm and the L2 norm: sparseness(x) = (√n − (Σᵢ |xᵢ|) / √(Σᵢ xᵢ²)) / (√n − 1), where n is the dimensionality of x
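A direct NumPy transcription of this measure could look as follows (a sketch; the function name is ours). It evaluates to 1 when only a single component is non-zero and to 0 when all components have equal magnitude.

```python
import numpy as np

def sparseness(x):
    """Hoyer's sparseness measure: (sqrt(n) - ||x||_1 / ||x||_2) / (sqrt(n) - 1)."""
    x = np.asarray(x, dtype=float)
    n = x.size
    l1 = np.abs(x).sum()
    l2 = np.sqrt((x ** 2).sum())
    return (np.sqrt(n) - l1 / l2) / (np.sqrt(n) - 1)
```

For example, sparseness([1, 0, 0, 0]) evaluates to 1.0 and sparseness([1, 1, 1, 1]) to 0.0.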

  7. Adding Sparseness Constraints to NMF • To constrain NMF to find solutions with desired degrees of sparseness • What exactly should be sparse? • Minimize E(W, H) = ‖V − WH‖² under the optional constraints sparseness(wᵢ) = S_w ∀i and sparseness(hᵢ) = S_h ∀i, where wᵢ is the ith column of W, hᵢ is the ith row of H, and S_w and S_h are the desired sparsenesses of W and H (respectively)
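As a small illustration of this optimization problem (a sketch, reusing the sparseness helper above; the function names and Sw/Sh arguments standing for S_w and S_h are ours), the snippet below evaluates the objective and checks whether the optional constraints hold.

```python
import numpy as np

def reconstruction_error(V, W, H):
    """Objective E(W, H) = ||V - WH||^2 to be minimized."""
    return np.sum((V - W @ H) ** 2)

def constraints_hold(W, H, Sw=None, Sh=None, tol=1e-3):
    """Check sparseness(w_i) = Sw for each column of W and sparseness(h_i) = Sh for each row of H."""
    ok = True
    if Sw is not None:
        ok &= all(abs(sparseness(W[:, i]) - Sw) < tol for i in range(W.shape[1]))
    if Sh is not None:
        ok &= all(abs(sparseness(H[i, :]) - Sh) < tol for i in range(H.shape[0]))
    return ok
```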

  8. Adding Sparseness Constraints to NMF • Projected gradient descent algorithm for NMF with sparseness constraints • Three steps: Initialize, Project, Iterate

  9. Adding Sparseness Constraints to NMF • Project step • If sparseness constraints on W: project each column of W to be non-negative, keeping its L2 norm unchanged and setting its L1 norm to achieve the desired sparseness • If sparseness constraints on H: project each row of H to be non-negative, keeping its L2 norm unchanged and setting its L1 norm to achieve the desired sparseness

  10. Adding Sparseness Constraints to NMF • Iterate step • If sparseness constraints on W (or H) apply: set W := W − μ_W (WH − V)Hᵀ (or H := H − μ_H Wᵀ(WH − V)), where μ_W and μ_H are small positive constants, then project • Else take a standard multiplicative step
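Putting slides 8–10 together, here is a sketch of the projected gradient descent loop under the stated assumptions. project_l1_l2 is the projection operator of slides 11–12 (sketched after slide 12); the function names, rank, iteration count, and fixed small step sizes are illustrative simplifications, whereas a practical implementation would adapt the step sizes.

```python
import numpy as np

def nmf_with_sparseness(V, rank, Sw=None, Sh=None, n_iter=500,
                        mu_W=1e-4, mu_H=1e-4, eps=1e-9, seed=0):
    """NMF with optional sparseness constraints Sw on W's columns and Sh on H's rows."""
    n, m = V.shape
    rng = np.random.default_rng(seed)
    # Initialize with random non-negative matrices, then project onto the constraint set
    W = rng.random((n, rank))
    H = rng.random((rank, m))
    if Sw is not None:
        W = project_columns(W, Sw)
    if Sh is not None:
        H = project_rows(H, Sh)
    for _ in range(n_iter):
        if Sw is not None:
            W = W - mu_W * (W @ H - V) @ H.T      # gradient step on W
            W = project_columns(W, Sw)            # re-impose the sparseness constraint
        else:
            W *= (V @ H.T) / (W @ H @ H.T + eps)  # standard multiplicative step
        if Sh is not None:
            H = H - mu_H * W.T @ (W @ H - V)      # gradient step on H
            H = project_rows(H, Sh)               # re-impose the sparseness constraint
        else:
            H *= (W.T @ V) / (W.T @ W @ H + eps)  # standard multiplicative step
    return W, H

def project_columns(W, S):
    """Project each column: keep its L2 norm, set the L1 norm that yields sparseness S."""
    n = W.shape[0]
    W = W.copy()
    for i in range(W.shape[1]):
        l2 = np.linalg.norm(W[:, i])
        l1 = l2 * (np.sqrt(n) - S * (np.sqrt(n) - 1))  # invert the sparseness formula
        W[:, i] = project_l1_l2(W[:, i], l1, l2)
    return W

def project_rows(H, S):
    """Project each row of H analogously, by projecting the columns of H^T."""
    return project_columns(H.T, S).T
```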

  11. Adding Sparseness Constraints to NMF • Projection operator • Problem: given any vector x, find the closest non-negative vector s with a given L1 norm and a given L2 norm

  12. Adding Sparseness Constraints to NMF • Algorithm • Set sᵢ := xᵢ + (L1 − Σⱼ xⱼ)/n, ∀i • Set Z := {} • Iterate: 1. Set mᵢ := L1/(n − size(Z)) if i ∉ Z, and mᵢ := 0 if i ∈ Z 2. Set s := m + α(s − m), where α ≥ 0 is chosen so that the L2 norm of s equals the target 3. If all components of s are non-negative, return s, end 4. Set Z := Z ∪ {i : sᵢ < 0} 5. Set sᵢ := 0, ∀i ∈ Z 6. Calculate c := (Σᵢ sᵢ − L1)/(n − size(Z)) 7. Set sᵢ := sᵢ − c, ∀i ∉ Z 8. Go to 1
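A NumPy sketch of this projection operator follows (the function name project_l1_l2 is ours). The α in step 2 comes from solving the quadratic ‖m + α(s − m)‖² = L2² and taking its non-negative root.

```python
import numpy as np

def project_l1_l2(x, l1, l2):
    """Closest non-negative vector to x with L1 norm l1 and L2 norm l2
    (requires 1 <= l1/l2 <= sqrt(n))."""
    x = np.asarray(x, dtype=float)
    n = x.size
    s = x + (l1 - x.sum()) / n               # start on the hyperplane sum(s) = l1
    zeroed = np.zeros(n, dtype=bool)         # the set Z of components pinned to zero
    while True:
        # Midpoint m: the target L1 mass spread evenly over the components not in Z
        m = np.where(zeroed, 0.0, l1 / (n - zeroed.sum()))
        # Choose alpha >= 0 so that ||m + alpha*(s - m)||_2 = l2 (quadratic in alpha)
        d = s - m
        a = (d ** 2).sum()
        b = 2.0 * (m * d).sum()
        c = (m ** 2).sum() - l2 ** 2
        if a == 0.0:                          # degenerate case: s already equals m
            return m
        alpha = (-b + np.sqrt(max(b * b - 4 * a * c, 0.0))) / (2 * a)
        s = m + alpha * d
        if (s >= 0).all():
            return s
        # Pin negative components to zero, then restore the L1 norm over the rest
        zeroed |= s < 0
        s[zeroed] = 0.0
        shift = (s.sum() - l1) / (n - zeroed.sum())
        s[~zeroed] -= shift
```

For example, project_l1_l2(np.random.randn(5), l1=2.0, l2=1.0) returns a non-negative vector whose L1 norm is 2.0 and whose L2 norm is 1.0, up to floating point error.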

  13. Experiments with Sparseness Constraints • NMF applied to various image data sets: (a) basis images given by NMF applied to face image data from the CBCL database (b) basis images derived from the ORL face image database (c) basis vectors from NMF applied to ON/OFF-contrast filtered natural image data

  14. Experiments with Sparseness Constraints Features learned from the CBCL face image database using NMF with sparseness constraints

  15. Experiments with Sparseness Constraints Features learned from the ORL face image database using NMF with three levels of sparseness constraints: (a) 0.5 (b) 0.6 (c) 0.75

  16. Experiments with Sparseness Constraints Features learned from natural image data with the sparseness of the coefficients fixed at 0.85; compare with standard NMF (Figure 1c)

  17. Experiments with Sparseness Constraints Number of iterations required for the projection algorithm to converge

  18. Conclusions • It is useful to control the degree of sparseness explicitly • Described a projection operator capable of simultaneously enforcing desired L1 and L2 norms • Showed its use in the NMF framework for learning representations that could not be obtained by regular NMF
