
An Introduction of Independent Component Analysis (ICA)


Presentation Transcript


  1. An Introduction of Independent Component Analysis (ICA) Xiaoling Wang Jan. 28, 2003

  2. What Is ICA? • Application: blind source separation (BSS) and deconvolution • Motivation: “cocktail party problem” • Assumption: two people speaking simultaneously, two microphones in different locations

  3. Principles of the ICA Algorithm • Assumption: the sources are statistically independent • Goal: seek a transformation to coordinates in which the data are maximally statistically independent • Definition: mixing process x = As, demixing process y = Wx, where A is the mixing matrix and W is the separation matrix
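
To make the linear model concrete, here is a minimal NumPy sketch (an illustration of my own, not taken from the slides): two synthetic sources are mixed with an assumed matrix A and then demixed with W = inv(A) purely to show the notation; in real blind source separation A is unknown and W must be estimated from x alone.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two synthetic independent sources (hypothetical example): a sine wave and uniform noise
t = np.linspace(0, 8, 1000)
s = np.vstack([np.sin(2 * t), rng.uniform(-1, 1, size=t.size)])  # shape (2, n_samples)

# Mixing process x = A s, with an assumed (normally unknown) mixing matrix A
A = np.array([[1.0, 0.5],
              [0.6, 1.0]])
x = A @ s

# Demixing process y = W x; here W = inv(A) is used only to illustrate the model,
# whereas ICA has to estimate W from the mixtures x alone
W = np.linalg.inv(A)
y = W @ x

print(np.allclose(y, s))  # True: sources recovered exactly when W = inv(A)
```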

  4. Hierarchy of ICA Models [Figure: a taxonomy of ICA models organized by mixing (nonlinear, non-stationary, linear), by sources (non-stationary, non-Gaussian, Gaussian) and by noise assumptions; its labels include independent factor analysis, classical ICA (approximations to mutual information, cumulant-based methods, kurtosis minimization, FastICA, Infomax), factor analysis (R diagonal), probabilistic PCA, PCA (orthogonal mixing), and flexible, switching, and fixed source models.]

  5. Independence of Sources • Independence: the joint pdf of the sources factorizes, p(s1, ..., sn) = p1(s1) ... pn(sn) • "Nongaussian is independent": by the central limit theorem, mixtures of independent sources are more Gaussian than the sources themselves • Seek the separation matrix W that maximizes the nongaussianity of the estimated sources
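
The slogan "nongaussian is independent" rests on the central limit theorem: a mixture of independent sources is closer to Gaussian than the individual sources, so pushing an estimate y = w^T x away from Gaussianity pushes it toward a single source. A small sketch of the effect on kurtosis (a hypothetical example of mine, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Two independent sub-Gaussian (uniform) sources
s1 = rng.uniform(-1, 1, n)
s2 = rng.uniform(-1, 1, n)

def kurtosis(y):
    """kurt(y) = E{y^4} - 3 (E{y^2})^2 for zero-mean y; zero for a Gaussian."""
    return np.mean(y**4) - 3 * np.mean(y**2) ** 2

# The mixture's kurtosis is closer to 0 (more Gaussian) than the source's
mix = (s1 + s2) / np.sqrt(2)
print(kurtosis(s1), kurtosis(mix))  # roughly -0.13 vs -0.07
```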

  6. Measures of Nongaussianity • Kurtosis (4th-order cumulant): kurt(y) = E{y^4} - 3 (E{y^2})^2 • Subgaussian: negative kurtosis • Supergaussian: positive kurtosis • Negentropy: entropy H(Y) = -Σ P(Y = a_i) log P(Y = a_i); differential entropy H(y) = -∫ f(y) log f(y) dy; negentropy J(y) = H(y_gauss) - H(y), where y_gauss is a Gaussian variable with the same covariance as y
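
Negentropy is hard to evaluate from samples, so in practice it is approximated. A commonly used approximation (an assumption here, taken from the FastICA literature rather than this slide) is that J(y) is roughly proportional to (E{G(y)} - E{G(v)})^2 with G(u) = log cosh(u) and v a standard Gaussian variable:

```python
import numpy as np

rng = np.random.default_rng(2)

def negentropy_approx(y, n_gauss=1_000_000):
    """Unnormalized negentropy approximation (E{G(y)} - E{G(v)})^2 with G(u) = log cosh u.
    y is standardized to zero mean and unit variance; v is a standard Gaussian sample."""
    y = (y - y.mean()) / y.std()
    v = rng.standard_normal(n_gauss)
    G = lambda u: np.log(np.cosh(u))
    return (G(y).mean() - G(v).mean()) ** 2

# A super-Gaussian (Laplacian) sample scores clearly higher than a Gaussian sample
laplace = rng.laplace(size=100_000)
gauss = rng.standard_normal(100_000)
print(negentropy_approx(laplace), negentropy_approx(gauss))
```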

  7. Measures of Nongaussianity (Cont.) • Mutual information: I(y1, ..., yn) = Σ H(y_i) - H(y) • For y = Wx: I(y1, ..., yn) = Σ H(y_i) - H(x) - log|det W|
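
The second expression follows from how differential entropy transforms under an invertible linear map: if y = Wx then H(y) = H(x) + log|det W|. Spelled out (a standard derivation, added here for completeness):

```latex
% Mutual information of the outputs y = W x
I(y_1, \ldots, y_n) = \sum_i H(y_i) - H(\mathbf{y})
                    = \sum_i H(y_i) - H(\mathbf{x}) - \log\lvert\det W\rvert .
```

If the y_i are additionally constrained to be uncorrelated with unit variance, then det E{y y^T} = (det W)^2 det E{x x^T} = 1, so |det W| is fixed by the data; minimizing mutual information then amounts to maximizing the sum of the negentropies J(y_i) from the previous slide.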

  8. FastICA Algorithm • Basic form (one unit): 1. choose an initial (e.g. random) weight vector w; 2. let w+ = E{x g(w^T x)} - E{g'(w^T x)} w; 3. let w = w+ / ||w+||; 4. if not converged, go back to step 2 • For several units: decorrelation of the outputs, e.g. let W = W / sqrt(||W W^T||), then repeat W = (3/2) W - (1/2) W W^T W until convergence
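
A compact sketch of the one-unit iteration (my own implementation; it assumes the data X have already been centered and whitened and uses g(u) = tanh(u) as the nonlinearity, choices the slide does not fix):

```python
import numpy as np

def fastica_one_unit(X, max_iter=200, tol=1e-6, seed=0):
    """One-unit FastICA on whitened data X of shape (n_features, n_samples).
    Returns a unit-norm weight vector w such that y = w @ X estimates one source."""
    rng = np.random.default_rng(seed)
    n_features, _ = X.shape

    # Step 1: random initial weight vector of unit norm
    w = rng.standard_normal(n_features)
    w /= np.linalg.norm(w)

    g = np.tanh
    g_prime = lambda u: 1.0 - np.tanh(u) ** 2

    for _ in range(max_iter):
        wx = w @ X
        # Step 2: fixed-point update w+ = E{x g(w^T x)} - E{g'(w^T x)} w
        w_new = (X * g(wx)).mean(axis=1) - g_prime(wx).mean() * w
        # Step 3: renormalize
        w_new /= np.linalg.norm(w_new)
        # Step 4: converged when w and w_new are (anti-)parallel
        if abs(abs(w_new @ w) - 1.0) < tol:
            return w_new
        w = w_new
    return w
```

Estimating several components repeats this with a decorrelation step between the weight vectors, e.g. the symmetric scheme given on the slide.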

  9. Nonlinear ICA • Model: x = f(s), with f an unknown invertible nonlinear mixing • Existence and uniqueness of solutions: there is always an infinity of solutions if the space of nonlinear mixing functions is not restricted • Post-nonlinear problem: mixing x_i = f_i(Σ_j a_ij s_j), demixing y_i = Σ_j w_ij g_j(x_j)
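
To make the post-nonlinear structure concrete, the sketch below (with hypothetical nonlinearities and mixing matrix of my own choosing) generates mixtures x_i = f_i(sum_j a_ij s_j); a post-nonlinear demixer would first undo each f_i with a componentwise inverse g_i and then apply a linear separation.

```python
import numpy as np

rng = np.random.default_rng(3)

# Independent sources and an assumed linear mixing matrix
s = np.vstack([rng.laplace(size=1000), rng.uniform(-1, 1, 1000)])
A = np.array([[1.0, 0.4],
              [0.3, 1.0]])

# Post-nonlinear mixing: each channel passes through its own invertible nonlinearity f_i
f = [np.tanh, lambda u: u + 0.2 * u**3]
x = np.vstack([f_i(channel) for f_i, channel in zip(f, A @ s)])

# A post-nonlinear demixer estimates componentwise inverses g_i ~ f_i^{-1} and a matrix W,
# then forms y_i = sum_j w_ij g_j(x_j); estimating both parts is what makes the problem hard.
```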

  10. Algorithms for Nonlinear ICA • Burel’s approach: neural solution; nonlinearities known, parameters unknown • Krob & Benidir: higher-order moments, polynomial mixtures • Pajunen et al.: SOMs (self-organizing maps), locally factorable pdf • Pajunen et al.: GTM (generative topographic mapping), output distribution matched to the known source distributions • Post-nonlinear mixtures: • Taleb & Jutten: adaptive componentwise separation • Yang et al.: two-layer neural network • Puntonet et al.: nonlinearities restricted to power functions, geometrical considerations
