
Presentation Transcript


1. LTSI. Semi-nonnegative INDSCAL analysis. Ahmad Karfoul (1), Julie Coloigner (2,3), Laurent Albera (2,3), Pierre Comon (4,5). (1) Faculty of Mech. & Elec. Engineering, University AL-Baath, Syria. (2) Laboratory LTSI - INSERM U642, France. (3) University of Rennes 1, France. (4) Laboratory I3S - CNRS, France. (5) University of Nice Sophia-Antipolis, France.

2. Outline • Preliminaries and problem formulation • Global line search • Optimization methods • A compact matrix form of derivatives • Numerical results • Conclusion

3. Preliminaries and problem formulation: the outer product. Example, order 3: $\mathcal{T} = a \circ b \circ c$, i.e. $T_{ijk} = a_i b_j c_k$. Example, order q: the outer product of q vectors, $a^{(1)} \circ \cdots \circ a^{(q)}$, is a rank-one q-th order tensor.
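A minimal numpy sketch of this definition (the helper name and the vector sizes are illustrative, not from the slides):

```python
import numpy as np

# Outer product of q vectors -> rank-one q-th order tensor:
# T[i1, ..., iq] = v1[i1] * ... * vq[iq].
def outer_product(*vectors):
    tensor = vectors[0]
    for v in vectors[1:]:
        tensor = np.tensordot(tensor, v, axes=0)  # append one mode per vector
    return tensor

a, b, c = np.random.rand(4), np.random.rand(3), np.random.rand(5)
T = outer_product(a, b, c)                        # order 3, shape (4, 3, 5)
assert np.isclose(T[1, 2, 0], a[1] * b[2] * c[0])
```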

4. Preliminaries and problem formulation. $T_{(i)}$: tensor-to-rectangular-matrix transformation (unfolding according to the i-th mode). $\operatorname{vec}(\mathcal{T})$: tensor-to-vector transformation.
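A sketch of both transformations in numpy, under one common unfolding convention (the column ordering is an assumption; conventions differ across papers):

```python
import numpy as np

# Mode-i unfolding: mode i becomes the rows, all other modes are
# flattened into the columns. Vectorization flattens the whole tensor.
def unfold(tensor, mode):
    return np.moveaxis(tensor, mode, 0).reshape(tensor.shape[mode], -1)

def vectorize(tensor):
    return tensor.reshape(-1)

T = np.arange(24).reshape(2, 3, 4)
print(unfold(T, 1).shape)   # (3, 8): mode 1 has size 3, 2*4 = 8 columns
print(vectorize(T).shape)   # (24,)
```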

5. Preliminaries and problem formulation: CANonical Decomposition (CAND) [Hitchcock 1927], [Carroll & Chang 1970], [Harshman 1970]. CAND: linear combination of a minimal number of rank-1 terms, $\mathcal{T} = \sum_{p=1}^{P} \lambda_p \, a_p \circ b_p \circ c_p$.

6. Preliminaries and problem formulation: INDSCAL decomposition [Carroll & Chang 1970], $\mathcal{T} = \sum_{p=1}^{P} \lambda_p \, a_p \circ a_p \circ c_p$, i.e. a CAND whose first two factor matrices coincide.
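A sketch of both synthesis formulas via einsum (dimensions illustrative):

```python
import numpy as np

# CAND: T = sum_p lambda_p * (a_p o b_p o c_p); INDSCAL is the special
# case B = A, which makes T symmetric in its first two modes.
def cand(lmbda, A, B, C):
    return np.einsum('p,ip,jp,kp->ijk', lmbda, A, B, C)

N, K, P = 5, 7, 3
lmbda = np.random.rand(P)
A, C = np.random.rand(N, P), np.random.rand(K, P)
T = cand(lmbda, A, A, C)                          # INDSCAL structure
assert np.allclose(T, T.transpose(1, 0, 2))       # symmetric in modes 1, 2
```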

7. Preliminaries and problem formulation: CANonical Decomposition (CAND) vs. INDSCAL decomposition. INDSCAL = CAND of a third-order tensor symmetric in two of its three modes.

8. Preliminaries and problem formulation: (semi-)nonnegative INDSCAL decomposition for (semi-)nonnegative BSS. Example: diagonalizing a set of covariance matrices of the model $x = As$, where $A$ is the (N × P) mixing matrix and $s$ a zero-mean random vector of P statistically independent components; by independence, the covariance matrix is $R_x = A \, R_s \, A^{\mathsf T}$ with a diagonal core $R_s$. Case 1: nonnegative INDSCAL decomposition. Case 2: semi-nonnegative INDSCAL decomposition.
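A sketch of this BSS example: covariance matrices of $x = As$ share the structure $A D_k A^{\mathsf T}$ with diagonal $D_k$, so stacking them yields a tensor with INDSCAL structure (all sizes and the slice count are illustrative):

```python
import numpy as np

# Each slice T[:, :, k] = A @ diag(D[k]) @ A.T is a covariance-like matrix;
# a nonnegative mixing matrix A gives the (semi-)nonnegative INDSCAL case.
N, P, K = 6, 3, 10
A = np.abs(np.random.randn(N, P))           # nonnegative mixing matrix
D = np.abs(np.random.randn(K, P))           # per-slice source variances
T = np.einsum('ip,jp,kp->ijk', A, A, D)
assert np.allclose(T[:, :, 0], A @ np.diag(D[0]) @ A.T)
```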

9. Preliminaries and problem formulation: the problem at hand. Problem 1 (constrained problem): given $\mathcal{T}$, find its INDSCAL decomposition subject to a nonnegativity constraint. Problem 2 (unconstrained problem): given $\mathcal{T}$, find its INDSCAL decomposition after parametrizing the nonnegativity constraint [Chu et al. 04] as $A = B * B$, with $*$ the Hadamard product (element-wise product), which turns the constrained problem into an unconstrained one in B.
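A one-line illustration of this parametrization: optimizing over an unconstrained B keeps A nonnegative by construction, so no projection step is needed:

```python
import numpy as np

# Hadamard-square parametrization of the nonnegativity constraint:
# B is free (any sign), yet A = B * B is elementwise nonnegative.
B = np.random.randn(6, 3)     # unconstrained variable
A = B * B                     # element-wise (Hadamard) product
assert (A >= 0).all()
```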

10. Preliminaries and problem formulation. Solution: minimize the cost function $\psi(A, C) = \| T_{(3)} - C (A \odot A)^{\mathsf T} \|_F^2$, with $\odot$ the Khatri-Rao product (column-wise Kronecker product) and $A = B * B$. Some iterative algorithms, using the first- and second-order derivatives of ψ: • Steepest Descent • Newton • Levenberg-Marquardt.
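A sketch of this cost using `scipy.linalg.khatri_rao`; the mode-3 unfolding convention matches the `unfold` sketch above and is an assumption:

```python
import numpy as np
from scipy.linalg import khatri_rao

# psi(A, C) = || T_(3) - C (A kr A)^T ||_F^2, where kr is the
# column-wise Kronecker (Khatri-Rao) product.
def unfold(tensor, mode):
    return np.moveaxis(tensor, mode, 0).reshape(tensor.shape[mode], -1)

def psi(T, A, C):
    return np.linalg.norm(unfold(T, 2) - C @ khatri_rao(A, A).T, 'fro') ** 2

N, K, P = 5, 7, 3
A, C = np.random.rand(N, P), np.random.rand(K, P)
T = np.einsum('ip,jp,kp->ijk', A, A, C)
print(psi(T, A, C))           # ~0 at an exact decomposition
```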

11. Optimization methods: global line search (1/2). Looking for the global optimum in a given direction. Update rules: $A \leftarrow A + \mu_A G_A$, $C \leftarrow C + \mu_C G_C$, with $\mu_A$, $\mu_C$ the learning steps and $G_A$, $G_C$ the directions given by the iterative algorithm with respect to A and C, respectively.

12. Optimization methods: global line search (2/2). Minimization of ψ with respect to the learning steps, for a third-order tensor symmetric in two modes. Global optimum in the considered direction: for $\mu_C$, a stationary point of a quadratic polynomial; for the steps computed jointly, a stationary point of a 24-th degree polynomial; for $\mu_A$ alone, a stationary point of a 10-th degree polynomial.
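Since the cost restricted to a search direction is a polynomial of known bounded degree, its global minimizer on the line can be recovered exactly from samples; a sketch (the sampling grid and the degree bound are illustrative):

```python
import numpy as np

# Global line search: fit the (polynomial) cost along the line exactly
# from degree+1 samples, then take the best real stationary point.
def global_line_search(cost_on_line, degree):
    mus = np.linspace(-2.0, 2.0, degree + 1)
    coeffs = np.polyfit(mus, [cost_on_line(m) for m in mus], degree)
    stationary = np.roots(np.polyder(coeffs))
    real = stationary[np.abs(stationary.imag) < 1e-9].real
    return real[np.argmin(np.polyval(coeffs, real))]

# Example: a quartic with two local minima; the global one is found.
f = lambda m: (m - 1.0) ** 2 * (m + 2.0) ** 2 + m
print(global_line_search(f, degree=4))
```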

13. Optimization methods: Steepest Descent (SD). Optimization by searching for stationary points of ψ based on a first-order approximation (i.e. the gradient). Update rules: $A \leftarrow A - \mu_A \nabla_A \psi$, $C \leftarrow C - \mu_C \nabla_C \psi$, with $\mu_A$, $\mu_C$ the learning steps and $\nabla_A \psi$, $\nabla_C \psi$ the gradients of ψ with respect to A and C, respectively. In this work: • learning steps are optimal (optimal line search) → global optimum in the considered direction • gradients are given in a compact matrix form.
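A self-contained steepest-descent sketch, with a fixed step and a numerical gradient as placeholders (on the slides the step comes from the global line search and the gradients from the compact matrix forms):

```python
import numpy as np

# Generic steepest descent: X <- X - mu * grad(f)(X). The numerical
# gradient stands in for the closed-form gradients of psi.
def numerical_grad(f, X, eps=1e-6):
    G = np.zeros_like(X)
    for idx in np.ndindex(X.shape):
        Xp, Xm = X.copy(), X.copy()
        Xp[idx] += eps
        Xm[idx] -= eps
        G[idx] = (f(Xp) - f(Xm)) / (2.0 * eps)
    return G

def steepest_descent(f, X, mu=0.1, iters=200):
    for _ in range(iters):
        X = X - mu * numerical_grad(f, X)
    return X

X = steepest_descent(lambda X: ((X - 3.0) ** 2).sum(), np.zeros((2, 2)))
print(X)                      # converges toward a matrix of 3s
```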

14. Optimization methods: Steepest Descent (SD), a compact matrix form of the gradients. Once the differential of ψ is computed, the gradient expressions are immediate.

15. A compact matrix form of derivatives: gradient computation of ψ(A, C), expressed with $K_{IP}$, a commutation matrix of size (IP × IP), $\mathbf{1}_N$, the N-dimensional vector of ones, and $I_N$, the identity matrix of size (N × N).
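A sketch of the commutation matrix named on this slide (standard definition: $K_{mn}\,\operatorname{vec}(X) = \operatorname{vec}(X^{\mathsf T})$ with column-major vec):

```python
import numpy as np

# Commutation matrix K_{mn}: permutation with K vec(X) = vec(X^T)
# for any (m x n) matrix X, using column-major (Fortran) vec.
def commutation_matrix(m, n):
    K = np.zeros((m * n, m * n))
    for i in range(m):
        for j in range(n):
            K[i * n + j, j * m + i] = 1.0
    return K

X = np.random.rand(3, 4)
K = commutation_matrix(3, 4)
assert np.allclose(K @ X.flatten('F'), X.T.flatten('F'))
```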

16. Optimization methods: Newton. Optimization by including the second-order approximation to accelerate convergence. Update rules: $A \leftarrow A - \mu_A H_A^{-1} \nabla_A \psi$, $C \leftarrow C - \mu_C H_C^{-1} \nabla_C \psi$, with $H_A$, $H_C$ the Hessians of ψ with respect to A and C, respectively. In this work: • learning steps are also computed optimally (global line search) • Hessians are given in a compact matrix form.

17. Optimization methods: Newton, EVD-based regularization. Convergence requirement: the Hessians must be positive definite matrices. Problem: lack of positive definiteness → lack of convergence and slowness. Solution: regularization is necessary, via an Eigen-Value Decomposition (EVD)-based technique. Write the Hessian as $U \Sigma U^{\mathsf T}$, with U the matrix of eigenvectors and $\Sigma = \operatorname{diag}\{\lambda_1, \dots, \lambda_{NP}\}$ the diagonal matrix of eigenvalues; replace all negative eigenvalues by one, either directly or after testing a ratio of the eigenvalues, giving the two variants mNewton1 and mNewton2.
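A sketch of the EVD-based fix described here, replacing negative eigenvalues by one so the Newton system is positive definite (the mNewton1/mNewton2 selection rule is not reproduced):

```python
import numpy as np

# Regularize an indefinite (symmetric) Hessian: eigendecompose,
# replace non-positive eigenvalues by one, and rebuild.
def regularize_hessian(H):
    eigvals, U = np.linalg.eigh(H)
    eigvals = np.where(eigvals <= 0.0, 1.0, eigvals)
    return U @ np.diag(eigvals) @ U.T

H = np.array([[2.0, 3.0], [3.0, 1.0]])          # indefinite example
assert (np.linalg.eigvalsh(regularize_hessian(H)) > 0).all()
```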

18. Optimization methods: Levenberg-Marquardt (LM). Based on a linear approximation to the components of the residual in the neighborhood of A / C, with the Jacobians computed from the differential. Update rules of the standard LM form: $A \leftarrow A - (J_A^{\mathsf T} J_A + \lambda I)^{-1} J_A^{\mathsf T} r$ (and similarly for C), where $J_A$ is the Jacobian of the residual in A and λ is a damping parameter influencing both the direction and the size of the step [Madsen et al. 2004].
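A generic LM step on a residual r with Jacobian J, solving the damped normal equations; the residual here is a toy linear one, not the INDSCAL residual of the slides:

```python
import numpy as np

# One Levenberg-Marquardt step: (J^T J + lam I) delta = -J^T r.
# Small lam ~ Gauss-Newton; large lam ~ a short gradient step.
def lm_step(residual, jacobian, x, lam):
    r, J = residual(x), jacobian(x)
    delta = np.linalg.solve(J.T @ J + lam * np.eye(x.size), -J.T @ r)
    return x + delta

M, b = np.random.rand(8, 3), np.random.rand(8)
x = lm_step(lambda x: M @ x - b, lambda x: M, np.zeros(3), lam=1e-3)
print(np.linalg.norm(M @ x - b))                # near the least-squares fit
```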

19. Numerical results: convergence speed vs. SNR. Setup: a noise-free random third-order tensor, and a noisy 3-way array obtained by adding a zero-mean normally distributed noise tensor scaled by a scalar controlling the noise level. Results averaged over 200 Monte Carlo realizations.
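A sketch of such a noisy setup at a target SNR; the normalization convention is an assumption, since the slides only name the noise model:

```python
import numpy as np

# Additive zero-mean Gaussian noise scaled to a target SNR (in dB),
# measured with Frobenius norms of the flattened tensors.
def add_noise(T, snr_db, rng=np.random.default_rng(0)):
    Nz = rng.standard_normal(T.shape)
    sigma = np.linalg.norm(T) / (np.linalg.norm(Nz) * 10.0 ** (snr_db / 20.0))
    return T + sigma * Nz

T = np.random.rand(5, 5, 7)
for snr in (0, 15, 30):                          # the levels shown next
    print(snr, np.linalg.norm(add_noise(T, snr) - T))
```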

20. Numerical results: convergence speed vs. SNR, SNR = 0 dB.

21. Numerical results: convergence speed vs. SNR, SNR = 15 dB.

22. Numerical results: convergence speed vs. SNR, SNR = 30 dB.

23. Conclusion • Solving an unconstrained semi-nonnegative INDSCAL problem. • The differential concept → a powerful tool for compact matrix forms of derivatives. • Global line search for the symmetric case → global optimum in the considered direction. • Iterative algorithms with global line search → a suitable step to reach the global optimum. • Algebraic method + iterative method with global line search → global optimum.
