
Part 3 Vector Quantization and Mixture Density Model


Presentation Transcript


  1. Part 3 Vector Quantization and Mixture Density Model CSE717, SPRING 2008 CUBS, Univ at Buffalo

  2. Vector Quantization • Quantization • Represents a continuous range of values by a set of discrete values • Example: floating-point representation of real numbers in a computer • Vector Quantization • Represents a data space (vector space) by a discrete set of vectors

  3. Vector Quantizer • A vector quantizer Q is a mapping from the vector space IR^k onto a finite subset of that space: Q: IR^k → Y • Y = {y_1, y_2, …, y_N} is a finite subset of IR^k, referred to as the codebook of Q • Q is usually determined from training data
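
The slides contain no code, so the following is only a minimal NumPy sketch of the mapping Q as a nearest-codeword lookup; the function and variable names are my own, not the slides'.

```python
import numpy as np

def quantize(x, codebook):
    """Map a vector x in IR^k to the nearest codeword y_i in the codebook Y."""
    distances = np.linalg.norm(codebook - x, axis=1)   # distance to each of the N codewords
    return codebook[np.argmin(distances)]

# Example: a codebook of N = 3 codewords in IR^2
Y = np.array([[0.0, 0.0], [1.0, 1.0], [2.0, 0.0]])
print(quantize(np.array([0.9, 0.8]), Y))               # -> [1. 1.]
```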

  4. Partition of Vector Quantizer • The vector space IR^k is partitioned into N cells R_1, R_2, …, R_N by the vector quantizer: R_i = {x ∈ IR^k : Q(x) = y_i}, i = 1, …, N

  5. Properties of Partition • The vector quantizer Q defines a complete and disjoint partition of IR^k into R_1, R_2, …, R_N, i.e., R_1 ∪ … ∪ R_N = IR^k and R_i ∩ R_j = ∅ for i ≠ j

  6. Quantization Error • Quantization error for a single vector x: d(x, Q(x)), where d is a suitable distance measure (e.g., the squared Euclidean distance d(x, y) = ||x − y||²) • Overall quantization error over a sample set {x_1, …, x_n}: D = (1/n) Σ_k d(x_k, Q(x_k))
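
As an illustration of the two definitions, a small sketch (my own code) that measures the per-vector error d(x, Q(x)) with the squared Euclidean distance and averages it over a sample set:

```python
import numpy as np

def quantization_error(X, codebook):
    """Average squared-Euclidean quantization error of the samples X under a codebook."""
    # pairwise squared distances between all n samples and all N codewords: shape (n, N)
    d2 = ((X[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=2)
    return d2.min(axis=1).mean()       # d(x_k, Q(x_k)) is the distance to the nearest codeword

X = np.random.default_rng(0).normal(size=(100, 2))
Y = np.array([[0.0, 0.0], [1.0, 1.0]])
print(quantization_error(X, Y))
```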

  7. Nearest-Neighbor Condition • The minimum quantization error of a given codebook Y is achieved by the nearest-neighbor partition: R_i = {x : d(x, y_i) ≤ d(x, y_j) for all j}, i.e., each x is quantized to its closest codeword, Q(x) = arg min_{y_i ∈ Y} d(x, y_i) • (Figure: a sample x assigned to the nearest of the codewords y_1, y_2, y_3.)
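
The nearest-neighbor partition can be computed by assigning every sample the index of its closest codeword; a short sketch with my own naming:

```python
import numpy as np

def nearest_neighbor_partition(X, codebook):
    """Return, for each sample in X, the index i of the cell R_i it falls into."""
    d2 = ((X[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=2)
    return d2.argmin(axis=1)           # shape (n,), values in {0, ..., N-1}
```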

  8. Centroid Condition • For a fixed partition R_1, …, R_N, the quantization error is minimized by choosing the centroid of each cell as its codeword: y_i = cent(R_i) • For the Euclidean distance d, the centroid of a cell R_i is the mean of the vectors in it: cent(R_i) = (1/|R_i|) Σ_{x ∈ R_i} x
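
For the Euclidean case the centroid of each cell is simply the mean of the samples assigned to it, so the codebook update can be sketched as follows (assuming no cell is empty):

```python
import numpy as np

def update_codebook(X, labels, N):
    """Replace each codeword by the centroid (mean) of the samples in its cell R_i."""
    # note: this sketch assumes every cell R_i contains at least one sample
    return np.array([X[labels == i].mean(axis=0) for i in range(N)])
```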

  9. Vector Quantizer Design – General Steps • 1. Determine an initial codebook Y_0 • 2. Partition the sample data for the current codebook Y_m using the nearest-neighbor condition • 3. Update the codebook Y_m → Y_{m+1} using the centroid condition • 4. Check a convergence criterion; if it is met, return the current codebook Y_{m+1}, otherwise go to step 2 (Flowchart: steps 2–4 are repeated until the convergence test answers yes.)
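
Putting the two conditions together gives the iteration of this slide, which is essentially the generalized Lloyd / k-means loop of the following slides. A hedged sketch that reuses nearest_neighbor_partition, update_codebook, and quantization_error from the sketches above; the relative-improvement stopping rule is my own choice, not the slides':

```python
import numpy as np

def design_vq(X, N, tol=1e-6, max_iter=100, seed=0):
    """Alternate the nearest-neighbor and centroid conditions until convergence."""
    rng = np.random.default_rng(seed)
    Y = X[rng.choice(len(X), size=N, replace=False)]     # step 1: initial codebook Y_0
    prev_err = None
    for _ in range(max_iter):
        labels = nearest_neighbor_partition(X, Y)        # step 2: partition for Y_m
        Y = update_codebook(X, labels, N)                # step 3: Y_m -> Y_{m+1}
        err = quantization_error(X, Y)
        if prev_err is not None and prev_err - err <= tol * prev_err:   # step 4: converged?
            break
        prev_err = err
    return Y
```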

  10. Lloyd’s Algorithm

  11. Lloyd’s Algorithm (Cont)

  12. LBG Algorithm

  13. LBG Algorithm (Cont)
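
The bodies of slides 12 and 13 are not in this transcript. The standard LBG construction grows the codebook by repeatedly splitting each codeword and refining with the Lloyd iteration; the sketch below follows that standard recipe (an assumption on my part, not taken from the slides) and reuses the helper functions from the earlier sketches:

```python
import numpy as np

def lbg(X, target_N, epsilon=0.01, lloyd_passes=20):
    """LBG-style codebook design: split and refine until the codebook reaches target_N."""
    Y = X.mean(axis=0, keepdims=True)        # start from one codeword: the global centroid
    while len(Y) < target_N:                 # target_N is assumed to be a power of two
        # split every codeword into two slightly perturbed copies
        Y = np.vstack([Y * (1 + epsilon), Y * (1 - epsilon)])
        for _ in range(lloyd_passes):        # refine with the iteration of slide 9
            labels = nearest_neighbor_partition(X, Y)
            Y = update_codebook(X, labels, len(Y))
    return Y
```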

  14. k-Means Algorithm

  15. k-Means Algorithm (Cont)
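
Slides 14 and 15 likewise carry no text here; k-means is the same alternation with the squared Euclidean distance, so in practice one can also call an off-the-shelf implementation. A usage sketch with scikit-learn (a library choice of mine, not mentioned in the slides):

```python
import numpy as np
from sklearn.cluster import KMeans

X = np.random.default_rng(0).normal(size=(500, 2))
km = KMeans(n_clusters=8, n_init=10, random_state=0).fit(X)
codebook = km.cluster_centers_     # the N codewords y_1, ..., y_N
labels = km.predict(X)             # index of the cell R_i for each sample
```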

  16. Mixture Density Model • A mixture model of N random variables X_1, …, X_N is defined as follows: the mixture variable takes the value of X_i whenever a label variable takes the value i, where the label is a random variable defined on the N labels 1, …, N

  17. Mixture Density Model • Suppose the p.d.f.'s of X_1, …, X_N are p_1(x), …, p_N(x) and the label probabilities are P_1, …, P_N; then the density of the mixture is p(x) = Σ_i P_i p_i(x)
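
As a concrete check of this formula, a sketch that evaluates a one-dimensional Gaussian mixture density; the component parameters are made up for illustration:

```python
import numpy as np
from scipy.stats import norm

def mixture_pdf(x, weights, means, stds):
    """p(x) = sum_i P_i * p_i(x) for Gaussian component densities p_i."""
    x = np.asarray(x, dtype=float)
    return sum(P_i * norm.pdf(x, loc=m_i, scale=s_i)
               for P_i, m_i, s_i in zip(weights, means, stds))

print(mixture_pdf([0.0, 1.5], weights=[0.6, 0.4], means=[0.0, 1.5], stds=[0.5, 0.3]))
```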

  18. Example: Gaussian Mixture Model of Two Components (Figure: histogram of the samples overlaid with the mixture density.)
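
A figure of this kind can be reproduced by sampling the label first and then drawing from the selected component, as the mixture definition of slide 16 suggests; a matplotlib sketch with made-up parameters, reusing mixture_pdf from the previous sketch:

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
weights, means, stds = [0.6, 0.4], [0.0, 1.5], [0.5, 0.3]

# sample the label first, then draw from the Gaussian component it selects
labels = rng.choice(2, size=2000, p=weights)
samples = rng.normal(np.take(means, labels), np.take(stds, labels))

xs = np.linspace(-2.0, 3.0, 400)
plt.hist(samples, bins=50, density=True, alpha=0.5, label="histogram of samples")
plt.plot(xs, mixture_pdf(xs, weights, means, stds), label="mixture density")
plt.legend()
plt.show()
```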

  19. Estimation of Gaussian Mixture Model • ML Estimation (both the value x and the label are given for every sample) • Samples in the format (x, label): (-0.39, 0), (0.12, 0), (0.94, 1), (1.67, 0), (1.76, 1), … • S1 (subset of samples with label 1): (0.94, 1), (1.76, 1), … • S2 (subset of samples with label 0): (-0.39, 0), (0.12, 0), (1.67, 0), …
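
With the labels given, the ML estimate fits each Gaussian from its own subset (S1, S2 above) and takes the label frequencies as the mixing weights; a sketch on the sample format of this slide, with my own function name:

```python
import numpy as np

def ml_estimate_labeled(x, labels):
    """ML estimates of (P_i, mu_i, sigma_i) when every sample carries its label."""
    x, labels = np.asarray(x, dtype=float), np.asarray(labels)
    params = {}
    for i in np.unique(labels):
        subset = x[labels == i]                      # e.g. S1 (label 1) or S2 (label 0)
        params[i] = dict(P=len(subset) / len(x),     # mixing weight = label frequency
                         mu=subset.mean(),           # sample mean
                         sigma=subset.std())         # ML standard deviation (ddof = 0)
    return params

samples = [(-0.39, 0), (0.12, 0), (0.94, 1), (1.67, 0), (1.76, 1)]
x, labels = zip(*samples)
print(ml_estimate_labeled(x, labels))
```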

  20. Estimation of Gaussian Mixture Model • EM Algorithm (the value x is given, the label is unknown) • 1. Choose initial values of the parameters (the mixing weights, means, and variances) • 2. E-Step: for each sample x_k the label is missing, but we can estimate it by its expected value, the posterior probability γ_ki = P(label = i | x_k) under the current parameters • Samples in the format (x); the label is missing: (-0.39), (0.12), (0.94), (1.67), (1.76), …
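
The expected label used here is the responsibility γ_ki of each component for each sample; a sketch of the E-Step for one-dimensional Gaussian components (the function name and array layout are mine):

```python
import numpy as np
from scipy.stats import norm

def e_step(x, weights, means, stds):
    """Responsibilities gamma[k, i] = P(label = i | x_k) under the current parameters."""
    x = np.asarray(x, dtype=float)
    # unnormalized posteriors P_i * p_i(x_k), shape (n, N)
    joint = np.column_stack([P_i * norm.pdf(x, m_i, s_i)
                             for P_i, m_i, s_i in zip(weights, means, stds)])
    return joint / joint.sum(axis=1, keepdims=True)
```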

  21. Estimation of Gaussian Mixture Model • EM Algorithm (the value x is given, the label is unknown) • 3. M-Step: we can re-estimate the parameters using the expected labels γ_ki from the E-Step: P_i = (1/n) Σ_k γ_ki, μ_i = Σ_k γ_ki x_k / Σ_k γ_ki, σ_i² = Σ_k γ_ki (x_k − μ_i)² / Σ_k γ_ki
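
The M-Step replaces the hard label counts of the labeled case with these expected labels; a sketch consistent with e_step above:

```python
import numpy as np

def m_step(x, gamma):
    """Re-estimate (P_i, mu_i, sigma_i) from the responsibilities gamma of the E-Step."""
    x = np.asarray(x, dtype=float)
    n_i = gamma.sum(axis=0)                     # effective number of samples per component
    weights = n_i / len(x)                      # P_i
    means = (gamma * x[:, None]).sum(axis=0) / n_i
    stds = np.sqrt((gamma * (x[:, None] - means) ** 2).sum(axis=0) / n_i)
    return weights, means, stds
```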

  22. Estimation of Gaussian Mixture Model • EM Algorithm (the value x is given, the label is unknown) • 4. Termination: the log likelihood of the n samples is L = Σ_k log p(x_k) = Σ_k log Σ_i P_i p_i(x_k); at the end of the m-th iteration, if the increase of L over the previous iteration falls below a chosen threshold, terminate; otherwise go to step 2 (the E-Step).
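
Putting the pieces together, a sketch of the full EM loop with the log-likelihood stopping rule; it reuses e_step and m_step from the previous sketches, and the threshold value is my own choice:

```python
import numpy as np
from scipy.stats import norm

def log_likelihood(x, weights, means, stds):
    """L = sum_k log( sum_i P_i * p_i(x_k) )."""
    x = np.asarray(x, dtype=float)
    mix = sum(P_i * norm.pdf(x, m_i, s_i) for P_i, m_i, s_i in zip(weights, means, stds))
    return np.log(mix).sum()

def fit_gmm_em(x, weights, means, stds, tol=1e-6, max_iter=200):
    """EM for a 1-D Gaussian mixture; weights/means/stds are the initial values of step 1."""
    prev = log_likelihood(x, weights, means, stds)
    for _ in range(max_iter):
        gamma = e_step(x, weights, means, stds)        # step 2: E-Step
        weights, means, stds = m_step(x, gamma)        # step 3: M-Step
        cur = log_likelihood(x, weights, means, stds)
        if abs(cur - prev) < tol:                      # step 4: termination test
            break
        prev = cur
    return weights, means, stds
```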
