
HMM-BASED PATTERN DETECTION




  1. HMM-BASED PATTERN DETECTION Image Processing and Reconstruction Winter 2002

  2. Outline • Markov Process • Hidden Markov Models • Elements • Basic Problems • Evaluation • Optimization • Training • Implementation • 2-D HMM • Application • Simulation and Results

  3. Markov Process • The system can be described, at any time, as being in one of N distinct states • Its probabilistic description requires only the current and the previous state: the actual state at time t, q_t, and the state transition probabilities a_ij = P(q_t = S_j | q_{t-1} = S_i) • Each state corresponds to a physical (observable) event • Too restrictive for sophisticated applications (Diagram: states S1, S2, S3 with transitions such as a31)
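
As an illustration of the transition probabilities above, here is a minimal sketch in Python of sampling a state sequence from a hypothetical 3-state chain; the matrix values are made up, not taken from the slides:

```python
import numpy as np

# Hypothetical transition matrix for states S1, S2, S3; rows sum to 1.
# A[i, j] = P(q_t = S_j | q_{t-1} = S_i)
A = np.array([[0.7, 0.2, 0.1],
              [0.1, 0.8, 0.1],
              [0.3, 0.3, 0.4]])

rng = np.random.default_rng(0)

def simulate_chain(A, pi, T):
    """Sample a state sequence of length T, starting from distribution pi."""
    states = [rng.choice(len(pi), p=pi)]
    for _ in range(T - 1):
        states.append(rng.choice(A.shape[0], p=A[states[-1]]))
    return states

print(simulate_chain(A, pi=[1.0, 0.0, 0.0], T=10))
```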

  4. Extension to Hidden Markov Models • A conditionally independent process on a Markov chain • States correspond to clusters of context with similar distributions • Elements of an HMM, λ = (A, B, π): • The state transition probabilities A = {a_ij} • The observation symbol probability distribution in each state, B = {b_j(k)} • The initial state distribution π
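
A minimal sketch of how the three elements λ = (A, B, π) might be laid out for a discrete-symbol HMM; the sizes (N = 2 states, M = 3 symbols) and all probabilities are hypothetical placeholders, not values from the presentation:

```python
import numpy as np

# Hypothetical discrete HMM with N = 2 hidden states and M = 3 observation symbols.
A = np.array([[0.9, 0.1],        # a_ij: state transition probabilities
              [0.2, 0.8]])
B = np.array([[0.6, 0.3, 0.1],   # b_j(k): probability of symbol k in state j
              [0.1, 0.3, 0.6]])
pi = np.array([0.5, 0.5])        # initial state distribution

model = (A, B, pi)               # lambda = (A, B, pi)
```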

  5. Fundamental Problems for HMM • Evaluation: computing the probability of the observation sequence O = O1 O2 … OT given the model λ, i.e. P(O|λ) • Optimization: choosing the optimal state sequence given the observation O and the model λ • Training: estimating the model parameters λ = (A, B, π) to maximize P(O|λ)

  6. Evaluating the Model: the Forward-Backward Algorithm • Direct calculation of P(O|λ) is on the order of 2T·N^T operations; the forward-backward procedure reduces this to the order of N²·T • Forward variable: α_t(i) = P(O1 O2 … Ot, q_t = S_i | λ) • Backward variable: β_t(i) = P(O_{t+1} O_{t+2} … O_T | q_t = S_i, λ)
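
A sketch of the forward and backward recursions for a discrete-symbol HMM, reusing the (A, B, pi) layout from the sketch above; this follows the standard textbook formulation and is not taken verbatim from the slides:

```python
import numpy as np

def forward(A, B, pi, obs):
    """alpha[t, i] = P(O_1 ... O_t, q_t = S_i | lambda).
    Summing the last row gives P(O | lambda) in O(N^2 T) operations."""
    T, N = len(obs), A.shape[0]
    alpha = np.zeros((T, N))
    alpha[0] = pi * B[:, obs[0]]
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
    return alpha

def backward(A, B, obs):
    """beta[t, i] = P(O_{t+1} ... O_T | q_t = S_i, lambda)."""
    T, N = len(obs), A.shape[0]
    beta = np.ones((T, N))
    for t in range(T - 2, -1, -1):
        beta[t] = A @ (B[:, obs[t + 1]] * beta[t + 1])
    return beta

# Example: probability of an observation sequence under the hypothetical model above.
# print(forward(A, B, pi, obs=[0, 2, 1])[-1].sum())
```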

  7. Optimal State Sequence; Solution(s) • One solution: choose the states that are individually most likely; however, the result may not even be a valid state sequence (it can include impossible transitions)!! • Viterbi algorithm: find the single best state sequence that maximizes P(Q|O,λ)
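
A sketch of the Viterbi recursion under the same assumptions; delta holds the best partial-path score and psi the backpointers used to recover the single best state sequence:

```python
import numpy as np

def viterbi(A, B, pi, obs):
    """Return the single best state sequence maximizing P(Q | O, lambda)."""
    T, N = len(obs), A.shape[0]
    delta = np.zeros((T, N))            # best score of any path ending in state i at time t
    psi = np.zeros((T, N), dtype=int)   # argmax backpointers
    delta[0] = pi * B[:, obs[0]]
    for t in range(1, T):
        scores = delta[t - 1][:, None] * A    # scores[i, j]: come from state i, go to state j
        psi[t] = scores.argmax(axis=0)
        delta[t] = scores.max(axis=0) * B[:, obs[t]]
    path = [int(delta[-1].argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(psi[t, path[-1]]))
    return path[::-1]
```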

  8. Training the Model
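
The re-estimation formulas from this slide are not reproduced in the transcript. The standard solution to the training problem is Baum-Welch (EM) re-estimation; below is a sketch of one re-estimation pass, reusing the forward/backward routines above. It is a generic illustration, not necessarily the exact updates shown in the original slide:

```python
import numpy as np

def baum_welch_step(A, B, pi, obs):
    """One Baum-Welch re-estimation pass for a discrete-symbol HMM."""
    obs = np.asarray(obs)
    alpha, beta = forward(A, B, pi, obs), backward(A, B, obs)
    # gamma[t, i] = P(q_t = S_i | O, lambda)
    gamma = alpha * beta
    gamma /= gamma.sum(axis=1, keepdims=True)
    # xi[t, i, j] = P(q_t = S_i, q_{t+1} = S_j | O, lambda)
    xi = (alpha[:-1, :, None] * A[None, :, :] *
          (B[:, obs[1:]].T * beta[1:])[:, None, :])
    xi /= xi.sum(axis=(1, 2), keepdims=True)
    new_pi = gamma[0]
    new_A = xi.sum(axis=0) / gamma[:-1].sum(axis=0)[:, None]
    new_B = np.array([gamma[obs == k].sum(axis=0) for k in range(B.shape[1])]).T
    new_B /= gamma.sum(axis=0)[:, None]
    return new_A, new_B, new_pi
```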

  9. Continuous Observation Distributions • In most applications (speech, image, …), observations cannot be characterized as discrete symbols from a finite alphabet and must instead be modeled by a probability density function (PDF). • The most general representation of the PDF is a finite mixture of normal distributions with different means and variances for each state. • Training then estimates means and variances instead of the discrete probabilities b_j(k)
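
A sketch of a continuous observation density for one state as a finite mixture of Gaussians, b_j(o) = Σ_m c_jm · N(o; μ_jm, σ_jm); the mixture weights, means, and standard deviations below are hypothetical:

```python
import numpy as np

def gaussian(o, mu, sigma):
    """Univariate normal density N(o; mu, sigma)."""
    return np.exp(-0.5 * ((o - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))

def mixture_pdf(o, weights, means, stds):
    """b_j(o) for one state j: a weighted sum of Gaussian components."""
    return float(np.sum(weights * gaussian(o, means, stds)))

# Hypothetical two-component mixture for a single state.
print(mixture_pdf(0.3,
                  weights=np.array([0.4, 0.6]),
                  means=np.array([0.0, 1.0]),
                  stds=np.array([0.5, 0.8])))
```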

  10. Implementation Considerations • Scaling: the dynamic range of α_t(i) and β_t(i) will exceed the precision range of any machine, so both must be rescaled • Multiple observation sequences for training • Initial estimation of HMM parameters: good initial estimates of the observation PDFs help convergence • Choice of model, number of states, and choice of observation PDF
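
One common answer to the scaling issue is to normalize the forward variable at every time step and accumulate log P(O|λ) from the scaling coefficients, so nothing underflows; a sketch under the same assumptions as the earlier code:

```python
import numpy as np

def log_likelihood_scaled(A, B, pi, obs):
    """Scaled forward pass: alpha is renormalized at each t so it stays in
    machine range; log P(O | lambda) = -sum_t log(c_t)."""
    log_prob = 0.0
    alpha = pi * B[:, obs[0]]
    for t in range(len(obs)):
        if t > 0:
            alpha = (alpha @ A) * B[:, obs[t]]
        c = 1.0 / alpha.sum()   # scaling coefficient c_t
        alpha = alpha * c
        log_prob -= np.log(c)
    return log_prob
```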

  11. Two-Dimensional HMM • A set of Markovian sub-states within each super-state • The transition probability of a block depends on the states of its left and top neighbours, P(s_{i,j} | s_{i-1,j}, s_{i,j-1}) • Useful for segmentation (Diagram: a super-state containing sub-states S_{i-1,j}, S_{i,j-1}, S_{i,j})
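
Because the state of a block is conditioned on both its left and top neighbours, the transition "matrix" becomes a three-index table; a minimal sketch with a placeholder uniform table (the names and sizes are illustrative assumptions):

```python
import numpy as np

N = 4                                  # hypothetical number of states
# a[k, l, m] = P(s_{i,j} = m | s_{i-1,j} = k, s_{i,j-1} = l)
a = np.full((N, N, N), 1.0 / N)        # placeholder: uniform transitions

def transition_prob(a, s_top, s_left, s_here):
    """Probability of the current block's state given its top and left neighbours."""
    return a[s_top, s_left, s_here]

print(transition_prob(a, s_top=0, s_left=1, s_here=2))
```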

  12. Application: Pattern Detection (Example images at SNR = -5 and SNR = 10)

  13. Simulations • Feature vector: DCT coefficients, or averages over subsets of them; block size: 16×16 • Images in both the training set and the test set contain differently rotated "jinc" patterns, but their centers and mutual distance are fixed • The K-means clustering algorithm is run for the initial parameter estimates • Comparison with template matching and Learning Vector Quantization (LVQ) • Distance measure for LVQ: a variance-weighted distance, where the weight of each coefficient is its computed variance in the reference centroid • Alternative feature: the average of the absolute values of the coefficients
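
A sketch of extracting block-wise DCT feature vectors as described above (16×16 blocks, keeping a few low-frequency coefficients per block); the keep parameter and the use of SciPy's dct are illustrative assumptions rather than details taken from the slides:

```python
import numpy as np
from scipy.fftpack import dct

def block_dct_features(image, block=16, keep=4):
    """Split the image into block x block tiles and use the top-left
    keep x keep 2-D DCT coefficients of each tile as its feature vector."""
    H, W = image.shape
    feats = []
    for r in range(0, H - block + 1, block):
        for c in range(0, W - block + 1, block):
            tile = image[r:r + block, c:c + block]
            d = dct(dct(tile, axis=0, norm='ortho'), axis=1, norm='ortho')
            feats.append(d[:keep, :keep].ravel())
    return np.array(feats)

# Example: feature vectors for a random 64x64 test image -> shape (16, 16).
print(block_dct_features(np.random.rand(64, 64)).shape)
```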

  14. Results and Conclusion! (Plot: detection error)
