Understanding Alpha and Beta Probabilities in Hidden Markov Models

This lecture explores forward (α) and backward (β) probabilities in Hidden Markov Models (HMMs) as part of an introduction to artificial intelligence. It covers the recursive expressions for the α and β probabilities, their initial conditions, and how they relate to the likelihood of an observation sequence. Using the example sequence "bbba", the lecture shows how these probabilities are computed and why they matter in HMMs, laying the groundwork for further applications in AI.


Presentation Transcript


  1. CS344: Introduction to Artificial Intelligence. Pushpak Bhattacharyya, CSE Dept., IIT Bombay. Lecture 24: Expressions for alpha and beta probabilities

  2. A Simple HMM. [State-transition diagram: a two-state HMM with states q and r; each arc is labelled with the output symbol it emits and its probability. The eight arc labels are a: 0.2, a: 0.3, b: 0.2, b: 0.1, a: 0.2, b: 0.1, b: 0.5, a: 0.4; the diagram itself did not survive the transcript.]
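Since the diagram is lost, the sketch below records those arc labels as a Python table, treating the model as an arc-emission HMM: the symbol is produced on the transition, so transition and emission combine into the single factor P(w_t, S_{t+1} = s_j | S_t = s_i) used in the recursions that follow. Which label sits on which arc is an assumption; the assignment here is one consistent choice that uses exactly the eight labels above and makes each state's outgoing probabilities sum to 1.

    # One consistent reading of the slide-2 diagram (the true arc
    # assignment is an assumption, since the figure did not survive).
    # TRANS[(i, w, j)] = P(w_t = w, S_{t+1} = s_j | S_t = s_i);
    # each state's outgoing probabilities sum to 1.
    TRANS = {
        ('q', 'a', 'q'): 0.2, ('q', 'a', 'r'): 0.2,
        ('q', 'b', 'q'): 0.1, ('q', 'b', 'r'): 0.5,
        ('r', 'a', 'q'): 0.3, ('r', 'a', 'r'): 0.4,
        ('r', 'b', 'q'): 0.2, ('r', 'b', 'r'): 0.1,
    }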

  3. Forward or α-probabilities. Let α_i(t) be the probability of producing w_{1,t-1} while ending up in state s_i, i.e. α_i(t) = P(w_{1,t-1}, S_t = s_i) for t > 1.

  4. Initial condition on α_i(t): α_i(1) = 1.0 if i = 1 (the start state), and 0 otherwise.

  5. Probability of the observation using α_i(t): P(w_{1,n}) = Σ_{i=1}^{σ} P(w_{1,n}, S_{n+1} = s_i) = Σ_{i=1}^{σ} α_i(n+1), where σ is the total number of states.

  6. Recursive expression for α:
     α_j(t+1) = P(w_{1,t}, S_{t+1} = s_j)
              = Σ_{i=1}^{σ} P(w_{1,t}, S_t = s_i, S_{t+1} = s_j)
              = Σ_{i=1}^{σ} P(w_{1,t-1}, S_t = s_i) · P(w_t, S_{t+1} = s_j | w_{1,t-1}, S_t = s_i)
              = Σ_{i=1}^{σ} P(w_{1,t-1}, S_t = s_i) · P(w_t, S_{t+1} = s_j | S_t = s_i)    (Markov assumption)
              = Σ_{i=1}^{σ} α_i(t) · P(w_t, S_{t+1} = s_j | S_t = s_i)
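A minimal Python sketch of this recursion, using the TRANS table above and assuming q is the start state s_1 (the function and variable names are illustrative, not from the slides):

    from collections import defaultdict

    def forward(obs, trans, start='q'):
        """alpha[t][i] = P(w_{1,t-1}, S_t = s_i), for t = 1 .. n+1."""
        n = len(obs)
        alpha = {1: defaultdict(float)}
        alpha[1][start] = 1.0              # slide 4: alpha_i(1) = 1 only for the start state
        for t in range(1, n + 1):
            alpha[t + 1] = defaultdict(float)
            for (si, w, sj), p in trans.items():
                if w == obs[t - 1]:        # w_t is obs[t-1] in 0-indexed Python
                    # alpha_j(t+1) += alpha_i(t) * P(w_t, S_{t+1}=s_j | S_t=s_i)
                    alpha[t + 1][sj] += alpha[t][si] * p
        return alpha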

  7. The forward probabilities of “bbba”
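The α-table on this slide did not survive the transcript, but with the (assumed) parameters above it can be recomputed by running the forward sketch on "bbba":

    alpha = forward("bbba", TRANS)
    for t in sorted(alpha):                # alpha_i(t) for t = 1 .. 5
        print(t, dict(alpha[t]))
    # slide 5: P(w_{1,n}) = sum over i of alpha_i(n+1)
    print("P(bbba) =", sum(alpha[len("bbba") + 1].values()))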

  8. Backward or β-probabilities. Let β_i(t) be the probability of seeing w_{t,n} given that the state of the HMM at time t is s_i, i.e. β_i(t) = P(w_{t,n} | S_t = s_i).

  9. Probability of the observation using β: P(w_{1,n}) = β_1(1), since the HMM is in the start state s_1 at t = 1 with probability 1.

  10. Recursive expression for β:
      β_i(t-1) = P(w_{t-1,n} | S_{t-1} = s_i)
               = Σ_{j=1}^{σ} P(w_{t-1,n}, S_t = s_j | S_{t-1} = s_i)
               = Σ_{j=1}^{σ} P(w_{t-1}, S_t = s_j | S_{t-1} = s_i) · P(w_{t,n} | w_{t-1}, S_t = s_j, S_{t-1} = s_i)
               = Σ_{j=1}^{σ} P(w_{t-1}, S_t = s_j | S_{t-1} = s_i) · P(w_{t,n} | S_t = s_j)    (consequence of the Markov assumption)
               = Σ_{j=1}^{σ} P(w_{t-1}, S_t = s_j | S_{t-1} = s_i) · β_j(t)
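The same assumed table supports a sketch of this recursion, written as a single pass from t = n+1 down to t = 1 (each step is the slide's β_i(t-1) update); the final print checks the identities from slides 5 and 9, P(w_{1,n}) = β_1(1) = Σ_{i=1}^{σ} α_i(n+1), on "bbba":

    def backward(obs, trans, states=('q', 'r')):
        """beta[t][i] = P(w_{t,n} | S_t = s_i), for t = 1 .. n+1."""
        n = len(obs)
        beta = {n + 1: {s: 1.0 for s in states}}   # empty suffix has probability 1
        for t in range(n, 0, -1):
            beta[t] = {s: 0.0 for s in states}
            for (si, w, sj), p in trans.items():
                if w == obs[t - 1]:
                    # beta_i(t) += P(w_t, S_{t+1}=s_j | S_t=s_i) * beta_j(t+1)
                    beta[t][si] += p * beta[t + 1][sj]
        return beta

    beta = backward("bbba", TRANS)
    # beta_1(1) and the sum of alpha_i(n+1) should agree: both equal P(bbba)
    print(beta[1]['q'], sum(forward("bbba", TRANS)[5].values()))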
