Markov Models


Presentation Transcript


  1. Markov Models notes for CSCI-GA.2590 Prof. Grishman

  2. Markov Model • In principle each decision could depend on all the decisions which came before (the tags on all preceding words in the sentence) • But we’ll make life simple by assuming that the decision depends on only the immediately preceding decision • [first-order] Markov Model • representable by a finite state transition network • Tij = probability of a transition from state i to state j
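The first-order assumption on slide 2 means the whole model is just a table Tij of transition probabilities. A minimal sketch in Python, with made-up states and probabilities (the "dog"/"cat"/"end" names and all numbers here are illustrative, not taken from the slides):

```python
import random

# Hypothetical first-order Markov model as a transition table.
# T[i][j] = probability of a transition from state i to state j;
# each row must sum to 1.
T = {
    "start": {"dog": 0.5, "cat": 0.5},
    "dog":   {"dog": 0.5, "cat": 0.3, "end": 0.2},
    "cat":   {"dog": 0.3, "cat": 0.5, "end": 0.2},
}

def sample_path(max_len=10):
    """Walk the chain from 'start' until 'end' (or max_len steps).

    Each step depends only on the current state -- the first-order
    Markov assumption.
    """
    state, path = "start", []
    for _ in range(max_len):
        nxt = list(T[state])
        state = random.choices(nxt, weights=[T[state][s] for s in nxt])[0]
        if state == "end":
            break
        path.append(state)
    return path
```

Because every decision looks only at the immediately preceding state, the model is exactly a finite state transition network like the one on the next slide.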

  3. Finite State Network [figure: state-transition network with states start, dog (says "woof"), cat (says "meow"), and end; arcs labeled with transition probabilities 0.30, 0.40, and 0.50]

  4. Our bilingual pets • Suppose our cat learned to say “woof” and our dog “meow” • … they started chatting in the next room • … and we wanted to know who said what

  5. Hidden State Network [figure: hidden-state network with states start, dog, cat, and end; both dog and cat can emit either "woof" or "meow"]

  6. How do we predict? • When the cat is talking: ti = cat • When the dog is talking: ti = dog • We construct a probabilistic model of the phenomenon • And then seek the most likely state sequence S

  7. Hidden Markov Model • Assume current word depends only on current tag
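Putting slides 5-7 together: the speakers (dog, cat) are hidden states, the words ("woof", "meow") are observations, and "most likely state sequence S" is found by Viterbi decoding. A sketch for the bilingual-pets example; all probabilities are invented for illustration, and the slides do not name Viterbi explicitly:

```python
# Toy HMM for the "bilingual pets" example: hidden states are the
# speakers, observations are the words. Numbers are illustrative only.
STATES = ("dog", "cat")
START = {"dog": 0.5, "cat": 0.5}                # P(first speaker)
TRANS = {"dog": {"dog": 0.8, "cat": 0.2},       # Tij = P(next | current)
         "cat": {"dog": 0.4, "cat": 0.6}}
EMIT  = {"dog": {"woof": 0.9, "meow": 0.1},     # P(word | speaker):
         "cat": {"woof": 0.2, "meow": 0.8}}     # current word depends
                                                # only on current tag

def viterbi(words):
    """Return the most likely hidden state sequence for the words."""
    # best[s] = (prob of the best path ending in state s, that path)
    best = {s: (START[s] * EMIT[s][words[0]], [s]) for s in STATES}
    for w in words[1:]:
        best = {
            s: max(
                ((best[p][0] * TRANS[p][s] * EMIT[s][w], best[p][1] + [s])
                 for p in STATES),
                key=lambda t: t[0],
            )
            for s in STATES
        }
    return max(best.values(), key=lambda t: t[0])[1]
```

For example, `viterbi(["meow", "woof", "woof"])` attributes the "meow" to the cat and the two "woof"s to the dog, which is who said what under these (made-up) probabilities.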
