Chapter 5


  1. Finite State Machines, Transducers, Markov Models, Hidden Markov Models, Büchi Automata (Chapter 5)

  2. Deterministic Finite State Transducers
A Moore machine M = (K, Σ, O, δ, D, s, A), where:
● K is a finite set of states
● Σ is an input alphabet
● O is an output alphabet
● s ∈ K is the initial state
● A ⊆ K is the set of accepting states
● δ is the transition function from (K × Σ) to K
● D is the output function from K to O*
M outputs each time it lands in a state. A Moore machine M computes a function f(w) iff, when it reads the input string w, its output sequence is f(w).
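To make the definition concrete, here is a minimal Python sketch of a Moore-machine simulator; the two-state parity machine at the bottom is a hypothetical illustration, not one of the slides' examples.

# A minimal Moore-machine simulator (a sketch, not the book's code).
# delta maps (state, symbol) -> next state; D maps each state -> an output string.
def run_moore(s, A, delta, D, w):
    state = s
    outputs = [D[state]]              # a Moore machine outputs on entering a state
    for c in w:
        state = delta[(state, c)]
        outputs.append(D[state])
    return "".join(outputs), state in A

# Hypothetical machine: the output tracks the parity of a's read so far.
delta = {("even", "a"): "odd",  ("odd", "a"): "even",
         ("even", "b"): "even", ("odd", "b"): "odd"}
D = {"even": "0", "odd": "1"}
print(run_moore("even", {"even"}, delta, D, "aab"))   # ('0100', True)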

  3. A Simple US Traffic Light Controller

  4. Deterministic Finite State Transducers
A Mealy machine M = (K, Σ, O, δ, s, A), where:
● K is a finite set of states
● Σ is an input alphabet
● O is an output alphabet
● s ∈ K is the initial state
● A ⊆ K is the set of accepting states
● δ is the transition function from (K × Σ) to (K × O*)
M outputs each time it takes a transition. A Mealy machine M computes a function f(w) iff, when it reads the input string w, its output sequence is f(w).

  5. An Odd Parity Generator
After every three bits, output a fourth bit such that each group of four bits has odd parity. For example, on input 001 110 000 111 (spaces added to show the groups), the machine emits 0010 1101 0001 1110.
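One way to realize the parity generator as a Mealy-style transducer in Python; the state encoding below (bit count plus running parity) is an illustrative choice, not the book's diagram.

# Odd-parity generator (sketch): copy each input bit; after every third bit,
# emit a parity bit so the group of four contains an odd number of 1s.
def odd_parity(bits):
    out, count, ones = [], 0, 0
    for b in bits:
        out.append(b)                          # Mealy machines output on transitions
        count, ones = count + 1, (ones + int(b)) % 2
        if count == 3:
            out.append("0" if ones else "1")   # already odd? emit 0; else emit 1
            count, ones = 0, 0
    return "".join(out)

print(odd_parity("001110000111"))   # 0010110100011110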

  6. A Bar Code Scanner

  7. A Bar Code Scanner

  8. A Deterministic Finite State Transducer Interpreter
Let δ1(state, symbol) return a single new state, and δ2(state, symbol) return an element of O*.
ST := s.
Repeat:
    i := get-next-symbol.
    If i ≠ end-of-file then:
        Write(δ2(ST, i)).
        ST := δ1(ST, i).
Until i = end-of-file.
If ST ∈ A then accept else reject.
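The same interpreter in runnable Python, assuming δ1 and δ2 are supplied as dictionaries and the input arrives as an iterable of symbols (the names here are illustrative):

# Python transcription of the interpreter above (a sketch).
# delta1[(state, symbol)] -> next state; delta2[(state, symbol)] -> output string.
def run_transducer(s, A, delta1, delta2, symbols):
    st = s
    for i in symbols:                    # get-next-symbol, until end of input
        print(delta2[(st, i)], end="")   # Write(delta2(ST, i))
        st = delta1[(st, i)]             # ST := delta1(ST, i)
    return st in A                       # accept iff ST is an accepting state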

  9. Bidirectional Transducers Letter substitution:

  10. English Morphology

  11. Soundex
1. If two or more adjacent letters would map to the same number, remove all but the first.
2. Set first letter to first letter of the name.
3. For all other letters of the name do:
   3.1 Convert the letters B, P, F, V, C, S, G, J, K, Q, X, Z, D, T, L, M, N, and R to numbers:
       B, P, F, V = 1
       C, S, G, J, K, Q, X, Z = 2
       D, T = 3
       L = 4
       M, N = 5
       R = 6
   3.2 Delete all instances of the letters A, E, I, O, U, Y, H, and W.
4. If the string contains more than three numbers, delete all but the leftmost three.
5. If the string contains fewer than three numbers, pad with 0's on the right to get three.
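A runnable Python version of these steps (a sketch; published Soundex variants differ on edge cases such as H or W between two consonants with the same code):

# Soundex, following the five steps above (sketch).
CODES = {**dict.fromkeys("BPFV", "1"), **dict.fromkeys("CSGJKQXZ", "2"),
         **dict.fromkeys("DT", "3"), "L": "4", "M": "5", "N": "5", "R": "6"}

def soundex(name):
    name = name.upper()
    # Step 1: collapse runs of adjacent letters that map to the same number.
    letters = [name[0]]
    for c in name[1:]:
        if c not in CODES or CODES[c] != CODES.get(letters[-1]):
            letters.append(c)
    # Steps 2-3: keep the first letter; convert the rest, dropping A E I O U Y H W.
    digits = [CODES[c] for c in letters[1:] if c in CODES]
    # Steps 4-5: keep at most three numbers, padding with 0's on the right.
    return letters[0] + "".join(digits[:3]).ljust(3, "0")

print(soundex("Kaylor"))   # K460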

  12. Soundex http://www.cs.utexas.edu/~ear/cs341/automatabook/phoneticstringmatching_link.html?http://resources.rootsweb.com/cgi-bin/soundexconverter Try: Kaylor

  13. Soundex

  14. Stochastic FSMs

  15. Markov Models
A Markov model is a triple M = (K, π, A), where:
● K is a finite set of states
● π is a vector of initial probabilities
● A[p, q] = Pr(state q at time t | state p at time t - 1)

  16. Markov Models What is the probability that it will be sunny five days in a row?

  17. Markov Models What is the probability that it will be sunny five days in a row? .4 × (.75)^4 ≈ .1266

  18. Markov Models
To use a Markov model, we first need to use data to create the matrix A. What can we do with a Markov model?
● Generate almost-natural behavior.
● Estimate the probability of some outcome.
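A sketch of the first use, generating behavior by sampling; π and A below form a hypothetical weather model chosen to be consistent with the numbers on the surrounding slides.

import random

# Sample an n-step state sequence from a Markov model (sketch).
# pi: state -> initial probability; A: state -> {next state: probability}.
def generate(pi, A, n):
    state = random.choices(list(pi), weights=list(pi.values()))[0]
    path = [state]
    for _ in range(n - 1):
        state = random.choices(list(A[state]), weights=list(A[state].values()))[0]
        path.append(state)
    return path

pi = {"Sunny": 0.4, "Rainy": 0.6}                     # hypothetical numbers
A = {"Sunny": {"Sunny": 0.75, "Rainy": 0.25},
     "Rainy": {"Sunny": 0.3,  "Rainy": 0.7}}
print(generate(pi, A, 5))                             # e.g. ['Sunny', 'Sunny', ...]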

  19. Estimating Probabilities Given a Markov model that describes some random process, what is the probability that we will observe a particular sequence S1S2 … Sn of states?
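The answer is the chain-rule product Pr(S1 S2 … Sn) = π[S1] × A[S1, S2] × … × A[Sn-1, Sn]; a sketch using the dictionaries from the previous example:

# Probability of observing a particular state sequence (sketch).
def sequence_probability(pi, A, states):
    p = pi[states[0]]
    for prev, cur in zip(states, states[1:]):
        p *= A[prev][cur]
    return p

print(sequence_probability(pi, A, ["Sunny"] * 5))     # 0.4 * 0.75**4 ≈ 0.1266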

  20. Modeling System Performance If up now, what is the probability of staying up for an hour?

  21. Modeling System Performance If up now, what is the probability of staying up for an hour (3600 time steps)?

  22. Modeling System Performance If up now, what is the probability of staying up for an hour (3600 time steps)? .95^3600 ≈ 6.3823 × 10^-81
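A one-line check of that arithmetic, assuming the model's per-step probability of staying up is .95:

print(0.95 ** 3600)    # ≈ 6.3823e-81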

  23. An Example from Biology What happens to AB pairs? We compute the probability that the model, if it starts in state AB and runs for n generations, will land in state AB. That probability is .5^n:

  24. Nth Order Markov Models
• 0th order models depend on no prior state.
• 1st order models depend on one previous state.
• …
• nth order models depend on n previous states.

  25. Musikalisches Würfelspiel (a musical dice game): a 0th order model

  26. A Letter-Level Model of English
● (k = 1): a a idjume Thicha lanbede f nghecom isonys rar t r ores aty Ela ancuny, ithi, witheis weche
● (k = 2): Ther to for an th she con simach a so a impty dough par we forate for len postrit cal nowillopecide allexis inteme numbectionsityFSM Cons onste on codere elexpre ther
● (k = 3): Ouput that the collowing with to that we'll in which of that is returesult is alway ther is id, the cal on the Prove be and N.
● (k = 4): Notice out at least to steps if new Turing derived for explored. What this to check solved each equal string it matrix (i, k, y must be put part can may generated grammar in D.
● (k = 5): So states, and Marting rules of strings. We may have been regions to see, a list. If ? ? unrestricted grammars exist a devices are constructive-state i back to computation
● (k = 6): We'll have letter substituted languages that L(G) since we drop the address to the rule1 were counterexample, that is that we are true when we switched in how we
● (k = 7): If it does a context-free language 3. If the model of which corresponding b's. M must have chosen strings as a tree such characters of some p.

  27. A Word-Level Model of English
● (k = 1): there exists at the idea was presented for some finite state 3 together. So U begins in this approach, it is labeled with wj as some model to position-list, solve-15 can reduce every derivation becomes M1 and the number of A building efficient algorithms.
● (k = 2): The language to be if the various grammar formalisms in which they were deposited make no moves are possible. The competition can come from somewhere. Fortunately, there are edges from level nodes to level nodes. Now suppose that we do with a successful value.
● (k = 4): Again, let st be the working string at any point in its computation it will have executed only a finite number of squares can be nonblank. And, even if M never halts, at any point in its computation it will have executed only a finite number of choices at each derivation step and since each path that is generated must eventually end, the Turing machine M that computes it.
● (k = 5): Is there any computational solution to the problem? • If there is, can it be implemented using some fixed amount of memory? • If there is no such element, then choose will: • Halt and return False if all the actions halt and return False. • Fail to halt if there is no mechanism for determining that no elements of S that satisfy P exist. This may happen either because v and y are both nonempty and they both occur in region n
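Samples like these come from k-gram models: count, for every length-k context in a corpus, which symbol follows it, then generate by repeated weighted sampling. A compact sketch (the corpus and k are whatever you train on; word-level models work the same way over token lists):

import random
from collections import defaultdict

# Train a k-th order letter model on a corpus string, then generate text (sketch).
def kgram_generate(corpus, k, length):
    model = defaultdict(list)
    for i in range(len(corpus) - k):
        model[corpus[i:i + k]].append(corpus[i + k])      # context -> next letters
    out = corpus[:k]                                      # seed with the first context
    for _ in range(length):
        context = out[-k:] if k else ""
        out += random.choice(model.get(context, corpus))  # back off if context unseen
    return out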

  28. Hidden Markov Models Suppose that the states themselves are not visible. But states emit outputs with certain probabilities and the outputs are visible: If we could view the states:

  29. When We Cannot View the States
● The decoding problem: We observe the report ###L from London.
● The evaluation problem: We observe the report ###L from somewhere.
● The learning problem.

  30. Hidden Markov Models
An HMM M is a quintuple (K, O, π, A, B), where:
● K is a finite set of states
● O is the output alphabet
● π is a vector of initial probabilities of the states
● A is a matrix of transition probabilities: A[p, q] = Pr(state q at time t | state p at time t - 1)
● B is the confusion matrix of output probabilities: B[q, o] = Pr(output o | state q)

  31. The Decoding Problem
We observe the report ###L from London. What was the weather? We use the Viterbi algorithm:
candidate-score[Sunny] = score[Sunny, 0] × A[Sunny, Sunny] × B[Sunny, #] = .55 × .75 × .3 ≈ .12
candidate-score[Rainy] = score[Rainy, 0] × A[Rainy, Sunny] × B[Rainy, #] = .45 × .3 × .8 ≈ .11
So score[Sunny, 1] = max(.12, .11) = .12, and back(Sunny, 1) is set to Sunny.
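A compact Viterbi sketch in the standard formulation (best predecessor per state per step, then follow the back pointers); the model numbers below are hypothetical stand-ins, since the slides' full probability tables are not reproduced here.

# Viterbi decoding for an HMM (sketch, standard formulation).
def viterbi(pi, A, B, observations):
    states = list(pi)
    score = {q: pi[q] * B[q][observations[0]] for q in states}
    back = []                             # back[t][q] = best predecessor of q
    for o in observations[1:]:
        prev, step = dict(score), {}
        for q in states:
            best = max(states, key=lambda p: prev[p] * A[p][q])
            step[q] = best
            score[q] = prev[best] * A[best][q] * B[q][o]
        back.append(step)
    last = max(states, key=score.get)     # best final state
    path = [last]
    for step in reversed(back):           # walk the back pointers to the start
        path.append(step[path[-1]])
    return list(reversed(path))

pi = {"Sunny": 0.55, "Rainy": 0.45}                     # hypothetical numbers
A  = {"Sunny": {"Sunny": 0.75, "Rainy": 0.25},
      "Rainy": {"Sunny": 0.3,  "Rainy": 0.7}}
B  = {"Sunny": {"#": 0.3, "L": 0.7}, "Rainy": {"#": 0.8, "L": 0.2}}
print(viterbi(pi, A, B, ["#", "#", "#", "L"]))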

  32. An Example from Biology
The sequence matching problem: align a family of related sequences, for example:
A G H T Y W D N R
A G H D T Y E N N R
Y Y P A G Q D T Y W N N
A G H D T T Y W N N

  33. Büchi Automata
A Büchi automaton is a quintuple (K, Σ, Δ, S, A), where:
● K is a finite set of states
● Σ is the input alphabet
● S ⊆ K is a set of initial states
● A ⊆ K is the set of accepting states
● Δ is the transition relation, a finite subset of (K × Σ) × K
A computation of a Büchi automaton M is an infinite sequence C0, C1, … such that:
● C0 is an initial configuration, and
● C0 |-M C1 |-M C2 |-M …
M accepts a string w ∈ Σ^ω iff, in at least one of its computations, there is some accepting state q such that, when processing w, M enters q an infinite number of times. The language accepted by M, denoted L(M), is the set of all strings accepted by M.

  34. Examples of Büchi Automata Let Σ = {a, b, c, d, e}. Event e must occur at least once:

  35. Examples of Büchi Automata Let Σ = {a, b, c, d, e}. There must come a point after which only e's can occur:

  36. Examples of Büchi Automata Let Σ = {a, b, c, d, e}. There must come a point after which only e's can occur:

  37. Examples of Büchi Automata Let Σ = {a, b, c, d, e}. Every c event must be immediately followed by an e event:

  38. Examples of Büchi Automata Let Σ = {a, b, c, d, e}. Every c event must be immediately followed by an e event:

  39. Mutual Exclusion
CR0 will be True iff process0 is in its critical region. CR1 will be True iff process1 is in its critical region. Legal inputs: {(CR0 ∧ ¬CR1), (¬CR0 ∧ CR1), True}. Accepts all and only the input sequences that satisfy the property that (CR0 ∧ CR1) never occurs:

  40. Nondeterminism Matters L = {w ∈ {a, b}^ω : #b(w) is finite}. A nondeterministic Büchi automaton:

  41. Nondeterminism Matters L = {w ∈ {a, b}^ω : #b(w) is finite}. Converting the ND machine to deterministic: What's wrong?

  42. Nondeterminism Matters L = {w ∈ {a, b}^ω : #b(w) is finite}. What happens on input (ba)^ω?

  43. Closure Properties of Büchi Automata
The Büchi-acceptable languages are closed under:
• Concatenation with a regular language: if L1 is a regular language and L2 is a Büchi-acceptable language, then L1L2 is Büchi-acceptable.
• Union: if L1 and L2 are Büchi-acceptable, then L1 ∪ L2 is also Büchi-acceptable.
• Intersection: if L1 and L2 are Büchi-acceptable, then L1 ∩ L2 is also Büchi-acceptable.
• Complement: if L is Büchi-acceptable, then ¬L (the complement of L) is also Büchi-acceptable.

  44. Decision Procedures for Büchi Automata
There exist decision procedures for all of the following properties:
• Emptiness: Given a Büchi automaton B, is L(B) empty?
• Nonemptiness: Given a Büchi automaton B, is L(B) nonempty?
• Inclusion: Given two Büchi automata B1 and B2, is L(B1) ⊆ L(B2)?
• Equivalence: Given two Büchi automata B1 and B2, is L(B1) = L(B2)?
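The nonemptiness procedure is the simplest to sketch: L(B) is nonempty iff some accepting state is reachable from an initial state and lies on a cycle. A small Python version over an explicit transition relation; the automaton at the bottom, for "event e must occur at least once", is an illustrative guess at slide 34's machine.

# Büchi nonemptiness check (sketch). delta is a list of (state, symbol, state).
def buchi_nonempty(delta, initial, accepting):
    def reachable(sources):
        seen, stack = set(sources), list(sources)
        while stack:
            p = stack.pop()
            for (q, _, r) in delta:
                if q == p and r not in seen:
                    seen.add(r)
                    stack.append(r)
        return seen

    for f in reachable(initial) & set(accepting):
        successors = {r for (q, _, r) in delta if q == f}
        if f in reachable(successors):      # f lies on a cycle
            return True
    return False

delta = ([("q0", c, "q0") for c in "abcd"] + [("q0", "e", "q1")] +
         [("q1", c, "q1") for c in "abcde"])
print(buchi_nonempty(delta, {"q0"}, {"q1"}))   # True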
