
Finite-State Automata


Presentation Transcript


  1. Finite-State Automata Shallow Processing Techniques for NLP Ling570 October 5, 2011

  2. Roadmap • Finite-state automata variants • Non-deterministic Finite-State Automata (NFA) • Probabilistic Finite-State Automata (PFSA) • Finite-State Transducers (FST) • Intro • CARMEL

  3. FSAs Formally • A Finite-State Automaton (FSA) is a 5-tuple: • A set of states Q {q0,q1,q2,q3,q4} • A finite alphabet Σ {b,a,!} • A start state q0 • A set of accepting states {q4} • A transition function δ: Q × Σ → Q
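A minimal sketch, in Python, of the sheeptalk FSA above written out as an explicit 5-tuple; the dict encoding and the accepts function are illustrative choices, not part of the original slides.

states = {"q0", "q1", "q2", "q3", "q4"}
alphabet = {"b", "a", "!"}
start = "q0"
accepting = {"q4"}
# Transition function delta: Q x Sigma -> Q, encoded as a dict over (state, symbol) pairs.
delta = {
    ("q0", "b"): "q1",
    ("q1", "a"): "q2",
    ("q2", "a"): "q3",
    ("q3", "a"): "q3",   # loop: any number of additional a's
    ("q3", "!"): "q4",
}

def accepts(string):
    """Run the DFA over the input; accept iff it ends in an accepting state."""
    state = start
    for symbol in string:
        if (state, symbol) not in delta:
            return False              # no legal transition: reject
        state = delta[(state, symbol)]
    return state in accepting

print(accepts("baaa!"))   # True
print(accepts("ba!"))     # False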

  4. Non-deterministic Finite-State Automata • Deterministic FSA: • Given current state x input, at most one legal transition

  5. Non-deterministic Finite-State Automata • Deterministic FSA: • Given current state x input, at most one legal transition • Sources of non-determinism in NFA

  6. Non-deterministic Finite-State Automata • Deterministic FSA: • Given current state x input, at most one legal transition • Sources of non-determinism in NFA • More than one transition for qi x a

  7. Non-deterministic Finite-State Automata • Deterministic FSA: • Given current state x input, at most one legal transition • Sources of non-determinism in NFA • More than one transition for qi x a • ε-transitions • Transitions that consume no input

  8. Non-deterministic Finite-State Automata • Deterministic FSA: • Given current state x input, at most one legal transition • Sources of non-determinism in NFA • More than one transition for qi x a • ε-transitions • Transitions that consume no input • Multiple start states • Transitions: • δ: Q × (Σ ∪ {ε}) → 2^Q
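A sketch of how an NFA transition table of type Q × (Σ ∪ {ε}) → 2^Q might be encoded, again using the sheeptalk machine, here with an ε-transition from q3 back to q2 as one possible non-deterministic variant. Using None for ε and the epsilon_closure helper are illustrative choices, not from the slides.

EPS = None   # stands for the empty string (epsilon)

# Sheeptalk NFA: same machine as before, plus an epsilon-transition
# from q3 back to q2.
nfa_delta = {
    ("q0", "b"): {"q1"},
    ("q1", "a"): {"q2"},
    ("q2", "a"): {"q3"},
    ("q3", EPS): {"q2"},   # consumes no input
    ("q3", "!"): {"q4"},
}

def epsilon_closure(states):
    """All states reachable from `states` using epsilon-transitions alone."""
    stack, closure = list(states), set(states)
    while stack:
        q = stack.pop()
        for r in nfa_delta.get((q, EPS), set()):
            if r not in closure:
                closure.add(r)
                stack.append(r)
    return closure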

  9. Sheeptalk NFA • Multiple choice:

  10. Sheeptalk NFA • Multiple choice: • ε-transitions

  11. NFA & DFA • What is the relationship b/t NFAs and DFAs?

  12. NFA & DFA • What is the relationship b/t NFAs and DFAs? • NFAs and DFAs are equivalent • Accept same languages

  13. NFA & DFA • What is the relationship b/t NFAs and DFAs? • NFAs and DFAs are equivalent • Accept same languages • Standard transformation from NFA to DFA: • Create new state for each equivalence class in NFA

  14. NFA & DFA • What is the relationship b/t NFAs and DFAs? • NFAs and DFAs are equivalent • Accept same languages • Standard transformation from NFA to DFA: • Create new state for each equivalence class in NFA • If NFA has N states, up to 2^N states in DFA

  15. NFA & DFA • What is the relationship b/t NFAs and DFAs? • NFAs and DFAs are equivalent • Accept same languages • Standard transformation from NFA to DFA: • Create new state for each equivalence class in NFA • If NFA has N states, up to 2^N states in DFA • Why use both?
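A minimal sketch of the subset construction mentioned above, reusing the nfa_delta / epsilon_closure sketch from earlier; each DFA state is a frozenset of NFA states (one "equivalence class" of NFA configurations), which is why at most 2^N DFA states can arise.

def nfa_to_dfa(nfa_start, alphabet):
    """Subset construction: each DFA state is a set of NFA states."""
    dfa_start = frozenset(epsilon_closure({nfa_start}))
    dfa_delta = {}
    seen = {dfa_start}
    todo = [dfa_start]
    while todo:
        S = todo.pop()
        for a in alphabet:
            # move on symbol a from every NFA state in S, then close under epsilon
            moved = set()
            for q in S:
                moved |= nfa_delta.get((q, a), set())
            T = frozenset(epsilon_closure(moved))
            if not T:
                continue              # dead configuration; omit an explicit sink state
            dfa_delta[(S, a)] = T
            if T not in seen:
                seen.add(T)
                todo.append(T)
    return dfa_start, dfa_delta, seen

dfa_start, dfa_delta, dfa_states = nfa_to_dfa("q0", {"b", "a", "!"})
dfa_accepting = {S for S in dfa_states if "q4" in S}
# len(dfa_states) is bounded above by 2 ** (number of NFA states)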

  16. Managing Non-Determinism • Approaches to non-determinism

  17. Managing Non-Determinism • Approaches to non-determinism • Backup/backtracking: • At choice points, mark state, input • If fail, return and try alternative

  18. Managing Non-Determinism • Approaches to non-determinism • Backup/backtracking: • At choice points, mark state, input • If fail, return and try alternative • Look-ahead: • Look ahead to see which choice to take

  19. Managing Non-Determinism • Approaches to non-determinism • Backup/backtracking: • At choice points, mark state, input • If fail, return and try alternative • Look-ahead: • Look ahead to see which choice to take • Parallelism: • At choice points, consider all paths in parallel • Basically the NFA → DFA conversion process
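A sketch of the parallelism strategy: carry the whole set of NFA states that are still alive after each input symbol, which amounts to doing the subset construction on the fly. It reuses the earlier nfa_delta / epsilon_closure sketch; the function name and defaults are illustrative.

def nfa_accepts_parallel(string, start="q0", accepting=frozenset({"q4"})):
    """Follow all paths at once: track the set of live states after each symbol."""
    current = epsilon_closure({start})
    for symbol in string:
        nxt = set()
        for q in current:
            nxt |= nfa_delta.get((q, symbol), set())
        current = epsilon_closure(nxt)
        if not current:
            return False              # every path has died
    return bool(current & accepting)

print(nfa_accepts_parallel("baaa!"))   # True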

  20. Recognition in NFAs • Two standard approaches:

  21. Recognition in NFAs • Two standard approaches: • Transformation: • Convert NFA → DFA • Perform standard DFA recognition

  22. Recognition in NFAs • Two standard approaches: • Transformation: • Convert NFA → DFA • Perform standard DFA recognition • Search: • Explicitly model recognition as state-space search • Perform backtracking as necessary

  23. Recognition in NFA • For strings in the language:

  24. Recognition in NFA • For strings in the language: • There exists some state sequence leading to a final state

  25. Recognition in NFA • For strings in the language: • There exists some state sequence leading to a final state • Not all paths necessarily lead to an accept state • For strings not in the language:

  26. Recognition in NFA • For strings in the language: • There exists some state sequence leading to a final state • Not all paths necessarily lead to an accept state • For strings not in the language: • No paths lead to an accept state

  27. NFA Recognition as Search • Search problem:

  28. NFA Recognition as Search • Search problem: • Start search-state • (start node, beginning of input)

  29. NFA Recognition as Search • Search problem: • Start search-state • (start node, beginning of input) • Goal test • (final node, end of input)

  30. NFA Recognition as Search • Search problem: • Start search-state • (start node, beginning of input) • Goal test • (final node, end of input) • Successor function • Transitions

  31. NFA Recognition as Search • Search problem: • Start search-state • (start node, beginning of input) • Goal test • (final node, end of input) • Successor function • Transitions • Path cost

  32. NFA Recognition as Search • Search problem: • Start search-state • (start node, beginning of input) • Goal test • (final node, end of input) • Successor function • Transitions • Path cost • Use standard search algorithms • e.g. depth-first, breadth-first, A*
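A sketch of recognition as explicit state-space search, following the slide's formulation: a search state is a (node, input position) pair, the goal test is (accepting node, end of input), and successors come from the transition table. This version pops from a stack, so it searches depth-first with implicit backtracking; it reuses nfa_delta / EPS from the earlier sketch.

def nfa_accepts_search(string, start="q0", accepting=frozenset({"q4"})):
    agenda = [(start, 0)]             # stack of (state, input position) search states
    visited = set()                   # guards against epsilon-cycles
    while agenda:
        state, pos = agenda.pop()     # pop from the end = depth-first with backtracking
        if (state, pos) in visited:
            continue
        visited.add((state, pos))
        if state in accepting and pos == len(string):
            return True               # goal test: final node at end of input
        # successor function, part 1: epsilon-transitions keep the input position
        for r in nfa_delta.get((state, EPS), set()):
            agenda.append((r, pos))
        # successor function, part 2: ordinary transitions consume one symbol
        if pos < len(string):
            for r in nfa_delta.get((state, string[pos]), set()):
                agenda.append((r, pos + 1))
    return False                      # no path led to an accept state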

  33.-40. Example (figure-only slides)

  41. Breadth-first Search
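A breadth-first variant of the same search, assuming the same nfa_delta / EPS encoding: identical to the sketch above except that the agenda is a FIFO queue, so search states are explored in order of path length rather than by following one path all the way down first.

from collections import deque

def nfa_accepts_bfs(string, start="q0", accepting=frozenset({"q4"})):
    agenda = deque([(start, 0)])
    visited = {(start, 0)}
    while agenda:
        state, pos = agenda.popleft()     # FIFO queue = breadth-first
        if state in accepting and pos == len(string):
            return True
        successors = [(r, pos) for r in nfa_delta.get((state, EPS), set())]
        if pos < len(string):
            successors += [(r, pos + 1)
                           for r in nfa_delta.get((state, string[pos]), set())]
        for s in successors:
            if s not in visited:
                visited.add(s)
                agenda.append(s)
    return False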

  42.-45. Equivalences and Operations • FSAs equivalent to Regular Languages (Regex, etc.) • Can be shown by induction • Demonstrate primitive regex ops imitated by automata

  46.-47. Equivalences and Operations • FSAs equivalent to Regular Languages (Regex, etc.) • Can be shown by induction • Demonstrate primitive regex ops imitated by automata • E.g. Concatenation
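A sketch of the concatenation construction on NFAs: wire every accepting state of the first machine to the start state of the second with an ε-transition. The (start, accepting, delta) 3-tuple encoding is an illustrative choice, not from the slides, and it assumes the two machines' state names do not clash.

def concat(nfa1, nfa2):
    """Concatenation: L(nfa1) followed by L(nfa2)."""
    start1, accept1, delta1 = nfa1
    start2, accept2, delta2 = nfa2
    # copy the transition sets so the input machines are left untouched
    delta = {k: set(v) for d in (delta1, delta2) for k, v in d.items()}
    for f in accept1:
        delta.setdefault((f, EPS), set()).add(start2)   # epsilon edge into nfa2
    return (start1, accept2, delta)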

  48. Equivalences and Operations • Closure: • Union:
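Sketches of the union and Kleene closure constructions on the same 3-tuple encoding; the new_start parameter stands in for any fresh state name not already used by the input machines, a hypothetical detail rather than anything from the slides.

def union(nfa1, nfa2, new_start):
    """Union: accept anything accepted by either machine."""
    start1, accept1, delta1 = nfa1
    start2, accept2, delta2 = nfa2
    delta = {k: set(v) for d in (delta1, delta2) for k, v in d.items()}
    delta[(new_start, EPS)] = {start1, start2}   # fresh start, epsilon into both machines
    return (new_start, accept1 | accept2, delta)

def closure(nfa, new_start):
    """Kleene closure: zero or more repetitions of the machine's language."""
    start, accept, old_delta = nfa
    delta = {k: set(v) for k, v in old_delta.items()}
    delta[(new_start, EPS)] = {start}            # new accepting start feeds the old start
    for f in accept:
        delta.setdefault((f, EPS), set()).add(start)   # loop back for another repetition
    return (new_start, accept | {new_start}, delta)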
