
Automata, Grammars and Languages

This article provides an overview of Turing Machines (TMs), their basic features, and their applications in computer science. It explains the components of a TM, its acceptance and recognition capabilities, and its relationship to other models of computation such as the RAM. The article also discusses TM extensions and Church's Thesis.



Presentation Transcript


1. Automata, Grammars and Languages. Discourse 05: Turing Machines. C SC 473 Automata, Grammars & Languages

2. The TM Model
[Figure: a finite-state control (holding the current state) with a read/write head over an unbounded tape of cells 0, 1, 2, 3, ...; cells past the input hold blanks ⊔, and each cell stores a fixed number of bits.]
• Model of an "effectively computable procedure" (including all algorithms)
• As primitive as possible (mathematical simplicity and simple constructions)
• Arbitrary definition choices, but now standard
• Easy comparison with other models (RAM, etc.)
• Finitely describable (just a formatted string)
• 1 machine = 1 procedure (no stored program; a different FSA for each TM)
• Computes in discrete steps (moves); each step is physically trivial
• Simple complexity measures (steps = time, cells = space)

3. The TM Model (cont.)
[Figure: finite-state control attached to a read/write head scanning one cell of the tape.]
• Control unit: an FSA with state set Q
• Input: the character in the cell under the R/W head
• Output: a new state, an overstrike character, and a head move L or R
• Initial state, accept state, reject state
• Tape unit:
  • a finite-length string of characters with blanks ⊔⊔⊔⊔⋯ to the right
  • the first ⊔ marks the right end of the input
  • fixed tape alphabet Γ (⊔ ∈ Γ); input alphabet Σ ⊆ Γ − {⊔}
  • tape bounded on the left and unbounded on the right

4. TM Examples (input and R/W head left-adjusted at start)
• A TM that scans right over non-blanks and halts over the first blank (transitions 0→0,R and 1→1,R keep scanning; on ⊔ it writes ⊔, moves L, and halts)
• A TM that accepts inputs with an odd number of 1's (odd parity): two parity states, where 0→0,R preserves the parity, 1→1,R flips it, and ⊔→⊔,L halts; a concrete simulation is sketched below
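
To make the parity example concrete, here is a minimal sketch in Python of a one-tape deterministic TM simulator running a two-state parity machine. The state names ("even", "odd") and the table layout are illustrative assumptions, not the slide's exact diagram.

```python
# Minimal 1-tape deterministic TM simulator (illustrative sketch).
# The transition table maps (state, symbol) -> (new_state, write_symbol, move).

BLANK = "_"  # stands in for the blank symbol (drawn as a square cup on the slides)

# Hypothetical two-state machine accepting strings with an odd number of 1's:
# 'even'/'odd' track the parity of 1's seen so far; on the first blank it halts.
parity_tm = {
    ("even", "0"): ("even", "0", "R"),
    ("even", "1"): ("odd", "1", "R"),
    ("even", BLANK): ("reject", BLANK, "L"),
    ("odd", "0"): ("odd", "0", "R"),
    ("odd", "1"): ("even", "1", "R"),
    ("odd", BLANK): ("accept", BLANK, "L"),
}

def run_tm(delta, w, start="even", max_steps=10_000):
    """Run a deterministic TM on input w; return 'accept', 'reject', or 'no halt'."""
    tape = list(w) if w else [BLANK]
    state, head = start, 0
    for _ in range(max_steps):
        if state in ("accept", "reject"):
            return state
        symbol = tape[head] if head < len(tape) else BLANK
        if (state, symbol) not in delta:        # no move defined: halt non-accepting
            return "reject"
        state, write, move = delta[(state, symbol)]
        if head == len(tape):
            tape.append(BLANK)                  # extend the unbounded tape on demand
        tape[head] = write
        head = head + 1 if move == "R" else max(head - 1, 0)  # can't fall off the left end
    return "no halt"

print(run_tm(parity_tm, "10110"))   # three 1's -> accept
print(run_tm(parity_tm, "1001"))    # two 1's  -> reject
```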

5. TM Definition (1-tape deterministic TM)
• A 7-tuple M = (Q, Σ, Γ, δ, q0, q_accept, q_reject)
• Q = finite set of states
• Γ = finite tape alphabet containing the blank ⊔; input alphabet Σ ⊆ Γ − {⊔}
• Transition function δ : Q × Γ → Q × Γ × {L, R}
• Initial state q0; halt (accept/reject) states q_accept, q_reject
• Configuration: a string u q v with u, v ∈ Γ* and q ∈ Q, meaning the tape holds uv, the state is q, and the head scans the first character of v
• (1-step) yields relation ⊢ between configurations
• Note: the definition lets the machine "move left" from the leftmost cell, although the head doesn't really move. This convention makes accept and reject the only halt states.
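
The configuration and one-step yields relation can be phrased directly in code. A small sketch, assuming configurations are triples of strings (u, q, v) and the same (new_state, write, move) transition format as in the earlier sketch:

```python
BLANK = "_"

def yields(delta, config):
    """One step of the yields relation: config = (u, q, v) with the head on v[0]."""
    u, q, v = config
    a = v[0] if v else BLANK                    # reading past the input sees a blank
    if (q, a) not in delta:
        return None                             # halting configuration: no move defined
    q2, b, move = delta[(q, a)]
    rest = v[1:]
    if move == "R":
        return (u + b, q2, rest)
    if not u:                                   # "move left" from the leftmost cell:
        return ("", q2, b + rest)               # the head stays put (slide convention)
    return (u[:-1], q2, u[-1] + b + rest)

# The start configuration on input w is ("", q0, w); repeatedly apply yields()
# until it returns None or the state becomes accept/reject.
```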

6. Acceptance
• Move relation (multi-step): ⊢* is the reflexive, transitive closure of ⊢
• w ∈ Σ* is accepted by M iff q0 w ⊢* u q_accept v for some u, v ∈ Γ*
• Language recognized by M: L(M) = { w ∈ Σ* | M accepts w }
• TM-recognizable language: a set A such that A = L(M) for some TM M
• Initial configuration: q0 w
• A configuration I is halting iff there is nowhere to move; from the definition there are only 2 ways to halt:
  • a rejecting or accepting configuration
  • a configuration for which no move is defined
• Any halting configuration without q_accept is non-accepting
• Can always "force" non-accepting halting configurations to be rejecting: for each undefined δ(q, a), just add a transition to q_reject

7. Turing-decidable vs. Turing-recognizable
[Figure: a recognizer for L takes w and may answer "yes"; a decider for L takes w and always answers "yes" or "no".]
• A language L is Turing-recognizable iff L = L(M) for some TM M
  • M need not, but might, halt rejecting on some strings not in L
  • Such a TM is an acceptor or recognizer for L (a "procedure")
• A language L is Turing-decidable iff L = L(M) for some TM M that halts on every input string (accepts or rejects). Such a TM is called a decider or algorithm.
• Note: "yes" means M enters q_accept and halts; "no" means M enters q_reject and halts

8. Example
[Figure: state diagram of an acceptor that marks input symbols, overstriking a with A, b with B, c with C; transitions shown include a→A,R; b→B,R; c→C,L; A→A,R/L; B→B,R/L; C→C,R; a→a,R/L; b→b,R/L; c→c,L; ⊔→⊔,L.]
• All transitions not shown go to the reject state (see the dotted example), so the machine halts rejecting unless the accepting state is reached

9. Acceptance (cont.)
• A sequence of configurations C0 ⊢ C1 ⊢ ⋯ ⊢ Ct is a computation of length (time) t
• A computation is in a tight loop iff a configuration repeats
• (1) How does a TM accept w? The computation from q0 w reaches an accepting configuration.
• (2) How does a TM reject w? Three ways:
  (a) the computation enters a tight loop
  (b) the computation reaches a halting, rejecting configuration
  (c) the computation never reaches a halting configuration and no configuration repeats; call this a divergent computation
• NOTE: Can make (1) coincide with halting. Can eliminate (a) in favor of (b). (How?) But there is no algorithm to detect (c) and eliminate it (as we shall see).

10. Acceptance (cont.)
• Recognizers and deciders, summary:
  • M recognizes S iff S = L(M)
  • M decides S iff M recognizes S and M halts on every input: it accepts each w ∈ S and rejects (and halts on) each w ∉ S
• S is a Turing-recognizable language iff there is some TM that recognizes it
• S is a decidable language iff there is some TM that decides it; such a TM is called a decider or algorithm
• NOTE: every TM recognizes some language, but only "special" TMs, ones that halt for every input, decide the language they recognize

11. A Look Ahead
[Figure: nested classes of sets: Regular ⊆ Context-free ⊆ Decidable ⊆ Turing-recognizable, with example sets A, B, C, D, E placed in the successive regions.]

12. TM Extensions: Church's Thesis
• Church's Thesis (Church-Turing Thesis): the effectively computable functions are those characterized by one of the "standard formalisms" such as the TM
• Evidence for Church's Thesis: the convenience of having alternate, equivalent models
• Two models of computation are equivalent iff:
  • whenever a language L is recognized by a machine in one model, there is an algorithm to construct a machine recognizing L in the second, and vice versa
  • whenever a function f is computed by a machine in one model, there is an algorithm to construct a machine computing f in the second, and vice versa
• I.e., simulation in both directions by a "compiling" algorithm

13. TM Extensions: k-tape TM
[Figure: k tapes (e.g. one holding a b c d e f ... and another holding A B C D E F G ...) packed into a single tape whose cells 0, 1, 2, 3, ... each carry k tracks plus head markers.]
• The k tapes are simulated by one tape with k tracks and "software" heads
• a in cell i of tape 1 corresponds to a on track 1 of cell i
• a in cell i of tape 1 with head 1 scanning that cell corresponds to a head-marked a on track 1 of cell i
• The tape alphabet consists of k-tuples of tape symbols, each possibly carrying a head mark
• To simulate one k-tape move: the head sweeps L to R until it has stored the k scanned characters in the finite control; it sweeps R to L and marks the changes; it sweeps L to R and then R to L to move the head marks; each cycle ends scanning cell 0
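
A sketch of the single-tape, multi-track encoding, assuming each composite cell is simply a Python tuple of k (symbol, head_is_here) pairs; the helper names are illustrative.

```python
BLANK = "_"

def pack_tapes(tapes, heads):
    """Combine k tapes (strings) and k head positions into one track-per-tape tape.
    Cell i of the combined tape is a k-tuple of (symbol, head_is_here) pairs."""
    width = max(max(len(t) for t in tapes), max(heads) + 1)
    combined = []
    for i in range(width):
        cell = tuple(
            (t[i] if i < len(t) else BLANK, heads[k] == i)
            for k, t in enumerate(tapes)
        )
        combined.append(cell)
    return combined

def scan_heads(combined):
    """One left-to-right sweep: collect the symbol under each marked head,
    as the simulating machine would store them in its finite control."""
    scanned = {}
    for cell in combined:
        for track, (sym, marked) in enumerate(cell):
            if marked:
                scanned[track] = sym
    return scanned

tapes = ["abcdef", "ABCDEFG"]
combined = pack_tapes(tapes, heads=[1, 0])
print(scan_heads(combined))   # {0: 'b', 1: 'A'}
```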

14. Nondeterministic TM
• A 7-tuple N = (Q, Σ, Γ, δ, q0, q_accept, q_reject), as before except for δ
• Transition function δ : Q × Γ → P(Q × Γ × {L, R})
• For each (q, a), δ(q, a) provides a set of choices
• Acceptance means: some sequence of configurations leads to a configuration with state q_accept
• No use for a reject state (rejection is not by entering a state)
• Rejection means no possible chain of configurations leads to one with q_accept
• No next move is possible when δ(q, a) is the empty set of choices

15. Nondeterministic TM (cont.)
• Acceptance: w ∈ L(N) iff q0 w ⊢* u q_accept v for some choice of moves
• Note the existential quantifier: w is accepted if there is some sequence of "guesses" that drives the initial tape configuration to an accepting configuration
• A word w is not accepted iff every possible computation starting with q0 w fails to enter an accepting configuration

16. Choice Numbers
• There may be several choices for a given state and symbol
• For each q and a, assign each choice in δ(q, a) a number
• For each machine there is some largest number of choices over all transitions; call it b
• Strings over {1, 2, ..., b} can be interpreted as deterministic directions for which choice to make from each configuration

17. Computation Tree
[Figure: the tree of configurations reachable from the initial configuration; each root-to-node path is labeled by a choice sequence.]

18. NTM Equivalent to TM
• Theorem 3.16. If L is accepted by an NTM N, there is a DTM D, constructible by algorithm, that accepts L.
• Proof: Simulate all possible computations of N for all possible choice strings, and halt if a sequence of choices is found that leads to an accepting configuration.
  D has 3 tapes: a read-only input tape, a worktape simulating that of N, and an enumeration tape. On the latter, enumerate all choice sequences in lexicographic order.
  Main cycle: generate the next choice sequence c on the enumeration tape. Use this sequence to drive a computation of N for a total of |c| steps. If an accepting configuration is reached, D accepts and halts. If the computation halts rejecting, move to the next main cycle after clearing the worktape. Otherwise move to the next main cycle after |c| steps.
  Iterate the main cycle while an accepting configuration has not been found. If an accepting c exists, D will eventually find it. If none exists, the input will not be accepted. So L(D) = L(N). QED
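
A sketch of the proof idea in Python, assuming the nondeterministic table maps (state, symbol) to a list of choices and that choice sequences are tried in length-lexicographic order; the helper names and the toy machine are illustrative assumptions, and the search is bounded so the sketch always terminates.

```python
from itertools import product

BLANK = "_"

def drive(delta, w, choices):
    """Run the NTM for at most len(choices) steps, using choices[i] to pick the
    i-th move (modulo the number of options). Returns 'accept', 'reject' or None."""
    tape, state, head = list(w) or [BLANK], "q0", 0
    for c in choices:
        if state == "accept":
            return "accept"
        sym = tape[head] if head < len(tape) else BLANK
        options = delta.get((state, sym), [])
        if not options:
            return "reject"                      # no move possible: halt non-accepting
        state, write, move = options[c % len(options)]
        if head == len(tape):
            tape.append(BLANK)
        tape[head] = write
        head = head + 1 if move == "R" else max(head - 1, 0)
    return "accept" if state == "accept" else None   # None: ran out of choices

def deterministic_accepts(delta, w, b, max_len=12):
    """Deterministic simulation: try every choice string over {0..b-1} in
    length-lexicographic order (bounded here so the sketch always terminates)."""
    for length in range(max_len + 1):
        for choices in product(range(b), repeat=length):
            if drive(delta, w, choices) == "accept":
                return True
    return False

# Toy NTM over {0,1}: from q0 it may either keep scanning right or "guess" that the
# current 1 is the last symbol; it accepts exactly the strings ending in 1.
ntm = {
    ("q0", "0"): [("q0", "0", "R")],
    ("q0", "1"): [("q0", "1", "R"), ("check", "1", "R")],
    ("check", BLANK): [("accept", BLANK, "L")],
}
print(deterministic_accepts(ntm, "0101", b=2))   # True
print(deterministic_accepts(ntm, "0110", b=2))   # False
```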

19. The Universal TM
• Any (hardware) TM M can be encoded as a formatted string ⟨M⟩ (software); encoding details below
• The UTM U reads ⟨M, w⟩ and simulates the action of M on w
• The UTM U is one fixed, finite machine, capable of simulating any TM (an interpreter)
• For example, on input ⟨M, w⟩, U gives the same result as M gives on input w
• We shall see that, whenever universality exists, unsolvability is an inevitable consequence

20. Canonical Encoding of TM M
• Let M = (Q, Σ, Γ, δ, q0, q_accept, q_reject)
• Encode M over a 9-symbol alphabet
[Table: each object in M (states, tape symbols, moves, transitions) paired with its encoding over that alphabet.]

21. Canonical Encoding of TM M (cont.)
• ⟨M⟩ is a word over the 9-symbol alphabet
• The final encoding is the ASCII (binary) form of that string
• Notational convention: an encoding ⟨M⟩ of a TM M "is" (the binary representation of) a natural number
  • this number is called the Gödel number of M
  • conversely, if e is a natural number, M_e is the TM with that Gödel number
  • if e is a syntactically invalid code, M_e is by definition a TM that halts and prints 0 on every input
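
The slide's exact 9-symbol alphabet did not survive this transcript, but the idea can be sketched with any unambiguous scheme; the scheme below (states and symbols as decimal indices, fields separated by punctuation, then the ASCII bytes read as one integer) is an illustrative assumption, not the course's encoding.

```python
def encode_tm(transitions, states, symbols):
    """Encode a transition table as a single string, then as a natural number.
    transitions: dict (state, symbol) -> (state, symbol, move) with move in 'LR'."""
    s_index = {s: i for i, s in enumerate(states)}
    a_index = {a: i for i, a in enumerate(symbols)}
    parts = []
    for (q, a), (q2, b, move) in sorted(
            transitions.items(),
            key=lambda kv: (s_index[kv[0][0]], a_index[kv[0][1]])):
        parts.append(f"{s_index[q]},{a_index[a]},{s_index[q2]},{a_index[b]},{move};")
    code_string = "".join(parts)
    # "Goedel number": read the ASCII bytes of the code string as one big integer
    goedel_number = int.from_bytes(code_string.encode("ascii"), "big")
    return code_string, goedel_number

tm = {("even", "1"): ("odd", "1", "R"), ("odd", "1"): ("even", "1", "R")}
code, number = encode_tm(tm, states=["even", "odd"], symbols=["0", "1", "_"])
print(code)     # "0,1,1,1,R;1,1,0,1,R;"
print(number)   # the corresponding natural number
```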

22. UTM Construction
• Use 4 tapes: the input tape, a program tape, a state tape, and the worktape of the M being simulated
• Parse the input ⟨M, w⟩, copying ⟨M⟩ to the program tape. If the input is invalid, put [0] on the worktape and halt. Else copy w to the worktape and put [0] on the state tape, representing the simulated start state.
• While the state tape does not hold a halting state, do {
  • If the state tape contains [i], count over to the tuples for state [i] on the program tape. Use the character a under scan on the worktape ([0], [1], ...) to scan to the correct 5-tuple.
  • Overwrite the scanned character c(a) on the worktape by c(b) and move the head in direction D on the worktape.
  • Copy the new state string from the program tape to the state tape (after erasing the previous state string). }
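
A sketch of the UTM's main loop, assuming the encoded machine has already been parsed back into a transition dictionary (the "program"), so the four tapes become four Python variables; this follows the while-loop on the slide rather than a real tape-level implementation, and the state numbering is an assumption.

```python
BLANK = "_"

def universal(program, w, start_state=0, accept_state=1, reject_state=2, max_steps=100_000):
    """Interpret an encoded TM: program maps (state, symbol) -> (state, symbol, move),
    with states represented as the integers used by the encoding ([0], [1], ...)."""
    worktape = list(w) if w else [BLANK]     # worktape of the simulated M
    state_tape = start_state                 # state tape of the simulated M
    head = 0
    for _ in range(max_steps):
        if state_tape in (accept_state, reject_state):
            return "accept" if state_tape == accept_state else "reject"
        a = worktape[head] if head < len(worktape) else BLANK
        if (state_tape, a) not in program:   # no 5-tuple found for (state, symbol)
            return "reject"
        state_tape, b, move = program[(state_tape, a)]   # new state, symbol, direction
        if head == len(worktape):
            worktape.append(BLANK)
        worktape[head] = b                   # overwrite c(a) by c(b)
        head = head + 1 if move == "R" else max(head - 1, 0)
    return "no halt within step bound"

# The parity machine again, this time given as "software" with numbered states
# (0 = start/even, 1 = accept, 2 = reject, 3 = odd):
encoded = {
    (0, "0"): (0, "0", "R"), (0, "1"): (3, "1", "R"), (0, BLANK): (2, BLANK, "L"),
    (3, "0"): (3, "0", "R"), (3, "1"): (0, "1", "R"), (3, BLANK): (1, BLANK, "L"),
}
print(universal(encoded, "10110"))   # accept (odd number of 1's)
```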

23. UTM Construction (cont.)
[Figure: the 4 tapes of U: a read-only input tape holding ⟨M, w⟩, a read-only program tape holding ⟨M⟩, a tape holding the current state of M (e.g. [0], [1], ...), and the worktape of M.]

24. 2-State, 3-Tape-Symbol UTM
• http://www.wolframscience.com/prizes/tm23/

25. TM Extensions: Enumerators
• A TM that prints out a list of strings, separated by blanks
• Starts with an empty tape
• Defines a set of strings (a language)
• Strings need not be unique (repeated strings are allowed)
• Strings may be enumerated in no particular order
• Usually does not halt; runs on forever printing strings
• Finite or not, an enumerator E defines a set of strings
• Theorem 3.21: A language is Turing-recognizable iff it is enumerable by some enumerator.
• Proof: Must show 2 directions: a language defined by an enumerator has a Turing recognizer, and any TM-recognizable language has an enumerator. We will show there are algorithms for going in both directions.

26. TM Extensions: Enumerators (cont'd)
• Enumerator ⇒ Recognizer. Let A = L(E), where E is an enumerator. Using the code for E, construct a recognizer M as follows.
• M = "On input w:
  1. Continue running E until it outputs the next string s. Pause E. Compare s to w.
  2. If w = s, halt and accept. Otherwise, continue at step 1."
• Machine M recognizes just those strings of A that E enumerates.
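
A sketch of the Enumerator ⇒ Recognizer direction, using a Python generator in place of the enumerator E; the generator and its language are illustrative assumptions.

```python
from itertools import count

def enumerator():
    """Hypothetical enumerator E: yields strings, possibly forever and possibly
    with repetitions, in no particular order. Here: all even-length strings of 0's."""
    for n in count(0):
        yield "0" * (2 * n)

def recognizer(w, enumerate_language=enumerator):
    """Recognizer M built from E: run E, and accept w as soon as E prints it.
    If w is never printed, this loops forever, exactly like the recognizer on the slide."""
    for s in enumerate_language():
        if s == w:
            return "accept"

print(recognizer("0000"))        # accept
# recognizer("000") would loop forever: "000" is never enumerated.
```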

27. TM Extensions: Enumerators (cont'd)
• Recognizer ⇒ Enumerator. Let A = L(M), where M is a TM recognizer. Using the code for M, construct an enumerator E as follows. Let s1, s2, s3, ... be a list of all strings in Σ*. (It is easy to build a routine that generates strings in lexical order.)
• E = "Ignore the input.
  for i = 1 to ∞ do {
    Run M for i steps on each of the inputs s1, s2, ..., si.
    If any computation sequence accepts an input, print that input out. }"
• E (eventually) enumerates every string M accepts
• Technique: "interleaving computations"; a sketch follows below
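
A sketch of the Recognizer ⇒ Enumerator direction, using a step-bounded run of a deterministic TM in place of "run M for i steps". The helper names, the shortlex generator, and the bound on the outer loop are illustrative assumptions (the construction itself never halts).

```python
from itertools import count, product

BLANK = "_"

def run_tm_steps(delta, w, start, steps):
    """Run a deterministic TM on w for at most `steps` moves; True iff it accepts by then."""
    tape, state, head = list(w) or [BLANK], start, 0
    for _ in range(steps):
        if state in ("accept", "reject"):
            break
        a = tape[head] if head < len(tape) else BLANK
        if (state, a) not in delta:
            return False
        state, b, move = delta[(state, a)]
        if head == len(tape):
            tape.append(BLANK)
        tape[head] = b
        head = head + 1 if move == "R" else max(head - 1, 0)
    return state == "accept"

def strings_in_lex_order(alphabet):
    """s1, s2, s3, ...: all strings over the alphabet, shortest first."""
    for length in count(0):
        for chars in product(alphabet, repeat=length):
            yield "".join(chars)

def enumerate_language(delta, start, alphabet, rounds=8):
    """E: in round i, run M for i steps on s1..si; print whatever is accepted.
    (Bounded to `rounds` iterations here; the real E runs forever.)"""
    printed = []
    for i in range(1, rounds + 1):
        gen = strings_in_lex_order(alphabet)
        prefix = [next(gen) for _ in range(i)]
        for s in prefix:
            if run_tm_steps(delta, s, start, i) and s not in printed:
                printed.append(s)
                print(s)
    return printed

# Reusing the parity machine: enumerate (some of) the strings with an odd number of 1's.
parity_tm = {
    ("even", "0"): ("even", "0", "R"),
    ("even", "1"): ("odd", "1", "R"),
    ("even", BLANK): ("reject", BLANK, "L"),
    ("odd", "0"): ("odd", "0", "R"),
    ("odd", "1"): ("even", "1", "R"),
    ("odd", BLANK): ("accept", BLANK, "L"),
}
enumerate_language(parity_tm, "even", "01")   # prints 1, 01, 10, ...
```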

28. What is "Effectively Computable"?
• Early 1900s: logicians sought a universal algorithm that would enable "automatic theorem-proving"
• Worked to find "deciders" to tell:
  • whether formulas of first-order logic are logically true, via a "mechanical" method (the Decision Problem for First-Order Logic)
  • whether multi-variable polynomial equations have integer solutions (Hilbert's 10th Problem)
• Wanted to decide any such question by a 'computation' that is 'effectively performable' in a finite number of computation steps
• Had no idea there were "undecidable" problems

29. "Effectively Computable" (cont'd)
• But what did "effectively computable" mean?
• An intuitive notion: 'you know it when you see it'
• There were no programming languages or computers
• Each author had to pick a "mechanically performable" method to illustrate what was computable
• 1936: Alan Turing proposed Turing Machines
• 1936: Alonzo Church proposed the λ-calculus
• 2 remarkable outcomes:
  • both definition methods proved to be equivalent (they agreed on what was a computable set or function)
  • once a method was picked, it was discovered that there must be undecidable sets
• Eventually many more methods proved equivalent (next slide)

30. Computability: The Evidence for the Church-Turing Thesis
[Figure: many formalisms, all shown equivalent to TM-computable functions / TM-recognizable sets:]
• Gödel & Herbrand (1934), Kleene (1936): General Recursive Functions
• Turing (1936): Turing Machines
• Church (1936): λ-Calculus
• Post (1943): Canonical Systems
• Markov (1954): Markov Algorithms
• Phrase-structure (Type 0) Grammars (1959)
• Shepherdson & Sturgis (1963): Register Machines (RAM)
• Elgot & Robinson (1964): RASPs
• Any programming language to date
• For each pair of representations, there is a uniform algorithm (a "compiler") for translating one to the other

31. Church-Turing Thesis
[Figure: the intuitive concept of algorithm equated with Turing-machine algorithms, RAM algorithms, and Markov algorithms.]
• All methods also agree on what is a "decidable set" and what is a "recognizable set"
• Thesis [OED]: A proposition laid down or stated, esp. as a theme to be discussed and proved, or to be maintained against attack; a statement, assertion, tenet.

32. "Effectively Computable" (cont'd)
• Both decision problems (Hilbert's 10th, 1st-Order Predicate Calculus) were eventually shown to be undecidable
• In our terminology, a decision problem is translated into a language; the problem is decidable iff the language has a Turing decider
• None of the following languages has a decider; they are undecidable languages
• The last of these is called the Halting or Membership Problem
