Definition talk:Turing Machine


I've seen definitions like the one here in a few books, and it gets across the idea, but we might additionally want a more formal definition on this page that matches up with the informal description. I'm bringing this up because the proof on the new Existence_of_Uncomputable_Functions page is apparently hoping to link to a result that will essentially be a redefinition of Turing machines, one that makes that proof easier.

So, the question: do we want a formal definition on this page, or (in view of the fact that there are lots of things which are essentially equivalent to Turing machines and could be called by the same name) do we want separate pages that formally define such objects and justify calling them "Turing machines" by verifying that they fit the informal description here?

Also, I feel I should question the use of the word "algorithm(ic)" in the definition of a Turing machine (even an informal one), since part of the point of studying them is to get a formalized idea and understanding of algorithms. :D

-- Qedetc 17:57, 15 June 2011 (CDT)

It would be a nice-to-have, perhaps (in that it leads us to the concept of "Turing-complete" computer languages, or whatever that malarkey is all about), but we already have a fairly comprehensive analysis of computability from the point of view of an Unlimited Register Machine. All we need to do is prove that a URM can be emulated by a Turing machine and we've done all we need. I do not think it's worth recreating the entire sequence of results from the point of view of a TM.
Besides, that Existence of Uncomputable Functions page has already been deleted. --prime mover 00:16, 16 June 2011 (CDT)
Ah, I see. I hadn't poked around much. With all the register machine stuff here it definitely seems like it would (if nothing else) be an inefficient way to spend time if one were to redo it all with Turing machines. -- Qedetc 00:41, 16 June 2011 (CDT)

Formal definition

Is the following formal definition acceptable?

A (deterministic) Turing machine is an ordered tuple

$\struct {S, I, \Sigma, b, T}$

where:

  • $S$ is the set of states;
  • $I \in S$ is the initial state;
  • $\Sigma$ is the alphabet, the set of symbols permissible on the tape;
  • $b \in \Sigma$ is the blank symbol, the symbol which every cell on the tape is initialized to at the start;
  • $T : S \times \Sigma \to S \times \paren {\Sigma \cup \set {L, R} }$ (where $L, R \notin \Sigma$) is the transition function:
    • If $\map T {s, \sigma} = \tuple {s', L}$ for some $s'$, the head moves one cell to the left.
    • If $\map T {s, \sigma} = \tuple {s', R}$ for some $s'$, the head moves one cell to the right.
    • If $\map T {s, \sigma} = \tuple {s', \sigma'}$ for some $s'$, the symbol in the current cell is replaced with $\sigma'$.
    • In all three cases, the current state of the machine is set to $s'$.

This definition follows the informal description on the page. Personally, I learnt it differently. In my course, the transition function returned both a new symbol and a direction to move in, i.e. $T : S \times \Sigma \to S \times \Sigma \times \set {L, R}$. — Timwi (talk) 14:29, 20 December 2012 (UTC)
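For what it's worth, here is a minimal sketch (in Python, purely illustrative and not taken from any of the sources discussed on this page) of how one step under the quintuple definition above might be simulated. The function name step, the dict-based tape, and the sentinel objects L and R are all assumptions made for the example only.

    # Sketch: one step of the machine defined above.
    # The tape is modelled as a dict from cell index to symbol; unwritten cells
    # implicitly hold the blank symbol.
    L, R = object(), object()   # move markers, deliberately not members of the alphabet

    def step(T, state, tape, head, blank):
        # T maps (state, symbol) to (new_state, action), where action is either
        # a symbol to write, or one of the markers L, R.
        symbol = tape.get(head, blank)
        new_state, action = T[(state, symbol)]
        if action is L:
            head -= 1               # move one cell to the left
        elif action is R:
            head += 1               # move one cell to the right
        else:
            tape[head] = action     # overwrite the scanned symbol
        return new_state, head      # in all three cases the state is updated

Note that under this definition a single transition either writes or moves, never both, which is exactly the distinction Timwi draws above.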

In $\S 4$ of his 1973 book, Arto Salomaa defines it using a semi-Thue system allowing stationary motion of the head (what he termed overprinting). In his 2009 summary, Keijo Ruohonen defines it in $\S 6.1$ as a septuple in terms of his definition of an LBA (he also explicitly states that the definition we are operating with is that of a deterministic Turing machine). I think we should go with your definition because it is closest to John C. Martin's 2010 introduction to the field (Definition $9.1$). It is contemporary, ideal for an introduction, and simpler (being a quintuple). The only modification I would make is $\set {L, S, R}$ ($S$ for stationary) instead of $\set {L, R}$. Just my opinion :) --Jshflynn (talk) 18:26, 20 December 2012 (UTC)
Excellent presentation. I suggest what we want to do here is have two sections, "Informal definition" and "Formal definition". It really needs two subpages transcluded into this one, as per normal, but I can do the refactoring once the page is up according to what you think it ought to say. Quote your source(s) carefully, as there are several different treatments of this subject and there may be several different styles of presentation; we may want to include more than one, and it's a nice-to-have to be able to cite the source accurately and reliably. --prime mover (talk) 22:12, 20 December 2012 (UTC)
As for: "In my course, the transition function returned both a new symbol and a direction to move into, i.e. $T : S \times \Sigma \to S \times \Sigma \times \set {L, R}$" yes, that looks okay to me, the informal presentation is just that - informal and possibly inaccurate / incomplete. --prime mover (talk) 22:22, 20 December 2012 (UTC)

Cited definition according to published work

We need to be careful here.

Every definition on $\mathsf{Pr} \infty \mathsf{fWiki}$ (particularly one as detailed and specialised as this one) has to be supported by a published source work.

From what I glean from the comments, recent additions and changes to this page may have been based on an intuitive idea and what it says on Wikipedia.

Before the recent edit (which added $N$ into the definition of the codomain of $\delta$), what we had matched what is in Hopcroft and Ullman (at least I am assuming so; I have the 1st edition in front of me, and I'm assuming the 3rd edition hasn't changed basic stuff like that). Now what we have does not match Hopcroft and Ullman.

If we want to implement a different definition of a Turing machine, then we need to define it as a variant, which will be a prodigious amount of work.

My recommendation (which, if I had the authority, I would reclassify as an instruction) is to present the definition of the Turing machine exactly as it appears in Hopcroft and Ullman (or at least definably isomorphic to it), and to develop the theory based on that.

If it really really is necessary to bring in that $N$ parameter, then we find a hardcopy source which includes that $N$ parameter, and implement it in $\mathsf{Pr} \infty \mathsf{fWiki}$ as a design variant via a transcluded subpage.

(It is clear what $N$ is: a do-nothing instruction. But as such an instruction is not supported in H&U's treatment, it is apparently not necessary.) --prime mover (talk) 07:15, 16 February 2023 (UTC)

The problem is that in this field there are dozens of different, but equivalent, definitions of computation. For example, H&U allows a doubly-infinite tape, which is convenient when it is represented as a pair of stacks during emulation. But some variations only allow the tape to be infinite in one direction. It is easy to prove that they are equivalent, by interleaving the left and right halves of the tape in the infinite direction. But do we want to include that as a separate variant? What about other features, such as: bounding the number of states or symbols, allowing a neutral direction (what I put here), permitting multiple tapes or heads, disabling writing, disallowing input, limiting the number of accepting states, preventing transitions out of an accepting state? The Church-Turing Thesis implies that, as long as we don't do anything dumb like bounding states and symbols at the same time (in which case there are only finitely many distinct machines), we'll always end up with an equivalent system.
In this case, I'm fine with this change being reverted. It's a convenience; definitely not necessary. But having a system powerful enough to talk about the concepts we want to study is important. One major downfall of the H&U system is that spaces have no identity. We can't even talk about a tape being "right-infinite". In this case, we can bypass the problem by requiring that the first symbol is never $B$ after the first move, but it might not be so easy in other cases. --CircuitCraft (talk) 19:52, 16 February 2023 (UTC)
The decision to run with H&U was yours. If you can find a hard copy work with a better implementation, then you are invited to use that, if it saves effort. Of course, it will then be necessary to demonstrate that all the implementations of Turing machine thus analysed are equivalent. --prime mover (talk) 20:01, 16 February 2023 (UTC)
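As an aside on the interleaving argument mentioned above: one explicit choice of bijection (an illustration only, not taken from Hopcroft and Ullman or any other source cited here) sends cell $n$ of the two-way infinite tape to cell

$\map f n = \begin{cases} 2 n & : n \ge 0 \\ -2 n - 1 & : n < 0 \end{cases}$

of the one-way infinite tape, so the right half occupies the even-numbered cells and the left half the odd-numbered cells, and each move of the original head becomes a bounded number of moves of the simulating head.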