The Church-Turing-Deutsch Principle and the Church-Turing Thesis

Deutsch realized that this observation leads to a very interesting possibility: that of deducing the Church-Turing thesis from the laws of physics. This line of thought led Deutsch to propose a revision of the Church-Turing thesis, which we will call the Church-Turing-Deutsch Principle. To restate it in the terms used above, the CTD Principle says that every physical process can be simulated by a universal computing device.


Is it a Turing machine? Is it something else? Deutsch goes further, and asks what would count as a suitable universal computing device. Given any physical theory, we can ask whether or not the CTD Principle is true in that particular theory. That is, within the scope of that theory, is it possible to construct a universal computing device that can simulate an arbitrary physical system, also within that theory? Fredkin and Toffoli showed that their billiard-ball model of computation, built from idealized Newtonian collisions, could simulate conventional computers like those Turing had proposed; the reversible gate underlying that model is sketched below. In turn, it is widely believed that Turing machines are sufficient to simulate all of Newtonian physics.
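The billiard-ball model's power comes down to a single reversible primitive, the Fredkin (controlled-swap) gate, which the colliding balls implement. As a rough illustration, here is a minimal Python sketch of that gate and of how ordinary Boolean logic falls out of it; the helper names are mine, chosen for illustration, not Fredkin and Toffoli's notation.

    def fredkin(c, a, b):
        # Controlled swap: the control line c passes through unchanged;
        # the two data lines a and b are swapped exactly when c is 1.
        # The gate is its own inverse, so no information is ever erased.
        return (c, b, a) if c else (c, a, b)

    def and_gate(x, y):
        # Fixing the second data line to 0 makes the third output x AND y,
        # which is one standard way to recover irreversible logic.
        _, _, out = fredkin(x, y, 0)
        return out

    assert all(and_gate(x, y) == (x and y) for x in (0, 1) for y in (0, 1))

Because every output determines the inputs, such gates can in principle be realized by perfectly elastic collisions with no dissipation, which is what makes the model compatible with Newtonian mechanics.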

Some detailed investigations of this question have been carried out by Pour-El and Richards; with some caveats, my understanding is that their work shows that Turing machines can simulate all of Newtonian physics. I should say that my understanding of their work is rather superficial, and I have not worked through all the relevant results in detail. It seems relatively straightforward to construct a universal device along the same general lines, although I am not aware of any published work describing such a construction.
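To make "simulate Newtonian physics to a given accuracy" concrete, here is a minimal sketch, my own illustration rather than Pour-El and Richards' construction: a Turing-computable procedure can approximate a Newtonian trajectory as closely as desired simply by shrinking its time step.

    import math

    def oscillator(t_final, dt):
        # Semi-implicit Euler integration of x'' = -x (a unit harmonic
        # oscillator), starting from x = 1, v = 0; the exact answer is cos(t).
        x, v = 1.0, 0.0
        for _ in range(int(round(t_final / dt))):
            v -= x * dt
            x += v * dt
        return x

    # The error falls roughly in proportion to dt, so any requested
    # accuracy can be met by choosing dt small enough:
    for dt in (0.1, 0.01, 0.001):
        print(dt, abs(oscillator(1.0, dt) - math.cos(1.0)))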

Of course, Newtonian physics is wrong. It seems to me an extremely interesting question to determine whether or not the correct ultimate physical theory obeys the CTD Principle. Of course, we do have some good physical theories right now that will probably be incorporated into any hypothetical ultimate physical theory.

It is a natural question whether or not such theories obey the CTD Principle. For example, one might ask whether or not it is possible to develop a model computer within quantum electrodynamics that is capable of simulating any quantum electrodynamical process. One might ask a similar question with respect to the standard model of particle physics, which incorporates electromagnetism as well as the strong and weak nuclear forces. We can even ask whether the CTD Principle is satisfied in other, more exotic theories, such as string theory or spin networks, which some people believe may serve as a basis for an ultimate physical theory.


First, however, I want to digress to talk about the status of the CTD Principle as a physical principle, and what some consequences might be.

The Church-Turing-Deutsch Principle as a guide to the construction of new physical theories

For a physicist it is very natural to regard the CTD Principle as something that ought to be true in any reasonable physical theory.

After all, the CTD Principle amounts to the belief that a limited physical system, like the human brain, ought to be capable of simulating the behaviour of an arbitrary physical system, at least in principle. This is a very appealing proposition to most scientists, and certainly to most physicists, in part because it agrees with our prejudices (we would like the world to be comprehensible) and in part because all our experience in modeling real-world phenomena suggests that it is likely correct.

This suggests tipping the CTD Principle on its head and actually using it as a guideline for the construction of new physical theories. Let me give an example from another domain where this strategy has been used: the Principle of Conservation of Energy. This Principle was originally formulated in the context of Newtonian physics, yet it has survived intact through special and general relativity and quantum mechanics, and is likely to survive in future theories.

Although the exact mathematical expression of the Principle has changed from theory to theory, it remains a powerful general principle governing the construction of new theories. Indeed, if a theory is proposed that does not obey some sort of conservation of energy law, physicists are unlikely to take it terribly seriously unless truly compelling reasons are given. This is not an isolated example. Other principles such as the principles of relativity, gauge invariance and renormalizability, originally formulated in other contexts, have been extremely powerful guiding principles in the development of the standard model of physics, and continue to be powerful guiding principles in the construction of new physical theories.

Taking such principles as a basis for the construction of new physical theories is not, of course, entirely scientific.


Physicists are nonetheless keen to identify deep principles that they believe are likely to be correct, and which have real physical consequences. Guiding principles of this type are often rather vague, for if a principle is to be true in an as-yet-undiscovered physical theory, then of necessity it will not be possible to achieve full mathematical precision in the formulation of the principle.

This is reflected in the formulation of the CTD Principle that I have given, with many of the key terms, such as "simulation" and "universal computing device", not being precisely defined. Of course, this is not to say that we do not expect instantiations of such principles in specific physical theories to be precise. The Principle of Conservation of Energy has a perfectly precise technical meaning within quantum mechanics.

It has another precise technical meaning within general relativity. These technical meanings are quite different, yet both are recognizably instantiations of a more general, albeit rather more vague, Principle of Conservation of Energy. What would it buy us to regard the correctness or otherwise of the CTD Principle in a given physical theory as a criterion for whether that physical theory is correct or incorrect?

I should note, by the way, that closely related points of view have been put forward many times over the past few decades. Perhaps the most extreme is that of people like Ed Fredkin, who have proposed that the Universe is really a giant cellular automaton, and that the laws of physics should therefore be formulated as rules for such an automaton. This is a particularly extreme form, for it presupposes that we already know what the candidate universal computing device in the CTD Principle is; a minimal cellular automaton is sketched below.
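For concreteness, here is a minimal sketch of the kind of discrete dynamics Fredkin has in mind: an elementary cellular automaton in Python. Rule 110 is a standard choice for illustration, since it is known to be Turing-universal; the code itself is my own toy example, not Fredkin's model.

    def step(cells, rule=110):
        # One synchronous update: each cell's next state is the bit of
        # `rule` indexed by its (left, self, right) neighborhood, read as
        # a 3-bit number; the row wraps around at the edges.
        n = len(cells)
        return [(rule >> (4 * cells[i - 1] + 2 * cells[i] + cells[(i + 1) % n])) & 1
                for i in range(n)]

    row = [0] * 31
    row[15] = 1                      # start from a single live cell
    for _ in range(8):
        print("".join(".#"[c] for c in row))
        row = step(row)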

Less extreme views have been put forward by other people. If I recall correctly, both Wheeler and Landauer have made arguments along these lines.

But do we really mean, or need, exact solutions? Would it count if we could just find arbitrarily close approximations to the solutions? This would be ordinary convergence of the simulations to the true solutions.

Would this still be a useful version of the CTD Principle? It seems rather difficult to address convincingly, because while everyone understands what an exact solution is, making precise the notion of an approximate solution is far from trivial; one candidate formalization is sketched below. Along the same lines, one could also consider the existence of simulations with linear time and space overheads.
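One candidate way to make the question precise, in notation of my own choosing: say a trajectory x(t) is simulable in the approximate sense if a single Turing machine M, given a rational time t and an accuracy parameter k, always outputs a rational number within 2^{-k} of the true value,

    \forall k \in \mathbb{N},\ \forall t \in \mathbb{Q}_{\ge 0}:\qquad
    \bigl|\, M(k, t) - x(t) \,\bigr| \;\le\; 2^{-k}.

The subtlety is in the quantifiers: a machine whose outputs merely converge to x(t), without a computable error bound, satisfies a strictly weaker condition, which is one reason "approximate solution" is far from trivial to pin down.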

If one could prove a super-linear lower bound here, what would the implications be for our theories of quantum mechanics? It seems to me, however, that the homunculus problem of modern physics pervades the entire discussion, and that is: does physics describe reality, or does it describe our formulations of reality?

Turing thought his arguments I and II, and indeed "[a]ll arguments which can be given" for the thesis, are "fundamentally, appeals to intuition, and for this reason rather unsatisfactory mathematically." Much the same can be said about argument I. If axioms 1-5 are formulated in precise mathematical terms, then it is certainly provable from them that computation is bounded by Turing computability; this is probably what Gandy 20 meant when he said Turing's argument I proves a "theorem."

The axioms may be supported by informal arguments, but the whole edifice then falls short of mathematical proof. This is most apparent when the informal arguments offered for the axioms invoke limitations in the cognitive capacities of human computers, as we point out elsewhere. The axioms most certainly lie beyond the scope of mathematical demonstration if their truth depends on contingent human limitations.

Turing himself cheerfully appealed to cognitive limitations in the course of his analysis, saying, for example, "[j]ustification lies in the fact that the human memory is necessarily limited." The various historical arguments seem more than sufficient to establish CTT-O, but these arguments do indeed fall short of mathematical proof. We next address complexity-theoretic forms of the Church-Turing thesis, then turn to the question of whether CTT-A is justified in the context of physically realistic computations. It is striking that the Turing machine holds a central place not only in computability theory but also in complexity theory, where it is viewed as a universal model for complexity classes.

In complexity theory, the time complexities of any two general and reasonable models of computation are assumed to be polynomially related; a symbolic statement of this assumption is given below. But what counts as "reasonable"? Aharonov and Vazirani 1 gloss "reasonable" as "physically realizable in principle"; see also Bernstein and Vazirani.
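Stated symbolically (the notation is mine): writing T_M(n) for a model M's worst-case running time on inputs of length n, the assumption is that for any two reasonable models M1 and M2 there are fixed polynomials p and q bounding each running time in terms of the other,

    T_{M_1}(n) \;\le\; p\bigl(T_{M_2}(n)\bigr)
    \qquad\text{and}\qquad
    T_{M_2}(n) \;\le\; q\bigl(T_{M_1}(n)\bigr).

This is what makes complexity classes such as P model-independent: polynomials compose, so "polynomial time" means the same thing in every reasonable model.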

Quantum-computation researchers also use a variant of this thesis, expressed in terms of probabilistic Turing machines. Bernstein and Vazirani 3 said: "[C]omputational complexity theory rests upon a modern strengthening of [the Church-Turing] thesis, which asserts that any 'reasonable' model of computation can be efficiently simulated on a probabilistic Turing machine."



Aharonov and Vazirani 1 give a formulation of this assumption, naming it the "Extended Church-Turing thesis" (CTT-E), though it is not quite the same as Yao's earlier thesis of the same name, which did not refer to probabilistic Turing machines. As is well known in computer science, Peter Shor's quantum algorithm for prime factorization is a potential counterexample to CTT-E; the algorithm runs on a quantum computer in polynomial time and is much faster than the most efficient known "classical" algorithm for the task. The asymptotics are sketched below.
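The size of the gap, in standard figures that are not taken from this article: for an integer N, the best known classical method, the general number field sieve, runs in time superpolynomial in the input length log N (the classical bound is heuristic), whereas Shor's algorithm uses only polynomially many quantum operations,

    T_{\text{NFS}}(N) \;=\; \exp\!\Bigl( O\bigl( (\ln N)^{1/3} (\ln\ln N)^{2/3} \bigr) \Bigr),
    \qquad
    T_{\text{Shor}}(N) \;=\; O\bigl( (\log N)^{3} \bigr).

If a physically reasonable device achieves the right-hand bound, CTT-E fails with the probabilistic Turing machine as reference model, unless factoring turns out to be classically easy after all.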

But the counterexample is controversial. Some computer scientists think the quantum computer invoked is not a physically reasonable model of computation, while others think accommodating these results might require further modifications to complexity theory. The issue of whether every aspect of the physical world is Turing-computable was broached by several authors in the 1960s and 1970s, and the topic rose to prominence in the mid-1980s. In 1985, Stephen Wolfram formulated a thesis he described as "a physical form of the Church-Turing hypothesis," saying, "[U]niversal computers are as powerful in their computational capacities as any physically realizable system can be, so that they can simulate any physical system."

Every finite physical system can be simulated to any specified degree of accuracy by a universal Turing machine. Deutsch pointed out that if "simulated" is understood as "perfectly simulated," then the thesis is falsified by continuous classical systems, since such classical systems necessarily involve uncomputable real numbers. He went on to introduce the concept of a universal quantum computer, saying such a computer is "capable of perfectly simulating every finite, realizable physical system."
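The reasoning behind Deutsch's point is a counting argument, sketched here in my own notation: there are only countably many Turing machines, hence only countably many computable reals, while a continuous classical quantity ranges over an uncountable continuum,

    \left|\{\text{Turing machines}\}\right| = \aleph_0
    \;\Longrightarrow\;
    \left|\{\text{computable reals}\}\right| = \aleph_0
    \;<\; 2^{\aleph_0} = |\mathbb{R}|.

So almost every exact value of a continuous quantity is uncomputable and cannot be perfectly tracked by any Turing machine, though approximate simulation in the sense discussed earlier is untouched by this argument.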

We next formulate a strong version of the physical Church-Turing thesis we call the "total physical computability thesis." By "physical system" we mean any system whose behavior is in accordance with the actual laws of physics, including non-actual and idealized systems.

Total physical computability thesis (CTT-P): Every physical aspect of the behavior of any physical system can be calculated to any specified degree of accuracy by a universal Turing machine. In our "CTT-" nomenclature, we follow the Deutsch-Wolfram tradition throughout this article. Is CTT-P true? Not if physical systems include systems capable of producing unboundedly many digits of a random binary sequence; Church showed such sequences are uncomputable, as we discussed elsewhere.

Relativistic machines are another potential countermodel; first introduced by Pitowsky, 32 they will be examined in the section called "Relativistic Computation." There are also a number of theoretical countermodels to CTT-P arising from quantum mechanics. For example, in 1964 Komar 26 raised "the issue of the macroscopic distinguishability of quantum states," asserting there is no effective procedure "for determining whether two arbitrarily given physical states can be superposed to show interference effects." Other typical physical problems take the same form; Pitowsky gave as an example "Is the solar system stable?"

Cubitt et al. proved that one such problem, the spectral gap problem, is undecidable. The spectral gap, an important determinant of a material's properties, refers to the energy spectrum immediately above the ground-energy level of a quantum many-body system, assuming a well-defined least-energy level of the system exists; the system is said to be "gapless" if this spectrum is continuous and "gapped" if there is a well-defined next-least energy level. The spectral gap problem for a quantum many-body system is the problem of determining whether the system is gapped or gapless, given the finite matrices (at most three) describing the local interactions of the system.
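In symbols (the notation is mine, simplified from the setting of the Nature paper): if H_L is the Hamiltonian of the lattice of size L, with eigenvalues ordered so that \lambda_0(H_L) \le \lambda_1(H_L) \le \ldots, the spectral gap is

    \Delta(H_L) \;=\; \lambda_1(H_L) - \lambda_0(H_L),

and the system is gapped when \Delta(H_L) \ge \gamma for some fixed \gamma > 0 and all sufficiently large L, gapless when the spectrum above \lambda_0 becomes continuous in the thermodynamic limit. The theorem is that no algorithm, given the interaction matrices, decides which of the two cases holds.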

Cubitt et al.'s proof involves an infinite family of two-dimensional lattices of atoms. But they pointed out their result also applies to finite systems whose size increases, saying, "Not only can the lattice size at which the system switches from gapless to gapped be arbitrarily large, the threshold at which this transition occurs is uncomputable." There exists no effective method for extrapolating the system's future behavior from complete descriptions of its current and past states.

It is debatable whether any of these quantum models correspond to real-world quantum systems. Nevertheless, these results are certainly suggestive: CTT-P cannot be taken for granted, even in a finite quantum universe.

Summarizing the current situation with respect to CTT-P: although theoretical countermodels in which CTT-P is false have been described, there is at present, so far as we know, not a shred of evidence that CTT-P is false in the actual universe. Yet it would seem most premature to assert that CTT-P is true. Piccinini 31 has distinguished between two different types of physical versions of the Church-Turing thesis, both commonly found in the literature, describing them as "bold" and "modest" versions of the thesis, respectively.

The bold and modest versions are weaker than the "super-bold" version just discussed (CTT-P). Bold versions of the thesis state, roughly, that "any physical process can be simulated by some Turing machine"; we call this bold form CTDW, for Church, Turing, Deutsch, and Wolfram. CTDW and other bold forms are too weak to rule out the uncomputability scenarios described by Cubitt et al. The situation is similar in the case of the universal Turing machine itself. Nevertheless, bold forms such as CTDW are interesting empirical hypotheses in their own right, and the world might confute them.

For instance, CTDW fails in the wave-equation countermodel due to Pour-El and Richards, 34 where the mapping between the wave equation's "inputs" and "outputs" is not a Turing-computable real function, although, as noted earlier, the physicality of this countermodel can readily be challenged. The countermodel is sketched below.
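The shape of the Pour-El-Richards result, stated roughly: the three-dimensional wave equation

    \frac{\partial^2 u}{\partial t^2} \;=\; \nabla^2 u,
    \qquad u(x, 0) = f(x), \quad \frac{\partial u}{\partial t}(x, 0) = 0,

admits Turing-computable continuous initial data f for which the unique solution is not computable at time t = 1. The catch, and the usual ground for challenging the countermodel's physicality, is that such f, while continuous and computable, is not smooth; under mild extra regularity assumptions on the data, the solution is computable after all.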

We discuss some other potential countermodels later in the article, but turn first to what Piccinini termed "modest" versions of the thesis. Modest versions maintain in essence that every physical computing process is Turing-computable; for two detailed formulations, see Gandy 20 and Copeland. An illustration of the difference between modest versions on the one hand and CTT-P and CTDW on the other is given by the fact that the wave-equation example is not a countermodel to the modest thesis, assuming, as seems reasonable, that the physical dynamics described by the equation do not constitute a computing process. Here, we formulate a modest version of the physical Church-Turing thesis we call the "Physical Computation" thesis, then turn to the question of whether it is true.

Physical computation thesis (CTT-P-C): Every function computed by any physical computing system is Turing-computable. Is CTT-P-C true? As with the stronger physical computability theses, it seems too early to say. If, however, the door is open to a broadened sense of computation, in which physical computation is not necessarily effective in the sense of being bounded by Turing-computability, then CTT-P-C makes a substantive claim. There is, in fact, heated debate among computer scientists and philosophers about what counts as physical computation. Moreover, a number of attempts have sought to describe a broadened sense of computation in which computation is not bounded by Turing-computability; see, for example, Copeland.

Among the notional machines that compute in this broadened sense are "Zeno computers," which squeeze infinitely many computational steps into a finite span of time. Examples include accelerating machines, 7, 12 shrinking machines, and the intriguing relativistic computers described in the next section. These notional machines all constitute rather theoretical countermodels to CTT-P-C, so long as it is agreed that they compute in a broadened sense, but none has been shown to be physically realistic, although, as we explain, relativistic computers come close.

Relativistic machines operate in space-time structures with the property that the entire endless lifetime of one component of the machine is included in the finite chronological past of another component, called "the observer." Such machines are in accord with Einstein's general theory of relativity, hence the term "relativistic."

In this section we outline a relativistic machine RM consisting of a pair of communicating Turing machines, T_E and T_o, in relative motion. T_E is a universal machine, and T_o is the observer. RM is able to compute the halting function, in a broad sense of computation. Speaking of computation here seems appropriate, since RM consists of nothing but two communicating Turing machines. Here is how RM works. When the input (m, n), asking whether the m-th Turing machine in some enumeration of the Turing machines halts or not when started on input n, enters T_o, T_o first prints 0 (meaning "never halts") in its designated output cell and then transmits (m, n) to T_E.

T_E simulates the computation performed by the m-th Turing machine when started on input n and sends a signal back to T_o if and only if the simulation terminates. If T_o receives a signal from T_E, T_o deletes the 0 it previously wrote in its output cell and writes 1 in its place (meaning "halts"). After one hour, T_o's output cell shows 1 if the m-th Turing machine halts on input n and shows 0 if the m-th machine does not halt on n. A toy sketch of this protocol appears below.
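The protocol's logic can be sketched in ordinary code, with one loud caveat: no terrestrial computer has infinite time, so the unbounded simulation below is cut off by a `fuel` parameter that a genuine RM would not need. The names and structure here are mine, purely for illustration; this is not a halting-function computer.

    def rm_protocol(machine, fuel):
        # T_o's role: first write 0 ("never halts") in the output cell.
        output_cell = 0
        # T_E's role: simulate the given machine step by step. In a real
        # RM this loop runs for unboundedly many steps within the
        # observer's finite hour; `fuel` is a finite stand-in for that.
        for _ in range(fuel):
            if machine.step():      # simulation halted: T_E sends the signal
                output_cell = 1     # T_o overwrites the 0 with 1 ("halts")
                break
        return output_cell

    class Countdown:
        # Stand-in for "the m-th Turing machine on input n": halts after k steps.
        def __init__(self, k):
            self.k = k
        def step(self):
            self.k -= 1
            return self.k <= 0

    print(rm_protocol(Countdown(5), fuel=10**6))   # prints 1 ("halts")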

T_E, an ordinary computer, remains on Earth, while the observer T_o travels toward and enters a slowly rotating Kerr black hole. T_o approaches the outer event horizon, a bubble-like hypersurface surrounding the black hole. T_o's motion proceeds until, relative to a time t on T_o's clock, the entire span of T_E's computing is over. If any signal was emitted by T_E, the signal will have been received by T_o before time t.

So T_o will fall into the black hole with 1 in its output cell if T_E halted and 0 if T_E never halted. T_o thus survives the passage unscathed and goes on to use the computed value of the halting function in further computations.


We next reconsider CTT-A. The continuing expansion of the concept of an algorithm is akin to the extension of the concept of number from integers to signed integers to rational, real, and complex numbers. Even the concept of human computation underwent an expansion; before the 1930s, computation was conceived of in terms of total functions, and it was Kleene in 1938 who explicitly extended the conception to also cover partial functions.
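Kleene's extension matters because the most natural computable operations are only partial. A minimal illustration (my own example): unbounded search, his mu-operator, returns the least witness if one exists and simply never returns otherwise.

    def mu(predicate):
        # Kleene's mu operator: the least n with predicate(n) true.
        # Partial by design: if no such n exists, the loop never ends,
        # and the function is undefined at that input.
        n = 0
        while not predicate(n):
            n += 1
        return n

    print(mu(lambda n: n * n > 50))   # halts and prints 8
    # mu(lambda n: False)             # would loop forever: undefined here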

Gurevich argued in 2012 that formal methods cannot capture the algorithm concept in its full generality due to the concept's open-ended nature; at best, formal methods provide treatments of "strata of algorithms" that "have matured enough to support rigorous definitions." Perhaps not.

In 1982, Jon Doyle 18 suggested that equilibrating systems with discrete spectra, such as molecules and other quantum many-body systems, illustrate a concept of effectiveness that is broader than the classical concept, saying, "[E]quilibrating can be so easily, reproducibly, and mindlessly accomplished" that we may "take the operation of equilibrating as an effective one," even if "the functions computable in principle given Turing's operations and equilibrating include non-recursive functions." Over the years, there have been several departures from Turing's analysis, as the needs of computer science led to a broadening of the algorithm concept.

For example, Turing's fourth axiom, which bounds the number of parts of a system that can be changed simultaneously, became irrelevant when the algorithm concept broadened to cover parallel computations. The future computational landscape might conceivably include more extensive revisions of the concept, if, for example, physicists were to discover that hardware effective in Doyle's extended sense is a realistic possibility. If such hardware were to be developed—hardware in which operations are effective in the sense of being "easily, reproducibly, and mindlessly accomplished" but not bounded by Turing computability—then would the appropriate response by computer scientists be to free the algorithm concept from CTT-A?

Or should CTT-A remain as a constraint on algorithms, with instead two different species of computation being recognized, called, say, algorithmic computation and non-algorithmic computation? Not much rides on a word, but we note we prefer "effective computation" for computation that is bounded by Turing computability and "neo-effective computation" for computation that is effective in Doyle's sense and not bounded by Turing computability, with "neo" indicating a new concept related to an older one. The numerous examples of notional "hypercomputers" (see Copeland 9 for a review) prompt similar questions.

Interestingly, a study of the expanding literature about the concept of an infinite-time Turing machine, introduced by Joel Hamkins and Andy Lewis in 2000, shows that a number of computer scientists are prepared to describe the infinite-time machine as computing the halting function.



Perhaps this indicates the concept of computation is already in the process of bifurcating into "effective" and "neo-effective" computation. In the computational literature the term "Church-Turing thesis" is applied to a variety of different propositions usually not equivalent to the original thesis, CTT-O; some even go far beyond anything either Church or Turing wrote.

Several, but not all, are fundamental assumptions of computer science. Others, such as the various physical computability theses we have discussed, are important in the philosophy of computing and the philosophy of physics but are highly contentious; indeed, the label "Church-Turing thesis" should not mislead computer scientists, or anyone else, into thinking they are established fact, or even that Church or Turing endorsed them.

References

  • Aharonov, D. and Vazirani, U. Is quantum mechanics falsifiable? A computational perspective on the foundations of quantum mechanics. In Computability: Turing, Gödel, Church, and Beyond, B.J. Copeland, C. Posy, and O. Shagrir, Eds. MIT Press, Cambridge, MA, 2013.
  • Andréka, H., Németi, I., and Németi, P. General relativistic hypercomputing and foundation of mathematics. Natural Computing 8, 3 (Sept. 2009).
  • Bernstein, E. and Vazirani, U. Quantum complexity theory. SIAM Journal on Computing 26, 5 (1997).
  • Castelvecchi, D. Paradox at the heart of mathematics makes physics problem unanswerable. Nature 528 (Dec. 2015).
  • Church, A. An unsolvable problem of elementary number theory. American Journal of Mathematics 58, 2 (Apr. 1936).
  • Copeland, B.J. The broad conception of computation. American Behavioral Scientist 40, 6 (May 1997).
  • Copeland, B.J. Even Turing machines can compute uncomputable functions. In Unconventional Models of Computation, C. Calude, J. Casti, and M. Dinneen, Eds. Springer, Singapore, 1998.
  • Copeland, B.J. Narrow versus wide mechanism: Including a re-examination of Turing's views on the mind-machine issue. The Journal of Philosophy 97, 1 (Jan. 2000).
  • Copeland, B.J. Hypercomputation. Minds and Machines 12, 4 (Nov. 2002).
  • Copeland, B.J., Ed. The Essential Turing. Oxford University Press, Oxford, U.K., 2004.
  • Copeland, B.J. and Proudfoot, D. Alan Turing's forgotten ideas in computer science. Scientific American 280, 4 (Apr. 1999).
  • Copeland, B.J. and Shagrir, O. Do accelerating Turing machines compute the uncomputable? Minds and Machines 21, 2 (May 2011).
  • Cubitt, T., Pérez-García, D., and Wolf, M.M. Undecidability of the spectral gap. Nature 528 (Dec. 2015).
  • Davis, M. Why Gödel didn't have Church's thesis. Information and Control 54, 1-2 (July 1982), 3-24.
  • Dershowitz, N. and Gurevich, Y. A natural axiomatization of computability and proof of Church's thesis. Bulletin of Symbolic Logic 14, 3 (Sept. 2008).
  • Deutsch, D. Quantum theory, the Church-Turing principle and the universal quantum computer. Proceedings of the Royal Society of London A 400 (1985), 97-117.
  • Doyle, J. What is Church's thesis? An outline. Minds and Machines 12, 4 (Nov. 2002).
  • Eisert, J., Müller, M.P., and Gogolin, C. Quantum measurement occurrence is undecidable. Physical Review Letters 108 (2012).