[comp.ai] Ignorance about the ignorant assumption

ian@shire (Ian Parberry) (10/05/88)

In article <7202@aw.sei.cmu.edu> firth@bd.sei.cmu.edu (Robert Firth) writes:

>To elaborate:  I can build a box, whose main constituents are a supply
>of photons and a half-silvered mirror, that, when triggered, will emit
>at random either the value "0" or the value "1".  This can be thought
>of as a mapping
>
>	{0,1} => 0|1
>
>where I introduce "|" to designate the operator that arbitrarily selects
>one of its operands.  The obvious generalisation of this - the function
>that selects an arbitrary member of an input set - is surely not unfamiliar.
>
>Nobody has denied that a Turing machine can't do this.  The assertion that
>a physical system can do it rests on the quantum theory; in particular on
>the proposition that the indeterminacy this theory ascribes to the
>physical world is irreducible.  Since every attempt to build an alternative
>deterministic theory has foundered, and no prediction of the quantum theory
>has yet been falsified, this rests on pretty strong ground.
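Firth's "|" operator is easy enough to mimic in software, with the crucial caveat that a pseudorandom generator stands in for the photon box (the function name below is my own invention, not Firth's notation):

```python
# A toy version of the "|" operator: select an arbitrary member of a
# finite non-empty set.  Note that random.choice draws from a
# deterministic pseudorandom generator, so unlike the photon box this
# IS something a Turing machine can do -- which is exactly the
# distinction at issue.
import random

def select(s):
    """Return one element of the non-empty set s, 'arbitrarily'."""
    return random.choice(sorted(s))

bit = select({0, 1})    # the mapping {0,1} => 0|1 from above
```

The simulation is behaviourally indistinguishable from the box for most practical purposes, but it is not the irreducible indeterminacy the quantum argument relies on.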

Have I missed something here?  Theoretical Computer Scientists have
been stepping beyond the bounds of the Church-Turing thesis for years.
The obvious question, first asked a long time ago (I believe Michael
Rabin was amongst the first to ask it), is whether the kind of
randomness described above helps computation.

An obvious answer is that it sometimes helps speed things up.
For example, there are several polynomial time probabilistic primality testing
algorithms, but no deterministic one is known (although it must be admitted
that one exists if the extended Riemann Hypothesis is true).
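The Miller-Rabin test is the canonical example of such a polynomial-time probabilistic primality test.  A minimal sketch (names and structure are my own; the algorithm is due to Miller and Rabin):

```python
# Miller-Rabin probabilistic primality test.  Polynomial time; a
# composite n survives one round with probability at most 1/4, so the
# error probability after `rounds` rounds is at most 4**-rounds.
import random

def is_probable_prime(n, rounds=20):
    """False if n is certainly composite, True if n is probably prime."""
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13):
        if n % p == 0:
            return n == p
    # Write n - 1 as d * 2**s with d odd.
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)   # the coin flips enter here
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False                 # a witnesses that n is composite
    return True
```

A "False" answer is a proof of compositeness; a "True" answer is only probably correct, which is precisely what distinguishes this from a deterministic test.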

The Church-Turing thesis is not and never has been a sacred cow amongst
Theoretical Computer Scientists.  Most view it as a handy rule of thumb,
and nothing else.  It's not hard to invent machines which violate the
Church-Turing thesis.  The hard part is developing a non-trivial,
entertaining, elegant and useful theory of computation around them.
My favourite work of this kind is on non-uniform circuit complexity.
Curiously, many lower bounds proved to date hold for non-uniform
(read: non-Church-Turing-thesis) circuits, and have matching uniform
(read: Church-Turing-thesis) upper bounds.

Most of the postings I've seen to date have been from non-TCS people.
However, since the Church-Turing thesis is a part of Theoretical
Computer Science, it is worth finding out what the TCS'ers have had
to say about it.

For a ton of reading, look for articles that mention the key words
probabilistic algorithm, RP, BPP, RNC in the proceedings from the IEEE
Symposium on Foundations of Computer Science and the ACM Symposium on the
Theory of Computing for the last decade.  For more polished but less
up-to-date material, consult theory journals such as SIAM J. Computing,
Journal of the ACM, Theoretical Computer Science, Journal of Computer
and System Sciences, Journal of Algorithms, Information Processing Letters.

Of course, I'm not saying that Theoretical Computer Scientists have
all of the answers.  But they do seem to have made a good try
at addressing the obvious questions.
-------------------------------------------------------------------------------
			Ian Parberry
"The bureaucracy is expanding to meet the needs of an expanding bureaucracy"
  ian@psuvax1.cs.psu.edu  ian@psuvax1.BITNET  ian@psuvax1.UUCP  (814) 863-3600
 Dept of Comp Sci, 333 Whitmore Lab, Penn State Univ, University Park, Pa 16802