[comp.arch] More math...

jps@cat.cmu.edu (James Salsman) (07/15/89)

Most complexity theory that evaluates the difference
between regular algorithms and NP-complete algorithms
seems to be based on Turing machines.

Turing machines are the paradigm of serial machines!

How can all these classifications be correct for
multiprocessors and other parallel machines?

:James
-- 

:James P. Salsman (jps@CAT.CMU.EDU)

mount@hpcupt1.HP.COM (John Mount) (07/16/89)

/ hpcupt1:comp.arch / jps@cat.cmu.edu (James Salsman) /  8:10 pm  Jul 14, 1989 /
>Most complexity theory that evaluates the difference
>between regular algorithms and NP-complete algorithms
>seems to be based on Turing machines.

Yup.

>Turing machines are the paradigm of serial machines!

Yup.

>How can all these classifications be correct for
>multiprocessors and other parallel machines?

Because unless your parallel processor has an *infinite* number of processors
it is just a big serial machine (by any other name :-).  And if you do have
an infinite number of processors then NP-complete problems can be solved in
polynomial time on such a beast (so NP-completeness becomes a little less
interesting in this case).
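
A minimal sketch of the "enough processors" idea (illustration only, not
from the post above; names are made up, and subset sum stands in for the
NP-complete problem): each of the 2**n candidate subsets can be checked
independently in polynomial time, so a machine with one processor per
candidate finishes in polynomial wall-clock time, while a serial machine
has to grind through the candidates one by one.

# Brute-force subset sum.  check_candidate() is the polynomial-time work
# one hypothetical "processor" would do; subset_sum() is the serial
# machine looping over all 2**n candidates itself.
from itertools import compress, product

def check_candidate(numbers, target, mask):
    # O(n) work for a single candidate subset (one "processor")
    return sum(compress(numbers, mask)) == target

def subset_sum(numbers, target):
    n = len(numbers)
    for mask in product((0, 1), repeat=n):       # 2**n candidates
        if check_candidate(numbers, target, mask):
            return mask
    return None

print(subset_sum([3, 34, 4, 12, 5, 2], 9))       # -> (0, 0, 1, 0, 1, 0)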

John Mount

stuart@bms-at.UUCP (Stuart Gathman) (07/17/89)

In article <-286679999@hpcupt1.HP.COM>, mount@hpcupt1.HP.COM (John Mount) writes:
> Because unless your parallel processor has an *infinite* number of processors
> it is just a big serial machine (by any other name :-).  And if you do have
> an infinite number of processors then NP-complete problems can be solved in
> polynomial time on such a beast (so NP-completeness becomes a little less
> interesting in this case).

In a sci-fi story by Asimov, they built such a beast in "hyper-space".  After
billions of years (polynomial time), it finished its computations and
said, "Let there be light . . .".
-- 
Stuart D. Gathman	<stuart@bms-at.uucp>
			<..!{vrdxhq|daitc}!bms-at!stuart>

drago@bnr-rsc.UUCP (Drago Goricanec) (07/18/89)

In article <5533@pt.cs.cmu.edu> jps@cat.cmu.edu (James Salsman) writes:
>
>Most complexity theory that evaluates the difference
>between regular algorithms and NP-complete algorithms
>seems to be based on Turing machines.
>
>Turing machines are the paradigm of serial machines!
>
>How can all these classifications be correct for
>multiprocessors and other parallel machines?

If I remember correctly from my Formal Languages class, a parallel machine
can be simulated by a multi-tape Turing machine, which was shown to be
equivalent to a regular single-tape Turing machine.  I believe the assumption
is that the number of tapes must be finite, since the single-tape simulation
packs all the tapes into one enlarged tape alphabet, which must itself stay
finite.  This would imply that the NP-completeness classifications also hold
for parallel machines, given that the number of processors is finite.

P.S.  Please don't flame if I got anything wrong, just point me
in the right direction.
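
A rough way to see the simulation argument above without the tape
machinery (my own sketch, with made-up names): a single serial machine
can interleave the steps of a fixed number P of parallel workers,
paying only about a factor-of-P slowdown.  A constant-factor (or even
polynomial) slowdown cannot move a problem between complexity classes.

# Round-robin simulation of P parallel workers on one serial machine.
# If each worker takes T steps, the simulation takes about P * T steps,
# so for fixed P the slowdown is only a constant factor.
def worker(wid, nsteps):
    # one "processor", written as a generator that yields after each step
    for t in range(nsteps):
        yield (wid, t)                    # ... one unit of work ...

def serial_simulation(workers):
    total = 0
    while workers:
        running = []
        for w in workers:
            try:
                next(w)                   # advance this worker one step
                total += 1
                running.append(w)
            except StopIteration:
                pass                      # this worker has finished
        workers = running
    return total

print(serial_simulation([worker(i, 10) for i in range(4)]))   # -> 40
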
-- 
Drago Goricanec                              (...!bnr-vpa!bnr-rsc!drago)
Bell-Northern Research Ltd., Ottawa, Ontario              (613) 763-8957

preston@titan.rice.edu (Preston Briggs) (07/18/89)

In article <898@bnr-rsc.UUCP> drago@bnr-rsc.UUCP (Drago Goricanec) writes:
>In article <5533@pt.cs.cmu.edu> jps@cat.cmu.edu (James Salsman) writes:

>>How can all these classifications be correct for
>>multiprocessors and other parallel machines?
>
>If I remember correctly from my Formal Languages class, a parallel machine
>can be simulated by a multi-tape turing machine, which was shown to be
>equivalent to a regular turing machine.  I believe the assumption is

An informal argument is simpler.
Imagine you have a program solving some NP-complete problem,
say finding a minimal coloring for a graph.
It will take time roughly proportional
to 2^N (where N is the number of nodes in the graph) to run.
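
For concreteness, a brute-force minimal coloring might look like the
sketch below (an illustration of mine, not Preston's program; the exact
exponent depends on the method, but the running time clearly blows up
exponentially in the number of nodes).

# Brute force: try k = 1, 2, ... colors, and for each k try every one
# of the k**n assignments of colors to nodes.  Exponential in n.
from itertools import product

def is_proper(coloring, edges):
    # proper means no edge joins two nodes of the same color
    return all(coloring[u] != coloring[v] for u, v in edges)

def min_coloring(n, edges):
    for k in range(1, n + 1):
        for coloring in product(range(k), repeat=n):   # k**n candidates
            if is_proper(coloring, edges):
                return k, coloring

# a 4-cycle needs 2 colors
print(min_coloring(4, [(0, 1), (1, 2), (2, 3), (3, 0)]))   # -> (2, (0, 1, 0, 1))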

If you have a 1000-processor parallel machine
and the program maps onto it with no overhead, you can run
the program 1000 times faster.
However, the running time is still proportional to 2^N.

The point of exponential time is that we can overwhelm
{\em any\/} increase in machine speed by increasing the
problem size by some tiny amount.
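
To put numbers on that (a made-up illustration, not Preston's figures):
since 2^10 = 1024, adding just 10 nodes eats the entire factor-of-1000
speedup, no matter how large N already is.

# A 1000-way speedup on a 2**N algorithm is cancelled by ~10 extra nodes.
for n in (30, 40, 50):
    serial   = 2 ** n                 # one processor, N nodes
    parallel = 2 ** (n + 10) / 1000   # 1000 processors, N + 10 nodes
    print(n, parallel / serial)       # ratio is ~1.024 every time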

Preston