[net.lang] Turing the first?

arnold@ucsfcgl.UUCP (Ken Arnold%CGL) (01/28/85)

<We don't need no steenking bug line!>

(I'm not sure this is the right news group, but I can't think of a
better.  I'm sure at least seven people will think of at least twelve
better ones, though.)

So that you don't have to wait until the end to decide whether you
care, I am herewith nominating Alan Turing for "World's First Hacker".
No, no, NO, I am NOT re-opening the question of whether "hacker" is a
good or bad term.  I just think that (either way) he is the first
person who fits this term in computer programming.

I have been reading the latest biography of Alan Turing ("Alan Turing:
The Enigma"; Andrew Hodges; Simon & Schuster).  I came across the
following section in the description of Turing's creation of ACE, one
of the first digital computers, and (as you shall see) the first in one
rather unusual aspect.

	Alan set out a way in which conditional branching could be done
	without needing the logical control to hold more than one
	'address' at a time.  It was not the best technical solution,
	but it had the merit of a brutal simplicity.  Suppose that it
	were wished to execute instruction 50 if some digit D were 1,
	and to execute instruction 33 if it were 0.  His idea was to
	'pretend that the instructions were really numbers and
	calculate D x (Instruction 50) + (1 - D) x (Instruction 33)'
	The result of this calculation would be an instruction which
	would have the effect that was required.  The 'IF' was effected
	not by hardware, but by programming.  It was a device which had
	led him to mix up data ... with instructions.  This in itself
	was of great significance, for he had allowed himself to
	*modify the stored program* [original emphasis].
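
To make this concrete (the sketch below is mine, in a modern language,
and is not anything from the book or the ACE report): if instructions
are just numbers, the 'IF' reduces to a little arithmetic.

# Toy sketch of Turing's branch-by-arithmetic trick.  The "instruction
# words" here are made-up numbers standing in for a pretend machine.
INSTR_50 = 5001          # stands for "instruction 50"
INSTR_33 = 3301          # stands for "instruction 33"

def select_instruction(d):
    """d is the tested digit, 0 or 1.  No hardware IF is involved."""
    return d * INSTR_50 + (1 - d) * INSTR_33

print(select_instruction(1))     # -> 5001, the D == 1 case
print(select_instruction(0))     # -> 3301, the D == 0 case

The machine then executes whatever number falls out of the
calculation; no branching hardware is consulted at all.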

An explanation follows of von Neumann's machine, which allowed program
modification only to step through a list of numbers.

	The Turing approach was very different.  Commenting on this
	feature of modifying instructions, he wrote in the report:
	'This gives the machine the possibility of constructing its own
	orders ... This can be very powerful.'

	... He dwelt in particular upon the question of doing
	arithmetic in floating-point form, and showed how the mere
	addition of two floating-point numbers required a whole
	instruction table [program].  He wrote some tables of this
	kind.  MULTIP, for instance, would have the effect of
	multiplying two numbers which had been encoded and stored in
	floating-point form, encoding and storing the result.  His
	tables made use of the 'very powerful' feature of the machine,
	for he had it assemble bits of the necessary instructions for
	itself, and then execute them.
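
Here is the whole "very powerful" feature in miniature.  Again, this
is my own toy machine, nothing like the real ACE order code; it just
shows a program doing arithmetic on an instruction word, storing it
back into program memory, and then executing the instruction it has
built for itself.

# Instruction words are opcode*1000 + operand.
def run(mem):
    acc, pc = 0, 0
    while True:
        op, arg = divmod(mem[pc], 1000)
        pc += 1
        if op == 0:          # HALT
            return acc
        if op == 1:          # LOAD immediate
            acc = arg
        elif op == 2:        # ADD immediate
            acc += arg
        elif op == 3:        # STORE accumulator into cell `arg`,
            mem[arg] = acc   #   i.e. rewrite part of the program
        elif op == 4:        # PRINT accumulator
            print(acc)

program = [
    1100,  # 0: LOAD 100
    2942,  # 1: ADD 942    -> acc = 1042, which *is* the word "LOAD 42"
    3003,  # 2: STORE -> 3    write the freshly built instruction at cell 3
    0,     # 3: placeholder   overwritten before execution ever reaches it
    4000,  # 4: PRINT         prints 42, loaded by the constructed instruction
    0,     # 5: HALT
]

run(program)

Running it prints 42, a value loaded by an instruction which existed
nowhere in the program as written.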

Turing, it would seem, invented self-modifying code, the hacker's best
friend.  Now, admittedly he was working on a mercury loop memory storage
system with all sorts of memory limitations we wouldn't wish on a
toaster oven, and he surely needed this kind of flexibility, but this
seems to be the first creation of a tool no self-respecting hacker
would be without.  In the history of computing, can anyone think of an
earlier person whose creations make them proper holders of this label?

By the bye, I have been generally quite impressed with this book,
although its detail is quite overwhelming at times.  It's been
certainly worth my time; it might be worth yours.
-- 

		Ken Arnold
=================================================================
Of COURSE we can implement your algorithm.  We've got this Turing
machine emulator...

ags@pucc-i (Dave Seaman) (01/30/85)

Turing was not the first to think of self-modifying code.  Charles Babbage,
the inventor of the Analytical Engine, wrote about this concept in the
1840s.  His term for it was "the Engine eating its own tail."

Babbage's colleague Ada Augusta, the Countess of Lovelace (for whom the
Ada programming language is named) even anticipated the possibility of
machine translation of symbolic programs (i.e. compilers).
-- 
Dave Seaman			 ..!pur-ee!pucc-i:ags

arnold@ucsfcgl.UUCP (Ken Arnold%CGL) (02/03/85)

In article <870@pucc-i> ags@pucc-i (Dave Seaman) writes:
>Turing was not the first to think of self-modifying code.  Charles Babbage,
>the inventor of the Analytical Engine, wrote about this concept in the
>1840s.  His term for it was "the Engine eating its own tail."
>
>Babbage's colleague Ada Augusta, the Countess of Lovelace (for whom the
>Ada programming language is named) even anticipated the possibility of
>machine translation of symbolic programs (i.e. compilers).
>-- 
>Dave Seaman			 ..!pur-ee!pucc-i:ags

I admit to not having read Babbage's original paper on this, but my
understanding was that he used the phrase "eating its own tail" to
refer to conditional branches based on its own calculated values.  I
was talking about code which modified itself.  Below is a quote from a
1950's history of computing (I don't have the reference here at home,
but will be happy to provide it upon request) which describes Babbage's
concept:

	Babbage clearly understood the restrictions imposed by the
	inability of a machine to make decisions for itself.  He was
	able to take the next step and to suggest how to endow a
	machine with the minimum amount of "intelligence" which it
	needs.  As he expressed it, he made the machine "bite its own
	tail". ...

	If a machine is to perform the functions of a human computer
	[used here in the old sense, i.e., "one who computes" --KA], it
	must possess --

	(a) An arithmetic unit ...
	(b) A memory ...
	(c) A built-in power of judgement, which will enable the machine
	    to choose, according to prescribed criteria, the course
	    which the computation has to take.
	(d) An input-output mechanism ...

This kind of decision is solely conditional branching; it is a choosing
among alternatives.  Turing's concept was of the creation and
modification of the programs themselves, and thus it is
"self-modifying", as distinct from Babbage's "self-controlling".
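
To put the distinction in code (my own gloss, in a modern language,
not anything from Babbage or Turing):

# "Self-controlling" (Babbage): the machine only *chooses* between
# alternatives that were written down in advance.
def babbage_style(d):
    if d == 1:
        return "take the instruction-50 path"
    return "take the instruction-33 path"

# "Self-modifying" (Turing): the machine *computes* the instruction as
# a number and then executes whatever number it has built.
DISPATCH = {50: lambda: "ran instruction 50",
            33: lambda: "ran instruction 33"}

def turing_style(d):
    built = d * 50 + (1 - d) * 33   # the instruction is computed, not chosen
    return DISPATCH[built]()

print(babbage_style(1), "/", turing_style(1))
print(babbage_style(0), "/", turing_style(0))

Both produce the same choices, but only the second ever treats an
instruction as the result of a calculation.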
-- 

		Ken Arnold
=================================================================
Of COURSE we can implement your algorithm.  We've got this Turing
machine emulator...

ags@pucc-i (Dave Seaman) (02/07/85)

>  I admit to not having read Babbage's original paper on this, but my
>  understanding was that he used the phrase "eating its own tail" to
>  refer to conditional branches based on its own calculated values.  I
>  was talking about code which modified itself.  

According to Lady Lovelace, the Analytical Engine could store two kinds
of numbers:  those which represent "quantities" and those which represent
"operations."  As she explained,

	"...one main reason why the separate nature of the science
	of operations has been little felt, and in general little
	dwelt on, is the SHIFTING meaning of many of the symbols used
	in mathematical notation.  First the symbols of OPERATION are
	frequently also the symbols of the RESULTS of operations.
	We may say that these symbols are apt to have both a
	RETROSPECTIVE and a PROSPECTIVE signification.

	"...whenever numbers meaning OPERATIONS and not QUANTITIES
	(such as the indices of powers) are inscribed on any column
	or set of columns, those columns immediately act in a wholly
	separate and independent manner, becoming connected with the
	OPERATING MECHANISM exclusively, and re-acting upon this.
	They never come into combination with numbers upon any other
	columns meaning QUANTITIES; though, of course, if there are
	numbers meaning operations upon n columns, these may COMBINE
	AMONGST EACH OTHER, and will often be required to do so, just 
	as numbers meaning QUANTITIES combine with each other in any 
	variety."  [All emphasis in the original]

The single example which Ada Lovelace chose to illustrate operations,
indices of powers, is not what we would call self-modifying code
today.  The description is sufficiently general, however, to include
the concept of calculating arbitrary "operations" and then carrying
them out.  It appears that Ada may also be the inventor of strong
typing!
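
If one takes the "strong typing" reading literally (my gloss, not
Lovelace's notation), the rule she describes is just that the two
kinds of number may each combine amongst themselves but never with
each other:

from dataclasses import dataclass

@dataclass
class Quantity:                       # a number meaning a QUANTITY
    value: float
    def __add__(self, other):
        if not isinstance(other, Quantity):
            raise TypeError("quantities combine only with quantities")
        return Quantity(self.value + other.value)

@dataclass
class Operation:                      # a number meaning an OPERATION
    code: int
    def __add__(self, other):
        if not isinstance(other, Operation):
            raise TypeError("operations combine only amongst each other")
        return Operation(self.code + other.code)

Quantity(2.0) + Quantity(3.0)         # fine
Operation(50) + Operation(33)         # fine
# Quantity(2.0) + Operation(50)       # TypeError -- the columns never mix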
-- 
Dave Seaman			 ..!pur-ee!pucc-i:ags

arnold@ucsfcgl.UUCP (Ken Arnold%CGL) (02/09/85)

I admit from your quote that the concept of using instructions as
numbers was INHERENT in Babbage/Lovelace's thoughts, but I am not
convinced that they SAW this.  In fact, the idea is inherent in von
Neumann's early work, and even in the ENIAC, since it really is inherent
in the idea of machine instructions per se.  But, to my knowledge, no
one before Turing saw this at all, except in the limited sense of
instructions which looped through memory.

I consider this a major distinction; patents are often issued on things
which are inherent in current technology but which nobody happened to
see before.  We generally credit the individual who saw what was unseen
with the invention, not the people who created the potential but did
not see it.  I still haven't seen anything in Lovelace or Babbage's
work that indicates they realized that since the instructions were
numbers they could be created by computation as numbers.  Turing's ACE
did conditional branching by evaluating an arithmetic expression
dependent upon the value being tested.  The expression generated an
appropriate 'branch' instruction, which was then executed.

>	"...whenever numbers meaning OPERATIONS and not QUANTITIES
>	(such as the indices of powers) are inscribed on any column
>	or set of columns, those columns immediately act in a wholly
>	separate and independent manner, becoming connected with the
>	OPERATING MECHANISM exclusively, and re-acting upon this.
>	They never come into combination with numbers upon any other
>	columns meaning QUANTITIES; though, of course, if there are
>	numbers meaning operations upon n columns, these may COMBINE
>	AMONGST EACH OTHER, and will often be required to do so, just 
>	as numbers meaning QUANTITIES combine with each other in any 
>	variety."  [All emphasis in the original]

She points out that there is no mix between instructions and data, that
they are kept separate.  I took her statement about instructions
combining to mean that they are executed in concert.
-- 

		Ken Arnold
=================================================================
Of COURSE we can implement your algorithm.  We've got this Turing
machine emulator...