[comp.lang.misc] Machine Language Typing

olson@juliet.ll.mit.edu ( Steve Olson) (05/03/91)

In article <STEPHEN.91May3051250@estragon.uchicago.edu> stephen@estragon.uchicago.edu (Stephen P Spackman) writes:
   I have to disagree. A good machine language (one with no Catch Fire &
   Burn instructions) for a machine without programmer-visible caching is
   both strongly and statically typed, simply by virtue of the fact that
   all consequences of a given machine state are defined. You are making
   the mistake of imagining that addresses and ints and so forth exist in
   object code; but they don't. Those are abstractions in your mind.
   Ultimately they are NOT represented in the machine model. Bits go
   here, bits go there, the memory holds bits and nothing else. The
   (only) data type is the bit....

If one puts an integer into a floating-point instruction, one might
get a deterministic result. But so what?  The result wouldn't be
meaningful.  I might as well argue the only real data type is the
voltage level.
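
A concrete sketch of the point, in C.  It assumes 32-bit ints and
IEEE 754 single-precision floats of the same width (an assumption of
the illustration, not a guarantee of the language or of every
machine).  The float the hardware "sees" is perfectly deterministic
and perfectly meaningless:

    /* Sketch only: assumes int and float are both 32 bits and that
       floats are IEEE 754 single precision. */
    #include <stdio.h>
    #include <string.h>

    int main(void)
    {
        int   i = 42;
        float f;

        /* hand the integer's bit pattern to floating-point code */
        memcpy(&f, &i, sizeof f);

        /* deterministic but meaningless: prints about 5.9e-44, a
           denormal that has nothing to do with the number 42 */
        printf("bits of %d read as a float: %g\n", i, f);
        return 0;
    }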

   I'm really not trying to be silly here. The fact that machines are
   strongly typed underlyingly is the thing that makes secure
   implementations possible. If something could happen to be in memory
   that wasn't a bit, no language layered above could ever have
   guaranteed behaviour... (which is why it's bad when the hardware fails
   - untrapped type violations then CAN occur.... :-).

   It's sensible to ask, then, how we can implement UNTYPED languages on
   top of this typed substrate. My guess is it's because memory is
   finite.... Which is to say, it's done with smoke & mirrors.

According to your definitions, a weakly typed language is impossible on
deterministic hardware.  Depending on how you set up your starting
definitions, there might be some narrow, literal truth to it, but it
doesn't seem like a terribly useful way to think about programming languages.

   ----------------------------------------------------------------------
   stephen p spackman         Center for Information and Language Studies
   systems analyst                                  University of Chicago
   ----------------------------------------------------------------------
--
-- Steve Olson
-- MIT Lincoln Laboratory
-- olson@juliet.ll.mit.edu
--

hrubin@pop.stat.purdue.edu (Herman Rubin) (05/04/91)

In article <OLSON.91May3124040@lear.juliet.ll.mit.edu>, olson@juliet.ll.mit.edu ( Steve Olson) writes:
> In article <STEPHEN.91May3051250@estragon.uchicago.edu> stephen@estragon.uchicago.edu (Stephen P Spackman) writes:
>    I have to disagree. A good machine language (one with no Catch Fire &
>    Burn instructions) for a machine without programmer-visible caching is
>    both strongly and statically typed, simply by virtue of the fact that
>    all consequences of a given machine state are defined. You are making
>    the mistake of imagining that addresses and ints and so forth exist in
>    object code; but they don't. Those are abstractions in your mind.
>    Ultimately they are NOT represented in the machine model. Bits go
>    here, bits go there, the memory holds bits and nothing else. The
>    (only) data type is the bit....
> 
> If one puts an integer into a floating-point instruction, one might
> get a deterministic result. But so what?  The result wouldn't be
> meaningful.  I might as well argue the only real data type is the
> voltage level.

It depends.  I deliberately do things like this, but most often using
integer operations on floats.  Of course the results are machine
dependent, but so what?  However, putting integers into floating
instructions can be useful if those integers are read by the machine
as the desired floating numbers.

Now suppose I do an integer operation on floats.  Is the result float
or integer?  Both are useful, and I expect any good mathematical programmer
to be able to use these profitably.
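
One concrete flavour of this sort of trick, as a small C sketch.  It
assumes IEEE 754 single-precision floats and 32-bit unsigned ints of
the same width (assumptions of the sketch, not claims from the post):
an ordinary integer add on the exponent field scales a normal,
positive float by a power of two without a floating-point multiply.

    /* Sketch only: assumes unsigned int and float are both 32 bits
       and that floats are IEEE 754 single precision. */
    #include <stdio.h>
    #include <string.h>

    /* scale a normal, positive float by 2^k using only integer arithmetic */
    static float scale_by_pow2(float x, int k)
    {
        unsigned int bits;

        memcpy(&bits, &x, sizeof bits);   /* view the float as raw bits */
        bits += (unsigned int)k << 23;    /* bump the 8-bit exponent field */
        memcpy(&x, &bits, sizeof x);      /* read the bits back as a float */
        return x;                         /* valid while the result stays normal */
    }

    int main(void)
    {
        printf("%g\n", scale_by_pow2(3.0f, 4));   /* prints 48 */
        return 0;
    }

Whether the word held in "bits" is a float or an integer at any given
moment is exactly Rubin's question; the hardware doesn't care, only
the instruction applied to it does.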
-- 
Herman Rubin, Dept. of Statistics, Purdue Univ., West Lafayette IN47907-1399
Phone: (317)494-6054
hrubin@l.cc.purdue.edu (Internet, bitnet)   {purdue,pur-ee}!l.cc!hrubin(UUCP)

stephen@pesto.uchicago.edu (Stephen P Spackman) (05/04/91)

In article <OLSON.91May3124040@lear.juliet.ll.mit.edu> olson@juliet.ll.mit.edu ( Steve Olson) writes:

|In article <STEPHEN.91May3051250@estragon.uchicago.edu> stephen@estragon.uchicago.edu (Stephen P Spackman) writes:
|   I have to disagree. A good machine language (one with no Catch Fire &
|   Burn instructions) for a machine without programmer-visible caching is
|   both strongly and statically typed, simply by virtue of the fact that
|   all consequences of a given machine state are defined. You are making
|   the mistake of imagining that addresses and ints and so forth exist in
|   object code; but they don't. Those are abstractions in your mind.
|   Ultimately they are NOT represented in the machine model. Bits go
|   here, bits go there, the memory holds bits and nothing else. The
|   (only) data type is the bit....
|
|If one puts an integer into a floating-point instruction, one might
|get a deterministic result. But so what?  The result wouldn't be
|meaningful.  I might as well argue the only real data type is the
|voltage level.

I don't believe so, because the machine language is defined in terms
of bits and not voltages. If we were looking at hardware bus
behaviours at 100MHz or more, I'd agree with you.

|   I'm really not trying to be silly here. The fact that machines are
|   strongly typed underlyingly is the thing that makes secure
|   implementations possible. If something could happen to be in memory
|   that wasn't a bit, no language layered above could ever have
|   guaranteed behaviour... (which is why it's bad when the hardware fails
|   - untrapped type violations then CAN occur.... :-).
|
|According to your definitions, a weakly typed language is impossible on
|deterministic hardware.  Depending on how you set up your starting
|definitions, there might be some narrow, literal truth to it, but it
|doesn't seem like a terribly useful way to think about programming languages.

Oh, no! What the language does NOT define is of the essence. Each
implementation is in some sense underlyingly typed, but that doesn't
reflect on the language at all. Compilers will be specified in terms
of language semantics, not machine-code translations....

But I will admit to the philosophical stance that "untyped" data do
not exist - it may be convenient to pretend otherwise, but ultimately
the only reason a function should not validate its arguments is if it
is intended to process non-data - if it's a GIGO engine. And I just
don't see how this can possibly be relevant to any human endeavour.
----------------------------------------------------------------------
stephen p spackman         Center for Information and Language Studies
systems analyst                                  University of Chicago
----------------------------------------------------------------------