[net.arch] Range-checking isn't an architecture issue

bprice@bmcg.UUCP (Bill Price) (09/28/85)

References: <796@kuling.UUCP> <2580002@csd2.UUCP> <191@graffiti.UUCP>
Reply-To: bprice@bmcg.UUCP (Bill Price)
Organization: Burroughs Advanced Systems Group, San Diego

In article <191@graffiti.UUCP> peter@graffiti.UUCP (Peter da Silva) writes:
 Quoting a prior message:
>> 	"...leaving range checks out is rather like practising sailing on
>> shore with life belts and then leaving them on shore come the moment.."
>> 	Knuth???

No, not Knuth.  'Twas C. A. R. (Tony) Hoare who said it.

>> I agree with you to a point. For low-risk code leave them out, but for my
>> money I would prefer to see the code in for systems like nuclear plants,
>> MX missiles, etc.
>What should the code do when a range-check occurs? Print out an error message
>on ticker-tape & hang? Do nothing? 

But it's only the low-risk code that can afford to have range checks.  For
the high-risk code, failure of a range check is just another catastrophic
way for the program to fail.
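For concreteness, a compiler-inserted range check amounts to something like
the following sketch (Python used purely as illustration; the class and its
names are my own, not anybody's actual runtime).  The check turns a wild
out-of-range store into a trap, but what the trap handler should then do is
exactly the open question for high-risk code:

```python
class CheckedArray:
    """Sketch of a Pascal-style array a[lo..hi] with a compiler-inserted
    bounds check on every store.  The check converts a silent wild store
    into a trap; in high-risk code, the trap still has no good recovery."""
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi
        self.data = [0] * (hi - lo + 1)

    def store(self, i, v):
        if not (self.lo <= i <= self.hi):      # the generated range check
            raise IndexError(f'subscript {i} not in {self.lo}..{self.hi}')
        self.data[i - self.lo] = v

a = CheckedArray(1, 10)
a.store(1, 42)        # in range: proceeds normally
try:
    a.store(11, 0)    # out of range: trapped -- but then what?
except IndexError as e:
    print('range check failed:', e)
```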

The solution that is obvious to me (but not to everybody) puts a little
extra programmer work up front, in exchange for a potentially large saving
later on.  My solution requires the compiler to
1.  Implement the "Economic Range Checks ..." feature, per Welsh, enhanced
    to consider the post-conditions of all statements and clauses. 
2.  Accept an extended set of invariants, in addition to the type
    invariants which give Pascal so much power.  These invariants could
    be expressed as 'assert' statements, but the compiler doesn't just
    chicken out and generate code.  The assertion gets validated, as
    not in conflict with the known state of the computation; and then
    is treated as part of that state.
3.  Each assertion that is relied on to suppress a check causes a report
    to be written:  this report is a proposition to be proven by the
    programmer.  In a well-run organization, or to a self-respecting
    programmer, this proposition must be proven and becomes part of
    the programmer's certification of the program.
4.  Provide a low-risk vs high-risk option:  low-risk generates whatever
    range checks remain after (1) and (2) above, flagging them to the 
    programmer.  The high-risk option activates (3).
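The flavor of points (1)-(3) can be sketched in a few lines.  This is my own
minimal illustration, not Welsh's actual algorithm: a toy "compiler" tracks a
proven interval per variable, suppresses any subscript check whose interval
fits the declared bounds, and reports each relied-upon assertion as a
proposition for the programmer to prove (the tuple mini-language and the
name `analyze` are invented for the sketch):

```python
class Interval:
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi
    def within(self, other):
        return other.lo <= self.lo and self.hi <= other.hi

def analyze(stmts, bounds):
    """stmts: ('assert', var, lo, hi) or ('index', var).
    bounds: the declared array index range.
    Returns (statement number, decision) pairs."""
    known = {}          # var -> Interval taken as part of the known state
    report = []
    for n, stmt in enumerate(stmts):
        if stmt[0] == 'assert':
            _, var, lo, hi = stmt
            known[var] = Interval(lo, hi)
            # point (3): the assertion is a proposition to be proven
            report.append((n, f'PROVE: {lo} <= {var} <= {hi}'))
        elif stmt[0] == 'index':
            _, var = stmt
            iv = known.get(var)
            if iv is not None and iv.within(bounds):
                report.append((n, f'check on {var} suppressed'))
            else:
                report.append((n, f'runtime check emitted for {var}'))
    return report

# Usage: index a[1..10] first without, then with, an assertion on i.
prog = [('index', 'i'),
        ('assert', 'i', 1, 10),
        ('index', 'i')]
for n, msg in analyze(prog, Interval(1, 10)):
    print(n, msg)
```

The first subscript still costs a runtime check; after the assertion, the
same subscript is check-free, and the report carries the proof obligation.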

Notice that there is no architectural implication here, at least not as a
first-order effect.  There are some second-order effects, though:
1.  The architecture must be complete with respect to a useful, high-level
    language, such as Pascal.  The entire program must be within the scope
    of the scheme above for the scheme to be expected to work.

2.  The architecture should be efficient w.r.t. the programming language, 
    so that the compiler writer (and thus the compiler) can spend time and
    effort in implementing the scheme, rather than in just coping with the
    inhospitability of typical architectures.  The compiler must be simple
    enough to have a good chance of being correct (!), and thus, the 
    underlying architecture has to be good for the purpose.

-- 
--Bill Price  uucp:  {Most Anybody}!sdcsvax!bmcg!bprice
              arpa:? sdcsvax!bmcg!bprice@nosc

guy@sun.uucp (Guy Harris) (10/13/85)

> 2.  The architecture should be efficient w.r.t. the programming language, 
>     so that the compiler writer (and thus the compiler) can spend time and
>     effort in implementing the scheme, rather than in just coping with the
>     inhospitability of typical architectures.  The compiler must be simple
>     enough to have a good chance of being correct (!), and thus, the 
>     underlying architecture has to be good for the purpose.

What do you mean by "efficient with respect to the programming language"?
Not all architectural features which "look" like they make it easier to
implement a language on that architecture really help.  Adding features can
result in increasing the complexity of the hardware and the microcode in
exchange for reduced complexity in the compiler, and this may or may not be
a good tradeoff.

Furthermore, it's not even clear that it will simplify the compiler.  Good
optimizing compilers can make a difference even on stack machines, and
possibly even on the iAPX432 - in the article "Computers, Complexity, and
Controversy", by Colwell, Hitchcock, Jensen, Sprunt, and Kollar, in the
September 1985 IEEE Computer, the authors indicate that

	The 432's Ada compiler performs almost no optimization.
	The machine is frequently forced to make unnecessary
	changes to its complex addressing environment, and it often
	recomputes costly, redundant subexpressions.

	Guy Harris