[net.arch] What I miss...

jans@orca.UUCP (Jan Steinman) (09/29/85)

In article <568@unisoft.UUCP> phil@unisoft.UUCP (Phil Ronzone) writes:
>In article <191@graffiti.UUCP> peter@graffiti.UUCP (Peter da Silva) writes:
>>> ...was it not the mariner probe that was lost due to a FORTRAN subscript
>>> error?... for my money I would prefer to see the [range checking] in for
>>> systems like nuclear plants, MX missiles etc..
>>
>> What should the code do when a range-check occurs? Print out an error
>> message on ticker-tape & hang? Do nothing?
>
> ...as a decade-long C programmer, I find about every 3 years some
>``discussion'' with a proponent of a more-or-less highly typed language
>(Ada most recent) over the ``virtues'' of Ada and its error checking over
>(denigration inserted here) C.
>
>Shucks - but I still CAN'T see how much the poor pilot in an F16 with an
>Ada programmed fire-control computer is going to be as a Mig-27 bears down
>on him, and right when he hears the lock-on buzz to fire his own missile --
>
>     Ada runtime error 498: subscript i value 23 out of range for array x (20)
>
>I mean,
>   
>     Bus error - core dumped
>
>reads quicker and thus gives the pilot a faster ``oh shit'' response time
>to take evasive action .... :-) :-)

****** FLAME ALERT FLAME ALERT FLAME ALERT FLAME ALERT FLAME ALERT *****
As a decade-long generalist who enjoys trying and contrasting new things, I
find people who are willing to go public with their lack of knowledge about
things they are unwilling to try truly amazing!

Sorry, Phil, but you really miss the point.  With range checks, the programmer
can **anticipate** such errors.  Ada (in particular) allows multi-level
trapping of such exceptions, so nuclear war does not rest on a single low-level
module's ability to do something sane with exceptions.  I, for one, would much
rather write an exception handler that does something useful and sane than
simply let the OS dump core.
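
For those who have never used a language with exceptions, here is a rough
sketch in C of what multi-level trapping buys you, faked with setjmp/longjmp.
The names and the fallback policy are invented for illustration, and this is
not how an Ada compiler implements exception propagation -- it just shows the
idea of handling a range error at a level that knows what to do about it.

    /* Rough C analogue of multi-level exception handling, faked with
     * setjmp/longjmp.  Names and the fallback policy are invented.
     */
    #include <setjmp.h>
    #include <stdio.h>

    #define TRACKS 20

    static jmp_buf range_error;             /* where to resume on a bad subscript */
    static double range_to_target[TRACKS];  /* static, hence zero-initialized */

    static double read_track(int i)
    {
        if (i < 0 || i >= TRACKS)           /* the "range check" */
            longjmp(range_error, 1);        /* "raise": unwind to the handler */
        return range_to_target[i];
    }

    static double safe_read_track(int i)
    {
        if (setjmp(range_error) == 0)       /* "begin ... exception" */
            return read_track(i);
        /* handler: log it and substitute something sane instead of dumping core */
        fprintf(stderr, "track %d out of range, using last good value\n", i);
        return range_to_target[TRACKS - 1];
    }

    int main(void)
    {
        printf("%f\n", safe_read_track(23));    /* handled; no core dump */
        return 0;
    }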

I've stirred these waters before, and received hate mail for it.  I respect
the opinion of C hackers who have studied Ada enough to be able to identify
its real problems, but I've really lost patience with those who are so
imbued with C that they cannot understand how things could be any different!

One with a decade of experience with stone axes has no concept of how a
chainsaw works.  Phil, go out and USE Ada for a year or two, then I'll listen.
-- 
:::::: Artificial   Intelligence   Machines   ---   Smalltalk   Project ::::::
:::::: Jan Steinman		Box 1000, MS 61-405	(w)503/685-2956 ::::::
:::::: tektronix!tekecs!jans	Wilsonville, OR 97070	(h)503/657-7703 ::::::

brooks@lll-crg.UUCP (Eugene D. Brooks III) (10/02/85)

Could we please keep this discussion in net.ada, net.politics or net.religion.

I subscribed to net.ada for a month a year ago in apology to a ADA nut
for posting the statement "ADA sucks" to the net.  There were a total of two
articles on net.ada that month, which is proof enough that ADA is a language
that is devoid if serious use.  The only people who like it are those who can't
manage to write correct programs and need a crutch like subscript checking even
in a production version of a code.

If you program has a proof of correctness, and it checks its input data
properly, it does not need range checks on subscripts.  Such checking only
slows the computer down.  I don't have spare cycles for such a wast of time.
REAL programmers don't need subscript checking, they write lint free code
automatically.  Please leave your ADA hype on net.ada where no one is bothering
to read it!

peter@graffiti.UUCP (Peter da Silva) (10/02/85)

> One with a decade of experience with stone axes has no concept of how a
> chainsaw works.  Phil, go out and USE Ada for a year or two, then I'll listen.

I'd like to re-ask my question. What do you do in a finished product in a
high-risk environment when an unanticipated bug (anticipated errors will have
been dealt with in both languages if the programmer is worth his pay) occurs?

wdm@ecn-pc.UUCP (William D Michael) (10/03/85)

In article <879@lll-crg.UUCP> brooks@lll-crg.UUCP (Eugene D. Brooks III) writes:
>Could we please keep this discussion in net.ada, net.politics or net.religion.

    I disagree, let's keep it here.  Sorry, but these issues tie in 
    very closely to architecture issues.

>
>I subscribed to net.ada for a month a year ago in apology to a ADA nut
>for posting the statement "ADA sucks" to the net.  There were a total of two
>articles on net.ada that month, which is proof enough that ADA is a language
>that is devoid if serious use.  

    The proof you cite seems to be just a bit weak.  The thousands of 
    programmers working with ADA are pretty good proof that it is here to
    stay.  That doesn't mean you have to like it.  

>The only people who like it are those who can't
>manage to write correct programs and need a crutch like subscript checking even
>in a production version of a code.
>
>If you program has a proof of correctness, and it checks its input data
>properly, it does not need range checks on subscripts.  Such checking only
>slows the computer down.  I don't have spare cycles for such a wast of time.
>REAL programmers don't need subscript checking, they write lint free code
>automatically.  Please leave your ADA hype on net.ada where no one is bothering
>to read it!

    Right -- soft errors (or hard ones for that matter) never happen once
    code reaches production.  Not to mention things like tasks over-
    writing other tasks' data areas and things of that sort.  Admittedly,
    if these things happen you've got problems, but if I were the captain
    of a 747, I would rather have the autopilot tell me to take over because
    it detected a non-recoverable error and was shutting down, than
    to have it attempt a maneuver that would fold the wings like tin foil. 

    In all seriousness, if you don't have the cycles to do
    the things you mention, get a faster processor - it's cheap insurance
    against a lot of real-world perils. 

----------

mff@wuphys.UUCP (Swamp Thing) (10/05/85)

In article <879@lll-crg.UUCP> brooks@lll-crg.UUCP (Eugene D. Brooks III) writes:
>Could we please keep this discussion in net.ada, net.politics or net.religion.
>
>I subscribed to net.ada for a month a year ago in apology to a ADA nut
>for posting the statement "ADA sucks" to the net.  There were a total of two
>articles on net.ada that month, which is proof enough that ADA is a language
>that is devoid if serious use.  The only people who like it are those who can't
>manage to write correct programs and need a crutch like subscript checking even
>in a production version of a code.
>
>If you program has a proof of correctness, and it checks its input data
>properly, it does not need range checks on subscripts.  Such checking only
>slows the computer down.  I don't have spare cycles for such a wast of time.
>REAL programmers don't need subscript checking, they write lint free code
>automatically.  Please leave your ADA hype on net.ada where no one is bothering
>to read it!

The whole point of subscript checking, as far as I'm concerned, is to use it
during development.  I don't see how slowing down the cpu is an issue for that.
On the other hand, if ADA doesn't allow you to turn off checking, I could see
how that would be a pain.  I guess we're not all perfect programmers.  And having
"a" proof of correctness hardly means that a complicated piece of code will
work as intended in all circumstances.

						Mark F. Flynn
						Department of Physics
						Washington University
						St. Louis, MO  63130
						ihnp4!wuphys!mff

------------------------------------------------------------------------------

"There is no dark side of the moon, really.
 Matter of fact, it's all dark."

				P. Floyd

ee178acb@sdcc7.UUCP (DARIN JOHNSON) (10/05/85)

>If you program has a proof of correctness, and it checks its input data
>properly, it does not need range checks on subscripts.  Such checking only
>slows the computer down.  I don't have spare cycles for such a wast of time.
>REAL programmers don't need subscript checking, they write lint free code
>automatically.  Please leave your ADA hype on net.ada where no one is bothering
>to read it!

  Most compilers that do run time checking have switches to turn this
off.  Many large programs would take enormous amounts of time to
prove correct in terms of range checking.  These types of errors
crop up very often when the program is written by a team or when
using pre-compiled modules.  Range checking will spot these errors
in a fraction of the time it takes to pore over your output.  Then,
if you need a lightning quick program, just turn off run time
checking and re-compile.  
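
  C compilers generally offer no such switch, since the language defines no
checks to begin with, but the same develop-checked/ship-unchecked split can
be faked by hand.  A hypothetical sketch (the macro name and the policy on a
bad subscript are invented):

    /* Hypothetical checked-subscript macro: compile with -DRANGE_CHECK
     * during development, leave the flag off for the production build.
     */
    #include <stdio.h>
    #include <stdlib.h>

    #ifdef RANGE_CHECK
    #define AT(a, i, n) \
        ((i) >= 0 && (i) < (n) ? (a)[(i)] : \
         (fprintf(stderr, "subscript %d out of range 0..%d\n", (i), (n) - 1), \
          abort(), (a)[0]))
    #else
    #define AT(a, i, n) ((a)[(i)])      /* production: no check, no cost */
    #endif

    int main(void)
    {
        int x[20];
        int i;

        for (i = 0; i < 20; i++)
            x[i] = i;
        printf("%d\n", AT(x, 5, 20));
        /* AT(x, 23, 20) would abort with a message when RANGE_CHECK is
         * defined, and silently read past the array when it is not.
         */
        return 0;
    }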

  I am curious why this attack was on Ada, when there are very few
languages that don't have some form of range checking, etc.  Off the
top of my head, Assemblers and C are the only ones I can think of in
which I have never seen run time checks.  

  REAL programmers wouldn't be so naive as to think that they were
always perfect.


  Darin Johnson
  UCSD 

mash@mips.UUCP (John Mashey) (10/05/85)

A long sequence of discussion got started by:
> In article <191@graffiti.UUCP> peter@graffiti.UUCP (Peter da Silva) writes:
> >> was it not the mariner probe that was lost due to a FORTRAN subscript error?
> >What should the code do when a range-check occurs? Print out an error message
> >on ticker-tape & hang? Do nothing? A better analogy, perhaps, would be...
> >
> >	"...like practicing sailing on shore with a mechanic [safety harness]
> >and leaving it on shore come the moment..."
> >
> >...you no longer have anything to attach them to.

1) Like everything else, doing subscript-checks and uninitialized-data checks
at run-time is a tradeoff: is it worth the cost or not?  Some of the arguments
that have been going on here fall into the domain-confusion problem, most
of which would go away if people prefaced what they say by noting what
domain it applies to.  For example, subscript-checking might not be worthwhile
in a PC games program.  It might be elsewhere.

2) Good compiler technology can do fairly well at giving subscript-checking and
(some) uninitialized-variable checking for nearly free; any good global
dataflow optimizer already has most of the data to do this (a small example
follows below).

3) It is unrealistic to think that systems must give up and die just because
they discover an internal error. I'm sure there are many counterexamples,
but there is one obvious one whose existence is familiar to most people,
i.e., the telephone system's electronic switching machines.  They commonly
use error-detection/repair strategies that have multiple levels, i.e.,
if something minor goes wrong, they don't worry too much about zapping
one conversation; as damage gets progressively worse, they might work all the
way up to a full reboot (anathema).  This approach is 20 years old; a recent
reference appeared in the latest AT&T Technical Journal on #5ESS.  I urge people
to study old technology and uses thereof before claiming nonexistence.
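
To make point 2 concrete, here is a small example (hypothetical, and compiler
behavior will vary) of the kind of check a global dataflow optimizer can
prove redundant and delete:

    /* If the compiler can prove 0 <= i < n and n <= 100 on every path,
     * an implicit per-access check that a[i] stays inside a[0..99] is
     * dead code and can be deleted (or hoisted to a single test).
     */
    double sum(double a[100], int n)
    {
        double s = 0.0;
        int i;

        if (n > 100)        /* one explicit test up front ...            */
            n = 100;
        for (i = 0; i < n; i++)
            s += a[i];      /* ... makes a check here provably redundant */
        return s;
    }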
-- 
-john mashey
UUCP: 	{decvax,ucbvax,ihnp4}!decwrl!mips!mash
DDD:  	415-960-1200
USPS: 	MIPS Computer Systems, 1330 Charleston Rd, Mtn View, CA 94043

jans@orca.UUCP (Jan Steinman) (10/06/85)

brooks@lll-crg.UUCP (Eugene D. Brooks III) writes:
>...I subscribed to net.ada for a month a year ago in apology to a ADA nut
>for posting the statement "ADA sucks" to the net.

You really have your own version of reality, don't you!  I was that "Ada nut"
who chastised you for your statement.  What I said then was not especially
nut-like, and was not much different from where I have taken this discussion
when I edited the subject line: keep an open mind -- Ada is not the end of
language development, but it has some interesting and useful things.

>There were a total of two articles on net.ada that month, which is proof
>enough that ADA is a language that is devoid if serious use.

We netters are an incestuous lot, most of us running on machines that run
little but 'C' code.  If you look again, you'll notice that net.lang.ada
has been gatewayed to ARPANET, and now has dozens of articles each time I
look.

>The only people who like it are those who can't manage to write correct
>programs and need a crutch like subscript checking even in a production
>version of a code.

For a real laugh, put this in your resume and apply to any DOD contractor.
They like it because the government has said they will use it.  Their
programmers like it because (according to Source EDP, a recruitment firm)
it is worth 20% more pay.  And I won't even get into why their evaluators,
porters, and QC people like it, but it's mostly due to enhanced reliability
and maintainability.

>If you program has a proof of correctness, and it checks its input data
>properly, it does not need range checks on subscripts.  Such checking only
>slows the computer down.  I don't have spare cycles for such a wast of time.

If you (sic) programming is as good as your spelling, don't wast (sic) your
time programming!  No, I'm not just being sarcastic -- the point is that the
best of us (even REAL programmers) make mistakes.  Sure, I know you were
just writing quickly.  How would you like to be able to program that quickly
and have the language catch your mistakes, instead of some obnoxious netter?

>REAL programmers don't need subscript checking, they write lint free code
>automatically.  Please leave your ADA hype on net.ada where no one is
>bothering to read it!

Ada has many architectural features besides subscript checking that are
certainly of interest to open minded people.  When was the last time you
wrote tasking code in 'C'?  Did you use sockets or pipes?  How many trips
to the manual pages did it take?  How many times have you struggled with
'setjmp' and 'longjmp' in order to return to a non-local routine after
encountering exceptional circumstances?  The architectural features of Ada,
and the issues of implementing them, are some of the hottest topics around,
if you know where they're being discussed and care to enlighten yourself!
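
For contrast, here is roughly the plumbing it takes to get two cooperating
"tasks" talking over a pipe in C on a UNIX system -- a bare-bones sketch, and
only one of several ways to fake tasking:

    /* "Tasking" in C the hard way: a child process created with fork()
     * sends one message back to its parent over a pipe.
     */
    #include <stdio.h>
    #include <unistd.h>
    #include <sys/wait.h>

    int main(void)
    {
        int fd[2];
        char buf[64];
        ssize_t n;

        if (pipe(fd) == -1) {
            perror("pipe");
            return 1;
        }
        switch (fork()) {
        case -1:
            perror("fork");
            return 1;
        case 0:                             /* child: the "task" */
            close(fd[0]);
            write(fd[1], "target acquired", sizeof "target acquired");
            close(fd[1]);
            _exit(0);
        default:                            /* parent */
            close(fd[1]);
            n = read(fd[0], buf, sizeof buf - 1);
            if (n > 0) {
                buf[n] = '\0';
                printf("child says: %s\n", buf);
            }
            close(fd[0]);
            wait(NULL);                     /* reap the child */
        }
        return 0;
    }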
-- 
:::::: Artificial   Intelligence   Machines   ---   Smalltalk   Project ::::::
:::::: Jan Steinman		Box 1000, MS 61-405	(w)503/685-2956 ::::::
:::::: tektronix!tekecs!jans	Wilsonville, OR 97070	(h)503/657-7703 ::::::

brooks@lll-crg.ARpA (Eugene D. Brooks III) (10/06/85)

Re: Use of subscript checking during program development.

I guess I did not make myself clear concerning this with all the anti-ADA
bigotry confusing the real issue.

Subscript checking, pointer checking (i.e., a tag associated with a pointer
returned by malloc that can be used to check for overrun of the allocated area),
and all that are very useful during program development.  These are very useful
tools and any serious programmer uses them during the development of a code.
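
For the record, here is a minimal sketch of the kind of malloc tagging
described above (the names are invented and alignment of the user area is
glossed over; a real tool would track much more):

    /* Development-time tagged allocator: each block records its size, and
     * a sentinel word is planted just past the end so an overrun can be
     * detected later.
     */
    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    #define SENTINEL 0xDEADBEEFUL

    struct tag { size_t size; };

    void *dbg_malloc(size_t n)
    {
        unsigned long s = SENTINEL;
        struct tag *t = malloc(sizeof *t + n + sizeof s);

        if (t == NULL)
            return NULL;
        t->size = n;
        memcpy((char *)(t + 1) + n, &s, sizeof s);  /* plant the sentinel */
        return t + 1;                               /* hand out the user area */
    }

    int dbg_check(void *p)      /* 1 if the block looks intact, 0 if overrun */
    {
        struct tag *t = (struct tag *)p - 1;
        unsigned long s;

        memcpy(&s, (char *)p + t->size, sizeof s);
        return s == SENTINEL;
    }

    int main(void)
    {
        char *buf = dbg_malloc(8);

        if (buf == NULL)
            return 1;
        strcpy(buf, "1234567");             /* 7 characters + NUL: fits */
        printf("intact: %d\n", dbg_check(buf));
        strcpy(buf, "123456789");           /* overruns the 8-byte area */
        printf("intact: %d\n", dbg_check(buf));
        return 0;
    }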

Since these things are program development tools, where speed is not an issue,
they should be implemented in software and do not need any hardware support.
Hence the discussion does not belong in net.arch.

All of this stuff about a subscript range check appearing on the console of an
F16 fire control system to save the pilot's rear in a dogfight is simply too
absurd to comment on, and I hope that more of it does not appear on net.arch.

tombre@crin.UUCP (Karl Tombre) (10/07/85)

>>The only people who like it are those who can't
>>manage to write correct programs and need a crutch like subscript checking even
>>in a production version of a code.
>>
>>If you program has a proof of correctness, and it checks its input data
>>properly, it does not need range checks on subscripts.  Such checking only
>>slows the computer down.  I don't have spare cycles for such a wast of time.
>>REAL programmers don't need subscript checking, they write lint free code
>>automatically.  Please leave your ADA hype on net.ada where no one is bothering
>>to read it!
>
>    Right -- soft errors (or hard ones for that matter) never happen once
>    code reaches production.  Not to mention things like tasks over-
>    writing other tasks data areas and things of that sort.  Admittedly,
>    if these things happen you've got problems, but if I were the captain
>    of a 747, I would rather have the autopilot tell me to take over because
>    it detected a non-recoverable error and was shutting down, than
>    to have it attempt a manuever that would fold the wings like tin foil. 
>

It is difficult for me to understand how people can be so proud of their
favorite language that they do not see its weaknesses, or understand that in
some applications another language would do much better.  I myself program
mostly in C, but I am convinced that for some kinds of applications other than
my own, ADA would be much better (and for others LISP, and so on).
Saying that REAL programmers don't need subscript checking because they
write lint free code automatically seems a very arrogant position to me.
Beware! Some day you might be bogged down in a problem too complex to solve
without help from range checking and such things. One main problem with C is
its lack of abstraction, and in very large projects I would recommend ADA.

No language is that good, nor that bad!
-- 
--- Karl Tombre @ CRIN (Centre de Recherche en Informatique de Nancy)
UUCP:    ...!vmucnam!crin!tombre  or    ...!inria!crin!tombre
COSAC:   crin/tombre
POST:    Karl Tombre, CRIN, B.P. 239, 54506 VANDOEUVRE CEDEX, France

"Car le plus lourd fardeau, c'est d'exister sans vivre."
                                  (Victor Hugo)

jer@peora.UUCP (J. Eric Roskos) (10/07/85)

> REAL programmers don't need subscript checking, they write lint free code
> automatically.  Please leave your ADA hype on net.ada where no one is
> bothering to read it!

Aside from the fact that the above might not be serious, it, and a couple of
other recent postings, seem to reflect a common misunderstanding.  Just
because ADA uses "subscript checking", strong typing, etc., doesn't mean
these things are somehow peculiar to ADA; it's just that ADA attempts to
employ a lot of things that have recently come to be regarded as good
things to have.  This discussion is about things related to type checking,
detection of addresses that do not name valid objects, detection of
uninitialized values, etc., and only incidentally is related to ADA because
ADA happens to employ these principles to some extent.
-- 
Shyy-Anzr:  J. Eric Roskos
UUCP: Ofc:  ..!{decvax,ucbvax,ihnp4}!vax135!petsd!peora!jer
     Home:  ..!{decvax,ucbvax,ihnp4}!vax135!petsd!peora!jerpc!jer
  US Mail:  MS 795; Perkin-Elmer SDC;
	    2486 Sand Lake Road, Orlando, FL 32809-7642

jer@peora.UUCP (J. Eric Roskos) (10/08/85)

> I'd like to re-ask my question.  What do you do in a finished product in a
> high-risk environment when an unanticipated bug (anticipated errors will
> have been dealt with in both languages if the programmer is worth his pay)
> occurs?

Then, I'd like to re-answer it.  The idea here is that you want to try to
design your product in such a way that if errors occur, you will recover from
them.  Here there's sort of a problem with the term "unanticipated errors".
For example, suppose you have some flight-control system for a missile, and
an "unanticipated error" occurs, so the missile goes off course.  Well, you
would like, then, to have some other system monitoring the trajectory of
the missile, that says, "the missile is off course... I'll just disarm the
warhead, here", or maybe starts up a redundant guidance system, or something
like that.

The problem is that if you do this right, there shouldn't BE any unanticipated
errors; an unanticipated error would be something like if the
laws of physics quit working.  How well you design your system determines
how well you accomplish this; but the various forms of exception handling,
etc. that we have been discussing are supposed to make this easier by
allowing your program both to discover certain types of errors, and to remain
in control when these errors occur (rather than producing some error message
and halting, as some people have suggested).

As you said, in "both languages" (I don't remember what the other one was)
such a problem can be handled; the newer approaches (exception handlers and
the like) just try to make this easier, to make it less likely that the
programmer will do it wrong.
-- 
Shyy-Anzr:  J. Eric Roskos
UUCP: Ofc:  ..!{decvax,ucbvax,ihnp4}!vax135!petsd!peora!jer
     Home:  ..!{decvax,ucbvax,ihnp4}!vax135!petsd!peora!jerpc!jer
  US Mail:  MS 795; Perkin-Elmer SDC;
	    2486 Sand Lake Road, Orlando, FL 32809-7642

throopw@rtp47.UUCP (Wayne Throop) (10/09/85)

> I'd like to re-ask my question. What do you do in a finished product in a
> high-risk environment when an unanticipated bug (anticipated errors will have
> been dealt with in both languages if the programmer is worth his pay) occurs?

I don't have an answer (and I suspect there aren't any unique answers),
but I'd like to point out that this doesn't seem to be either an
architectural issue, nor a language issue.  In any language, on any
architecture, irrecoverable errors will occur (that is, fundamental
assertions about the state of the world that are necessary to allow the
program to proceed will be violated).

When this happens, the acceptable response will vary according to the
situation.  If a critical subroutine traps, the process might attempt to
recover (perhaps re-executing the subroutine after some fixup action).
If a critical process "core dumps", the system might attempt to recover
(perhaps by restarting the process at some checkpoint or other).  If a
critical process can't be restarted, the system might try to recover by
rebooting.  And so on and on.

But when push comes to shove, there will be some cases that just can't
be handled.  So, my fuzzy answer to "What do you do in a finished
product in a high-risk environment when an unanticipated bug occurs?" is
"The best you can".
-- 
Wayne Throop at Data General, RTP, NC
<the-known-world>!mcnc!rti-sel!rtp47!throopw

ludemann@ubc-cs.UUCP (Peter Ludemann) (10/10/85)

In article <272@graffiti.UUCP> peter@graffiti.UUCP (Peter da Silva) writes:
>I'd like to re-ask my question. What do you do in a finished product in a
>high-risk environment when an unanticipated bug (anticipated errors will have
>been dealt with in both languages if the programmer is worth his pay) occurs?

It's all quite simple.  I assume you're working in a multi-tasking
environment.  There's a parent task which starts up a family of
child processes.  The parent then does nothing but wait for a child
to die.

When a child dies, the operating system puts some information into
a log about the cause of death, sends a death message to the process's
parent and tidies up all the resources owned by the dead process.
Most likely, the parent, on receipt of the death message, just 
starts up a new child process.

Every so often, people come and read the error log to see if anything
has been going wrong.  Nobody (except programmers) ever sees 
"array subscript error at line 123 in procedure xyz".  And NEVER does
one see "bus error - core dumped" as some well-known systems do.
The user will probably only see a temporary degradation in the
performance of the system - which is much better than the system
going completely flakey because one process has gone outside an
array and clobbered something.
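
A bare-bones sketch of that pattern in UNIX C terms (the child's "work", the
restart limit, and the log messages here are only placeholders):

    /* Parent restarts its child when the child dies.  A real system would
     * log through a proper facility and rate-limit restarts.
     */
    #include <stdio.h>
    #include <stdlib.h>
    #include <unistd.h>
    #include <sys/wait.h>

    static void do_child_work(void)
    {
        srand((unsigned)getpid());
        sleep(1);                   /* stand-in for the real service */
        if (rand() % 4 == 0)
            abort();                /* simulate an unanticipated failure */
        exit(0);
    }

    int main(void)
    {
        int restarts;

        for (restarts = 0; restarts < 5; restarts++) {
            pid_t pid = fork();
            int status;

            if (pid == -1) {
                perror("fork");
                return 1;
            }
            if (pid == 0)
                do_child_work();    /* never returns */
            if (waitpid(pid, &status, 0) != pid)
                continue;
            /* "puts some information into a log about the cause of death" */
            if (WIFSIGNALED(status))
                fprintf(stderr, "child %d died on signal %d\n",
                        (int)pid, WTERMSIG(status));
            else
                fprintf(stderr, "child %d exited, status %d\n",
                        (int)pid, WEXITSTATUS(status));
        }
        return 0;
    }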
-- 
-- Peter Ludemann
	ludemann@ubc-cs.uucp (ubc-vision!ubc-cs!ludemann)
	ludemann@cs.ubc.cdn  (ludemann@cs.ubc.cdn@ubc.mailnet)
	ludemann@ubc.csnet   (ludemann%ubc.csnet@CSNET-RELAY.ARPA)

jer@peora.UUCP (J. Eric Roskos) (10/11/85)

> Since these things are program development tools, where speed is not an
> issue, they should be implemented in software and do not need the any
> hardware support.  Hence the discussion does not belong in net.arch.

I think our compiler-writers would probably disagree with you... certain
types of checking (e.g., checking for addresses in a certain range, which
I suggested back at the beginning of this discussion) are enormously
difficult to do without hardware support... I have heard of debuggers that
had to completely emulate the instruction set in order to do this.
-- 
Shyy-Anzr:  J. Eric Roskos
UUCP: Ofc:  ..!{decvax,ucbvax,ihnp4}!vax135!petsd!peora!jer
     Home:  ..!{decvax,ucbvax,ihnp4}!vax135!petsd!peora!jerpc!jer
  US Mail:  MS 795; Perkin-Elmer SDC;
	    2486 Sand Lake Road, Orlando, FL 32809-7642

barmar@mit-eddie.UUCP (Barry Margolin) (10/13/85)

In article <210@rtp47.UUCP> throopw@rtp47.UUCP (Wayne Throop) writes:
>...  So, my fuzzy answer to "What do you do in a finished
>product in a high-risk environment when an unanticipated bug occurs?" is
>"The best you can".

Your posting made some good points, but I would like to elaborate on
your simple summation.  In my opinion, and I suspect also those of the
designers of fancy condition-handling mechanisms, the answer is "the
best that the language and architecture permit."  Pascal, COBOL, and C,
as far as I know, provide the programmer with very little capability to
detect problems and deal with them; programs will just abort when some
conditions occur, and there is nothing that can be done to automatically
determine why, in order to decide what action to take.  In PL/I, Ada,
some BASICs, and CLU there are relatively powerful condition mechanisms,
which permit the program to recognize many abnormal states.  Yes, there
will always be situations in which this will fail; for instance, the
stack might be screwed up due to an assignment through a busted pointer.
But it is best if "the best you can" translates to "detect most problems
and deal with them appropriately."
-- 
    Barry Margolin
    ARPA: barmar@MIT-Multics
    UUCP: ..!genrad!mit-eddie!barmar

weltyrp@rpics.UUCP (Richard Welty) (10/21/85)

brooks@lll-crg.UUCP (Eugene D. Brooks III) writes:
>The only people who like it are those who can't manage to write correct
>programs and need a crutch like subscript checking even in a production
>version of a code.
 
>REAL programmers don't need subscript checking, they write lint free code
>automatically.  Please leave your ADA hype on net.ada where no one is
>bothering to read it!

*flame on*

I am not a big fan of ADA, but the preceding is a load of garbage.

rule 1:  Large systems have bugs

rule 2:  the more checking the language system does for you, either
         at run time or at compile time, the better off you are

rule 3:  the earlier that the system catches a bug (compile time is
         best), the better ...

Correctness proofs are a nice idea, and worthy of many research
dollars, but are far from being able to deal with the problems
that developers of large systems have today.

While I consider C to be more generally useful than Pascal
(and Ada, at the current time), there are things about Pascal that
I miss a great deal ... and every time I find a bug in my C code
that Pascal would have flagged at compile time, I miss Pascal more
(and as a VMS C programmer, I don't have access to lint -- I wish
I did).
-- 
				Rich Welty

	(I am both a part-time grad student at RPI and a full-time
	 employee of a local CAE firm, and opinions expressed herein
	 have nothing to do with anything at all)

	CSNet:   weltyrp@rpi
	ArpaNet: weltyrp.rpi@csnet-relay
	UUCP:  seismo!rpics!weltyrp