[comp.lang.c] "C" vs. ADA

harman@vu-vlsi.UUCP (Glen Harman) (08/18/87)

Hello, sorry to interrupt your regularly scheduled news articles, but I
didn't know where else to turn...

I am a senior EE student whose current career goal is to work in an aerospace
and/or military research field.  To better my chances, I would like to 
supplement my Fortran skills with another language.  "C" has been recommended
to me many times, and I was just about to buy a manual when I started hearing 
about ADA.

I have heard that it is the DoD language, but what does that mean?  Are all
aerospace and military contractors required to use it?  Is it suggested
learning for the major corporate engineers?  Is it filtering down into the
public engineering sectors?  Is it too specialized to be applied elsewhere if
I didn't get the desired job?

Being that I am relatively unfamiliar with what is being used in these fields,
I am turning to those of you in the know.  If you have any thoughts on the pros
and cons of either language, and/or would care to make a suggestion, I would
greatly appreciate it. If you could, please recommend a book on the subject. 

Please send replies to:  {cbmvax, pyrnj, bpa}!vu-vlsi!harman

Thank you for your help!

Glenvar Harman

spf@clyde.UUCP (08/18/87)

In article <1065@vu-vlsi.UUCP> harman@vu-vlsi.UUCP (Glen Harman) writes:
>supplement my Fortran skills with another language.  "C" has been recommended
>to me many times, and I was just about to buy a manual when I started hearing 
>about ADA.

Learn them both.  C is the assembly language, and Ada the High Order
Language (HOL), of the next ten years in the DoD community.  The DoD
doesn't much like C from a lifecycle point of view, but has trouble
denying its availability and current performance advantage over Ada
(just like assembly with respect to FORTRAN 20+ years ago).

Steve Frysinger

---
Why would I waste my time expressing someone else's opinion?

glg@sfsup.UUCP (G.Gleason) (08/18/87)

In article <1065@vu-vlsi.UUCP> harman@vu-vlsi.UUCP writes:
>
>Hello, sorry to interrupt your regularly scheduled news articles, but I
>didn't know where else to turn...
>
>I am a senior EE student whose current career goal is to work in an aerospace
>and/or military research field.  To better my chances, I would like to 
>supplement my Fortran skills with another language.  "C" has been recommended
>to me many times, and I was just about to buy a manual when I started hearing 
>about ADA.

This message does not really belong in the "space" groups; please limit
the number of newsgroups you post to.

This is a dumb question.  If you are really interested in learning about
EE and computers, etc., you should be interested in learning many
languages.  C is a good place to start, but any of the block-structured
languages will do, and C is probably one of the most widespread.  I'm
sure you will eventually need to learn ADA to do the work you are planning.
I have not yet needed to use it, but as I understand it, it is pretty
complex, and probably not a good choice if you don't already know other
languages.  A better idea is to choose a language that is available to you
so you can learn by doing.

Gerry Gleason

jru@etn-rad.UUCP (John Unekis) (08/18/87)

In article <1065@vu-vlsi.UUCP> harman@vu-vlsi.UUCP (Glen Harman) writes:
>
>Hello, sorry to interrupt your regularly scheduled news articles, but I
>didn't know where else to turn...
>
>I am a senior EE student whose current career goal is to work in an aerospace
>and/or military research field.  To better my chances, I would like to 
>supplement my Fortran skills with another language.  "C" has been recommended
>to me many times, and I was just about to buy a manual when I started hearing 
>about ADA.
>
>I have heard that it is the DoD language, but what does that mean?  Are all
>aerospace and military contractors required to use it?  Is it suggested
>learning for the major corporate engineers?  Is it filtering down into the
>public engineering sectors?  Is it too specialized to be applied elsewhere if
>I didn't get the desired job?
>

	  The Ada language is far more than just a language. Ada includes
	  standards for editors, compilers, and run-time symbolic debuggers.

	  The C language evolved at AT&T in the process of developing the
	  UNIX operating system. There were, believe it or not, an A language
	  and a B language that preceded it. Finally with the C language the
	  original developer of the UNIX operating system (which was done on
	  a PDP-7 microcomputer) felt that he had what he wanted. It was a
	  moderately structured language, with a syntax that was similar
	  to the UNIX c shell (or vice versa). As UNIX gained wide acceptance
	  the C language became more popular. It has the advantage over
	  FORTRAN of having structured variable types, but without the
	  overly elaborate type checking done by a language like PASCAL.
	  It does share with Pascal the fact that each was developed by a
	  single individual and thus represents that individual's prejudices
	  in computer languages. C is now widely available outside the
	  UNIX community and is a de facto standard with many companies.
	  It is often the case in military/aerospace procurements that in
	  order to generalize the request for computer hardware so as not
	  to sole source a single vendor the government will ask for UNIX
	  because it is the only operating system that can be had on a
	  non-proprietary basis on a variety of computer hardware. UNIX of
	  course brings C right along with it.
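
	  To make the typing point above concrete, here is a minimal,
	  hypothetical sketch (illustrative names only, not from any
	  particular vendor's manual); any reasonably current C compiler
	  should accept it:

	  #include <stdio.h>

	  /* A structured variable type - something FORTRAN 77 has no
	     direct way to express.  Member access is checked by name. */
	  struct position {
	      double lat, lon;        /* degrees */
	      long   alt;             /* feet */
	  };

	  int main(void)
	  {
	      struct position here;

	      here.lat = 39.0;
	      here.lon = -76.5;
	      here.alt = 10000L;

	      printf("%.1f %.1f %ld\n", here.lat, here.lon, here.alt);
	      return 0;
	  }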

	  Because UNIX does not handle real-time applications (such as
	  interrupt handling) very well, and because there was no non-
	  proprietary standard for real-time operating systems, the
	  government (DOD) wanted to develop such a standard. Also, the DOD
	  had a problem with each branch of the service having its own
	  incompatible languages for software development (COBOL, JOVIAL,
	  CMS-2, FORTRAN, C, etc.). It was decided that the DOD would
	  develop a standard computer language that would include standards
	  for editors, compilers, run-time debuggers, and even operating
	  system functions for real-time processing. This standard was
	  named ADA (the name of the mistress of Charles Babbage, who
	  invented a punched card driven loom, considered to be the first
	  computer; she was rumored to be the first person ever to write
	  a program on punched cards - why her name is appropriate for a
	  real-time language is a mystery). If you are interested in the
	  military/aerospace field then ADA is definitely the language to
	  learn. Be aware that it is a very complex language (Carnegie
	  Mellon University is rumored to have required it for all
	  sophomores - which resulted in flunking out half their sophomore
	  class) and that to learn it properly you must find a DOD-certified
	  implementation which includes the editors, compilers, and debuggers
	  as well. The DOD plans eventually to require ALL software to be done
	  in ADA, but they realize that there is enormous inertia against it.
	  Thousands of programmers must be retrained, and millions of lines
	  of code converted. Don't expect to see ADA used very widely outside
	  of the DOD environment. It will fail for the same reason that
	  Pascal, Modula2, C, PL1, and others have failed - IBM is the
	  dominant force in the commercial market (~75 percent of all
	  commercial installations) and COBOL dominates the IBM installed
	  base (~90 percent of IBM applications are in COBOL). As long as
	  computers remain basically von Neumann processors, no language is
	  going to offer any advantages in the real world to a language
	  like COBOL. No business is going to go through the 3-to-5-year
	  effort of retraining and converting existing code just to
	  satisfy the dogmatic prejudices of computer-science weenies.

	  The DOD is perfectly capable, however, of making contractors like
	  Boeing, Lockheed, TRW, Eaton, etc. jump through the ADA hoop just
	  by refusing to accept bids which do not include ADA. Therefore if
	  you want a career in military/aerospace, go for ADA.


---------------------------------------------------------------
ihnp4!wlbr!etn-rad!jru   - The opinions above were mine when I
thought of them, by tomorrow they may belong to someone else.

daveh@cbmvax.UUCP (Dave Haynie) (08/18/87)

in article <1065@vu-vlsi.UUCP>, harman@vu-vlsi.UUCP (Glen Harman) says:
> I am a senior EE student whose current career goal is to work in an aerospace
> and/or military research field.  To better my chances, I would like to 
> supplement my Fortran skills with another language.  "C" has been recommended
> to me many times, and I was just about to buy a manual when I started hearing 
> about ADA.
> 
> Is it suggested learning for the major corporate engineers?  Is it
> filtering down into the public engineering sectors?  Is it too specialized
> to be applied elsewhere if I didn't get the desired job?

The biggest problems with Ada are the massive size of its compiler and the
verbosity of its language.  I guess if you really like Pascal or Modula-2,
you might adjust, but if you're used to C, it might take some getting used
to.  Coming from a Fortran background, any modern language would be a step
up, certainly.  The compiler size is a concern when it comes to relying on
that compiler to produce accurate code.  It's certainly simpler to produce
an accurate compiler for a small language like C or Pascal than for a very
large one like Ada.

As for where it's used, mainly DOD I'd guess.  It certainly isn't used much,
if any, in commercial or industrial sectors.  C's the language for most of
these, though your Fortran experience could come in handy in some heavy-
duty scientific fields (most machines have Fortran compilers that code better
than existing C compilers, but since there's more work being done on C, I
wouldn't be surprised if this is changing).  Ada is certainly general purpose
enough to be used elsewhere if you have access to a compiler for it, and it
generally has a lot of things built into it that you have to add to the C
language (like tasks, exceptions, etc.).
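
To illustrate that last point: Ada's built-in exceptions are usually
imitated in C with the standard setjmp/longjmp calls.  A minimal sketch
(the function and variable names are invented for the example):

    #include <stdio.h>
    #include <setjmp.h>

    static jmp_buf handler;             /* where to resume after a "raise" */

    static void do_work(int n)
    {
        if (n < 0)
            longjmp(handler, 1);        /* "raise" the exception */
        printf("did work on %d\n", n);
    }

    int main(void)
    {
        if (setjmp(handler) == 0) {     /* the "begin" block */
            do_work(5);
            do_work(-1);                /* raises */
            do_work(7);                 /* never reached */
        } else
            printf("exception handled\n");  /* the "when others" part */
        return 0;
    }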

> Being that I am relatively unfamiliar with what is being used in these fields,
> I am turning to those of you in the know.  If you have any thoughts on the pros
> and cons of either language, and/or would care to make a suggestion, I would
> greatly appreciate it. If you could, please recommend a book on the subject. 

Any language you learn will help you when the next one comes along.  If you're
not pressed for time, the best thing to start off with would probably be a 
good book on general computer science; there's a lot more to this than what
you've seen in Fortran.  A book I would recommend is "Fundamental Structures
of Computer Science", by Wulf, Shaw, Hilfinger, and Flon; Addison-Wesley, 1981.
I studied EE and CS in college; in school I had used mainly Pascal and LISP,
and some C, SNOBOL, and APL.  My first REAL summer job required that I learn
PL/M-80; my first job after graduation required that I learn ISPS, Bliss,
VAX Macro Assembler, and Fortran.  If you know the things to expect in any
language, you can pick up new, unexpected ones rather quickly.

> Please send replies to:  \!{cbmvax, pyrnj, bpa }\!vu-vlsi\!harman
> 
> Thank you for you help!
> 
> Glenvar Harman
-- 
Dave Haynie     Commodore-Amiga    Usenet: {ihnp4|caip|rutgers}!cbmvax!daveh
"The A2000 Guy"                    PLINK : D-DAVE H             BIX   : hazy
     "I'd rather die while I'm living, than live while I'm dead"
						-Jimmy Buffett

ark@alice.UUCP (08/19/87)

In article <12513@clyde.ATT.COM>, spf@clyde.UUCP writes:
> Learn them both.  C is the assembly language, and Ada the High Order
> Language (HOL), of the next ten years in the DoD community.

Ada is the Cobol of the 70's.

steve@gondor.psu.edu (Stephen 2. Williams) (08/20/87)

In article <7203@alice.UUCP> ark@alice.UUCP writes:
>Ada is the Cobol of the 70's.


Sounds about right.

--Steve

jef@unisoft.uucp (Jef Poskanzer) (08/21/87)

In the referenced article, ark@alice.UUCP wrote:
>In article <12513@clyde.ATT.COM>, spf@clyde.UUCP writes:
>> Learn them both.  C is the assembly language, and Ada the High Order
>> Language (HOL), of the next ten years in the DoD community.
>
>Ada is the Cobol of the 70's.

C is the FORTRAN of the '90s.
---
Jef

    Jef Poskanzer  unisoft!jef@ucbvax.Berkeley.Edu  ...ucbvax!unisoft!jef
                      Any opposing views can go to hell.

                     ...and now, a word from our sponsor:
    "The opinions expressed are those of the author and do not necessarily
       represent those of UniSoft Corp, its staff, or its management."

agnew@trwrc.UUCP (R.A. Agnew) (08/21/87)

In article <12513@clyde.ATT.COM>, spf@moss.ATT.COM writes:
> In article <1065@vu-vlsi.UUCP> harman@vu-vlsi.UUCP (Glen Harman) writes:
> The DoD doesn't much like C from a lifecycle point of view, but has trouble
> denying its availability and current performance advantage over Ada
> (just like assembly with respect to FORTRAN 20+ years ago).
> 
> Steve Frysinger

What performance advantage?? The DEC Vax Ada compiler generates tighter code than the 
Vax C compiler (no slouch) not to mention the fact that I generate code 5 to 10
times faster in Ada due to problem level abstraction and re-use.

webber@brandx.rutgers.edu (Webber) (08/21/87)

In article <7203@alice.UUCP>, ark@alice.UUCP writes:
> In article <12513@clyde.ATT.COM>, spf@clyde.UUCP writes:
> > Learn them both.  C is the assembly language, and Ada the High Order
> > Language (HOL), of the next ten years in the DoD community.
> 
> Ada is the Cobol of the 70's.

I always thought Ada was the PL/1 of the 80's.

[Anyway, it wasn't until the summer of 1979 that the Preliminary Ada
Reference Manual was available for public comment, and certainly all
of the environment that people are now claiming is part of the Ada
standard is a creation of the 80's.]

------ BOB (webber@aramis.rutgers.edu ; rutgers!aramis.rutgers.edu!webber)

steve@gondor.psu.edu.UUCP (08/21/87)

In article <253@etn-rad.UUCP> jru@etn-rad.UUCP (0000-John Unekis) writes:

>	                                                          It was a
>	  moderately structured language, with a syntax that was similar
>	  to the UNIX c shell (or vice versa).

Vice versa, or so BSD claims.  If you ask me, however, I can't see much
of a connection between csh and C.  ...just an opinion.

>
>	  Because UNIX does not handle real-time applications (such as
>	  interrupt handling) very well,

Here I agree, but not because of what unix is, but because of how unix is
implemented.  Interrupts are not all that clumsy to deal with in unix, and
there is actually even some structure and cleanliness here, but the process
structure of the unix kernel isn't so clean.  A context switch is made a bit
messy by the fact that a process has two modes: user mode and kernel mode.
An interrupt does not switch the context as it could; instead the process
in kernel mode must make the decision.  I can imagine a UNIX that
would handle interrupts efficiently, but the internals would be different.

>	                                                       As long as
>	  computers remain basically von Neumann processors, no language is
>	  going to offer any advantages in the real world to a language
>	  like COBOL. No business is going to go through the 3-to-5-year
>	  effort of retraining and converting existing code just to
>	  satisfy the dogmatic prejudices of computer-science weenies.

Let's hear it for Hypercubes and connection machines!
Also, being a computer-science weenie, I would like to say that the idea
is not to break people of the COBOL blues, but to start the new sites off
in the right direction!  (P.S.  I read in SIGOPS recently about some neat
unix [they call it AIX] stuff IBM is doing in Texas.)

>	                                                       Therefore if
>	  you want a career in military/aerospace, go for ADA.

Well, I guess so.  I'm not qualified to judge.  I suppose ADA wouldn't
hurt your career.


--Steve

psuvax1!{ihnp4, moby!nssc}!journey!steve

smvorkoetter@watmum.waterloo.edu (Stefan M. Vorkoetter) (08/21/87)

In article <253@etn-rad.UUCP> jru@etn-rad.UUCP (0000-John Unekis) writes:
>	  The Ada language is far more than just a language. Ada includes
>	  standards for editors, compilers, and run-time symbolic debuggers.

I have yet to see any kind of documentation for an Ada editor, compiler, or
run-time debugger.  The only document from the DoD that I am aware of is
the Ada Language Reference Manual.

>	                         There were, believe it or not, an A language
>	  and a B language that preceded it.

The way I understand it, and from what I have read, there was first BCPL,
then B, and finally C.  What's next?  P?  There was a language called Eh
developed at the University of Waterloo in the 70's by M. A. Malcolm, which
is somewhat C-like, but C already existed then.

>	                                             This standard was
>	  named ADA (the name of the mistress of Charles Babbage, who
>	  invented a punched card driven loom, considered to be the first
>	  computer; she was rumored to be the first person ever to write
>	  a program on punched cards - why her name is appropriate for a
>	  real-time language is a mystery).

Ada was not Babbage's mistress, but just a friend of his.  She did not
invent the card-driven loom; some fellow named Jacquard did.  What she
did do is write programs for Babbage's Difference Engine, and his never-
completed Analytical Engine.  (It is rumoured that she had a complete
implementation of the Star Trek game :-)  Her name is appropriate because
she was the first programmer.  Too bad they used it for such a horrid
language.

>	         Be aware that it is a very complex language

That's for sure.  Beats PL/I though.  The problem with Ada (as with PL/I) is
that it is so big, it is hard to ensure that one's compiler is reliable.  This
is ironic, since one of the aims of having a single programming language is
to reduce errors in coding, by having everyone think the same.  It is also
scary when you consider that they want to use it to control missile systems,
the star wars (note the lowercase, Star Wars was a movie) system, etc.

>	                   Don't expect to see ADA used very widely outside
>	  of the DOD environment. It will fail for the same reason that
>	  Pascal, Modula2, C, PL1, and others have failed - IBM is the 
>	  dominant force in the commercial market (~75 percent of all
>	  commercial installations) and COBOL dominates the IBM installed
>	  base (~90 percent of IBM applications are in COBOL). 

I was not aware that Pascal and C had failed.  I believe UNIX is written in
C, as is all the mail and news software that allows us to communicate these
conflicting views.  So is the C compiler, and the UNIX FORTRAN and Pascal
compilers.  So are most systems programs these days on most systems.  Pascal
is also alive and well.  I market software that is written in Turbo Pascal,
as do many others.  The TANGO printed circuit board layout program is written
in Turbo Pascal.  COBOL on the other hand is not a language that programs are
written in much any more.  Every person I know who has ever worked with COBOL
was doing maintenance.  No one I know has ever written anything in it.

>                                                           As long as
>	  computers remain basically von Neumann processors, no language is
>	  going to offer any advantages in the real world to a language
>	  like COBOL. 

Really?  COBOL is a big kludgy language.  Nothing written in COBOL runs
very fast.  Do you think IBM's COBOL compiler is written in COBOL?  No way.
Do you think a terminal emulator for a PC written in COBOL would be able
to keep up at over 110 baud?  Try writing an interrupt handler in COBOL 
some day.  Or a C compiler.  Or a text editor.  Or an operating system.
COBOL is well suited to writing file-handling applications and not very
well suited to writing anything else.

>                 No business is going to go through the 3-to-5-year
>	  effort of retraining and converting existing code just to
>	  satisfy the dogmatic prejudices of computer-science weenies.

No, no business is going to do this.  Why should they?  The code works
as it is.  But very few are going to write new code in COBOL.  If COBOL
were so great, don't you think your "weenies" would be using it?  COBOL
is a dinosaur which has just not YET become extinct.  It will.  If it
weren't for your "weenies" though, you wouldn't have COBOL, or computers.

>	                                             Therefore if
>	  you want a career in military/aerospace, go for ADA.

Unfortunately for the original poster, I must agree with this.  But, do
you really want a career in military programming?  Writing programs to
kill people just doesn't sound like a good idea?  Whatever happened to
the First Law of Robotics?

Stefan Vorkoetter
Dept. of Computer Science
University of Waterloo
Waterloo, Ontario
CANADA.

The opinions expressed herein are my own, and not those of my employer.  As
a matter of fact my employer used to teach COBOL so people could maintain
COBOL programs.  The COBOL course did not involve any WRITING of programs,
just modifying.  Now they don't teach COBOL any more.  But still, all the
opinions are mine.

mpl@sfsup.UUCP (M.P.Lindner) (08/21/87)

In article <253@etn-rad.UUCP>, jru@etn-rad.UUCP writes:
> 
> 	  The C language evolved at AT&T in the process of developing the
> 	  UNIX operating system. There were, believe it or not, an A language
> 	  and a B language that preceded it. Finally with the C language the
> 	  original developer of the UNIX operating system (which was done on
> 	  a PDP-7 microcomputer) felt that he had what he wanted.

C was designed and implemented (originally) by Dennis Ritchie.  It was
designed as a systems programming language - something suitable for writing
applications like operating systems, compilers, utilities, and the like.
There was no "A".  Its evolution is FORTRAN -> BCPL -> B -> C (which leads
to speculation as to whether C's successor will be called "D" from "ABCD..."
or "P" from BCPL).  "B" was written by Ken Thompson, and BCPL by Martin
Richards (and of course FORTRAN by Backus), so the ideas were *not*
those of one person.

> 	  moderately structured language, with a syntax that was similar
> 	  to the UNIX c shell (or vice versa). As UNIX gained wide acceptance

The "C" shell came much later - C predates the Bourne shell, which
predates the "C" shell.

> 	  in computer languages. C is now widely available outside the 
> 	  UNIX community and is a de facto standard with many companies.
> 	  It is often the case in military/aerospace procurements that in
> 	  order to generalize the request for computer hardware so as not
> 	  to sole source a single vendor the government will ask for UNIX
> 	  because it is the only operating system that can be had on a 
> 	  non-proprietary basis on a variety of computer hardware. UNIX of
> 	  course brings C right along with it.

how true...

> 	  Because UNIX does not handle real-time applications (such as
> 	  interrupt handling) very well, and because there was no non-

whoah there! UNIX may not be real-time, but C certainly is.  Therefore I
claim that it made sense to develop a real-time operating system standard
(which was not done) rather than a language/environment standard (which
was done).
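
(To be concrete about the shape of the thing in C - a minimal sketch, with
the Unix signal() call standing in for a real hardware interrupt vector,
since the actual vectoring is machine-dependent:)

    #include <stdio.h>
    #include <signal.h>

    static volatile int got_one = 0;    /* set by handler, read by main */

    /* Keep the handler tiny: set a flag and return.  A real device
       service routine has the same shape, minus the signal() dressing. */
    static void on_interrupt(int sig)
    {
        (void)sig;                      /* unused */
        got_one = 1;
    }

    int main(void)
    {
        signal(SIGINT, on_interrupt);
        while (!got_one)
            ;                           /* the real work would go here */
        printf("interrupt serviced\n");
        return 0;
    }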

>	  computer; she was rumored to be the first person ever to write
>	  a program on punched cards - why her name is appropriate for a
>	  real-time language is a mystery). If you are interested in the

I think her name is entirely appropriate for Ada, the "punch card" philosophy
language of the 80's.

> 	  satisfy the dogmatic prejudices of computer-science weenies.

oh yeah? :{)

Mike Lindner
...!ihnp4!attunix!mpl

spf@moss.ATT.COM (08/21/87)

In article <203@trwrc.UUCP> agnew@trwrc.UUCP (R.A. Agnew) writes:
>In article <12513@clyde.ATT.COM>, spf@moss.ATT.COM writes:
>> The DoD doesn't much like C from a lifecycle point of view, but has trouble
>> denying its availability and current performance advantage over Ada
>> (just like assembly with respect to FORTRAN 20+ years ago).
>What performance advantage?? The DEC Vax Ada compiler generates tighter code than the 
>Vax C compiler (no slouch) not to mention the fact that I generate code 5 to 10
>times faster in Ada due to problem level abstraction and re-use.

This may well be true by now.  And that's my point.  In the earliest
days of High Order Languages, assembly language was perceived to
offer a performance advantage, albeit with a nuisance factor.  My
claim is that C is in that position now.  Sure, some Ada compilers
(probably the DEC compiler gets the best performance reviews I've
seen) will out-perform some C compilers (don't know about DEC's).
On my PDP-11/23 at home, my DEC Pascal compiler beats the pants off
Whitesmith's C.  This has little to do with the languages, and much
to do with both the quality of the compilers and the architecture of
the target machines.

Anyway, I still think you'd be wise to learn both (my "pet" language
is Pascal 'cause I can throw code together fairly casually, and
the compiler will tell me if I did something dumb; but I'm learning
C and Ada anyway, and I don't even want to be a programmer when I grow
up!)

By the way, it's worth pointing out that this whole discussion more
or less ignores the advanced computing architectures found in DoD
environments.  Anybody out there programming parallel processors
in Ada?  Or C?  There's a lot of microcode floating around yet,
not to mention things like the dataflow language SPGN, &c.  This
whole language discussion has kind of neglected the fact that
computer architecture (especially for high-performance embedded
systems) is in a transition period too.

Steve Frysinger

crowl@rochester.UUCP (08/21/87)

In article <253@etn-rad.UUCP> jru@etn-rad.UUCP (0000-John Unekis) writes:
>The Ada language is far more than just a language.  Ada includes standards for
>editors, compilers, and run-time symbolic debuggers.

Ada is just a language.  The Ada Programming Support Environment is a standard
for all that other stuff.

>The C language evolved at AT&T in the process of developing the UNIX operating
>system.  There were, believe it or not, an A language and a B language that
>preceded it.

There was no A language.  C derives from B, which derives from BCPL.  Algol
enters the picture somewhere.

>Finally with the C language the original developer of the UNIX operating
>system (which was done on a PDP-7 microcomputer) felt that he had what he
>wanted.  It was a moderately structured language, with a syntax that was
>similar to the UNIX c shell (or vice versa).

The Unix C shell came much later.

>As UNIX gained wide acceptance the C language became more popular.  It has the
>advantage over FORTRAN of having structured variable types, but without the 
>overly elaborate type checking done by a language like PASCAL.

Pascal does not have overly elaborate type checking.  It lacks escapes from the
type checking, but the checking itself is at about the right level.  Note that
Ada adopts this same level of checking.
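
(The difference in a few lines: C's escape hatch.  The cast below is legal
C and prints the raw bit pattern; standard Pascal has no way to write it,
and Ada makes you ask for UNCHECKED_CONVERSION explicitly.  A contrived
sketch, assuming int and float are the same size, as on most machines:)

    #include <stdio.h>

    int main(void)
    {
        float f = 1.0f;

        /* Reinterpret the float's storage as an integer through a
           pointer cast - no checking, no ceremony. */
        unsigned int bits = *(unsigned int *)&f;

        printf("1.0f has bit pattern 0x%x\n", bits);
        return 0;
    }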

>It does share with Pascal the fact that each was developed by a single
>individual and thus represents that individual's prejudices in computer
>languages.

C was developed over time with the input of many individuals.  Kernighan and
Ritchie are acknowledged "prime" movers, but you cannot say that C was developed
by a single individual.

>This standard was named ADA (the name of the mistress of Charles Babbage,
>who invented a punched card driven loom, considered to be the first computer;
>she was rumored to be the first person ever to write a program on punched
>cards - why her name is appropriate for a real-time language is a mystery).

Ada Augusta Lovelace, daughter of Lord Byron, was an associate of Babbage.  I
do not remember reading anything that indicated she was his mistress.  Charles
Babbage DID NOT invent the punched card driven loom; it was invented by Jacquard
in the 1700's.  The loom was not the first computer.  The first computer was
(arguably) Babbage's Analytical Engine, which was never built.  The machine was
driven by cards, but since it was never built, I doubt Ada ever punched a card.
She did write programs (on paper) for the machine.  Ada Lovelace was the first
programmer, so it is reasonable to name a programming language after her.

>(Carnegie-Mellon University is rumored to have required it for all sophomores
>which resulted in flunking out half their sophomore class)

Given the accuracy of the previous statements, I tend to doubt this one too.

>[Ada] will fail for the same reason that Pascal, Modula2, C, PL1, and others
>have failed - IBM is the dominant force in the commercial market (~75 percent
>of all commercial installations) and COBOL dominates the IBM installed base
>(~90 percent of IBM applications are in COBOL).

Pascal and C are VERY SUCCESSFUL.  PL/1 and Modula-2 have NOT FAILED by any
stretch of the imagination.  LOTS of programs are written in these languages.
There are a LOT of IBM applications written in Fortran.

Excluding microcomputers, DEC has sold far more computers than IBM.  (They are
not as big, but that's not my point.)  I doubt IBM is anywhere near 75% of all
commercial installations unless by commercial you mean "payroll department"
instead of "corporation".

You are stating that a MINIMUM of 67% of all applications are written in
COBOL (90 percent of the claimed 75 percent).  Please back this up.

>As long as computers remain basically von Neumann processors, no language is
>going to offer any advantages in the real world to a language like COBOL.

Were you asleep when you wrote this?  The DoD may not be very bright, but they
certainly would not have spent millions of dollars developing Ada to get a
language that offered no advantages over Cobol.  Nor would substantial research
in programming have resulted in so many good alternatives.

>No business is going to go through the 3-to-5-year effort of retraining and
>converting existing code ...

If it takes a company 3 to 5 years to retrain, they should hire new personnel.
No one advocates converting existing code just to have it in a different
language.  However, many people recommend coding new applications in newer
languages so that the benefits of modern programming languages can be realized.

>... just to satisfy the dogmatic prejudices of computer-science weenies.

Well, the computer scientists have changed their opinions since 1960; you have
not.  Which indicates a tendency toward dogmatism.  Does "no language is going
to offer any advantages ... [over] Cobol" sound like dogmatism?  Yes.

Your "weenies" attitude is equivalent to "anyone who uses brakes in a car is a
weenie."  Computer scientists advocate better programming languages because
they make programming less expensive and result in products with fewer bugs.
Are these admirable goals?  I think so.

-- 
  Lawrence Crowl		716-275-8479	University of Rochester
		     crowl@cs.rochester.arpa	Computer Science Department
 ...!{allegra,decvax,seismo}!rochester!crowl	Rochester, New York,  14627

hitchens@godzilla.UUCP (08/22/87)

In article <1573@sol.ARPA> crowl@cs.rochester.edu (Lawrence Crowl) writes:

  [lots of other stuff deleted]

>Ada Augusta Lovelace, daughter of Lord Byron, was an associate of Babbage.  I
>do not remember reading anything that indicated she was his mistress.  Charles
>Babbage DID NOT invent the punched card driven loom; it was invented by Jacquard
>in the 1700's.  The loom was not the first computer.  The first computer was
>(arguably) Babbage's Analytical Engine, which was never built.  The machine was
>driven by cards, but since it was never built, I doubt Ada ever punched a card.
>She did write programs (on paper) for the machine.  Ada Lovelace was the first
>programmer, so it is reasonable to name a programming language after her.

>  Lawrence Crowl		716-275-8479	University of Rochester
>		     crowl@cs.rochester.arpa	Computer Science Department
> ...!{allegra,decvax,seismo}!rochester!crowl	Rochester, New York,  14627

   Ada Lovelace was quite a remarkable person.  She was a world-class 
mathematician and produced a significant body of work.  It's remarkable
that she did this at a time when women weren't exactly welcome in the sciences,
but it becomes amazing when you learn that she suffered from very poor health,
enduring severe and nearly continuous migraine headaches.  I believe she died
fairly young (mid-30s I think) because of her poor health.

   As for whether she was Babbage's mistress, I don't know, and what does
it really matter?  Being the daughter of Lord Byron, it's probably inevitable
that people of the day would think so, whether it was true or not.  In any
case, Ada Lovelace is worthy of recognition for her professional life
regardless of what she did in her personal life.

Ron Hitchens			hitchens@godzilla.cs.utexas.edu
 Another one of those		hitchens@sally.utexas.edu
 annoying consultants		hitchens@ut-sally.uucp

"I spilled Spot remover on my dog.  Now he's gone."  -Steven Wright

roy@phri.UUCP (Roy Smith) (08/22/87)

In article <1573@sol.ARPA> crowl@cs.rochester.edu (Lawrence Crowl) writes:
> The DoD may not be very bright, but they certainly would no have spent
> millions of dollars developing Ada to get a language that offered no
> advantages over Cobol.

	Without passing judgement (good or bad) on anything else Lawrence
said, I find the idea that "just because DoD spent millions of dollars to
do something, it must have gotten its money's worth" to be utterly,
completely, and absolutely hysterical.
-- 
Roy Smith, {allegra,cmcl2,philabs}!phri!roy
System Administrator, Public Health Research Institute
455 First Avenue, New York, NY 10016

gwyn@brl-smoke.UUCP (08/22/87)

In article <12659@clyde.ATT.COM> spf@moss.UUCP (Steve Frysinger) writes:
>Anybody out there programming parallel processors in Ada?  Or C?

Yes, we use C for that.  Ada(tm) makes it hard to escape from its
default models, which we find unsuitable for our needs.  In
particular, its "rendezvous" mechanism is NOT what we need.
(I certainly don't want or need the ADAPSE either!)

Ada will "succeed" regardless of its technical merit, given that the
DOD is cramming it down programmers' throats.  C is succeeding also,
for different reasons.  COBOL, FORTRAN, BASIC, and LISP have all
been successful in various ways.  What does this prove?  Probably
nothing, other than that there is room for multiple languages.

I will say, however, that the day that a bureaucrat insists that I use
a particular programming language, when I'm in a position to assess
its suitability and he is not, I will quit working for the bureaucrat.
Unfortunately most programmers in the military-industrial complex do
not feel that they can or should maintain their professional integrity.

Ada's supposed product life-cycle economic advantages are spurious --
those advantages are primarily due to structured software development
methodology, which can be applied independent of choice of programming
language.  We're currently in the middle of a large structured
development project that happens to rely on C as its main
implementation language.  Changing to Ada would gain very little in
this area, and would be unwise for many other reasons.

Can we please quit discussing Ada in the C newsgroup?  There is an
Ada newsgroup, you know.

gwyn@brl-smoke.ARPA (Doug Gwyn ) (08/22/87)

In article <1146@watmum.waterloo.edu> smvorkoetter@watmum.waterloo.edu (Stefan M. Vorkoetter) writes:
>Beats PL/I though.

The one thing everybody seems to agree on!

>The problem with Ada (as with PL/I) is that it is so big, it is hard to
>ensure that one's compiler is reliable.

Size has little to do with this.  As the people from Metaware would
probably tell you, what is necessary is a formal semantic specification
against which the implementation may be gauged.

>But, do you really want a career in military programming?  Writing
>programs to kill people just doesn't sound like a good idea?

Most programming for the military has nothing to do with killing.
There are also many people who have no problem with the notion of
killing under certain circumstances (for example, in self-defense).
The moral and ethical issues certainly need consideration, but each
individual should make his own determination.  In my case, for
example, I'm sure that my efforts for the military have not added
to the expected total number of people killed in the future, and may
have reduced the expected number.  (A truly viable defense would do
that.)

eric@sarin.UUCP (Eric Beser sys admin) (08/23/87)

In article <2231@cbmvax.UUCP>, daveh@cbmvax.UUCP (Dave Haynie) writes:

> The biggest problems with Ada are the massive size of its compiler and the
> verbosity of its language.  I guess if you really like Pascal or Modula-2,
> you might adjust, but if you're used to C, it might take some getting used
> to.  

      I am the Ada Technology Coordinator for a major corporation. Part of my
responsibility is to evaluate and select compilers, evaluate Ada runtime
environments, and to guide our Ada insertion effort. Mr. Haynie, having
just written this "Ada bashing," deserves a proper response. His posting
was filled with old information, no longer valid today. My suggestion is
that you keep an open mind on the Ada vs. C issue, because the situation is
not as Mr. Haynie suggests.


   Working in an industrial environment, and having used Ada, I do not see
how the "size" of a compiler matters in producing code that is efficient,
easily maintainable, and documentable. If you are used to C, you
may have trouble adjusting to the strong typing of the language. Ada is
a verbose language (if that is another way of saying rich), but it is that
richness that makes the language so useful.

> 
> As for where it's used, mainly DOD I'd guess.  It certainly isn't used much,
> if any, in commercial or industrial sectors.  C's the language for most of
> these, though your Fortran experience could come in handy in some heavy-
> duty scientific fields (most machines have Fortran compilers that code better
> than existing C compilers, but since there's more work being done on C, I
> wouldn't be surprised if this is changing).  Ada is certainly general purpose
> enough to be used elsewhere if you have access to a compiler for it, and it
> generally has a lot of things built into it that you have to add to the C
> language (like tasks, exceptions, etc.).

  I get irked when I hear this. This may have been true a year ago, but no
longer. Many customers (non-DOD or NATO) are requesting Ada because of the
software engineering that it dictates. Ada is not just a computer language;
it is a software engineering methodology. Many customers who originally
requested C are now asking for Ada. The compilers today are becoming efficient.
There are some imperfections, however, and all is not rosy. But there are
compilers for the 68020 that produce well-optimized code. There are compilers
for the 1750A military processor that produce adequate code, although
not as well optimized. Toolsets are now usable. In fact, you can buy a
validated Ada compiler for the IBM PC-XT (Meridian) that does not require
extra memory and is somewhat expensive ($700.00), but not the $3000.00 that
the leading PC vendor charges.


Let me give an example from actual experience.

  I am part of a large Ada project that is building a tool for use by our
engineers (industrial, commercial, as well as DOD contracts). After coding
this tool, we determined that some inefficiencies dictated a design change.
This design change propagated through the entire system. The coding
for this change took about a man-month, but the debug and reintegration
phase took only two days. The system ran as before, much more efficient
due to the design change. Had the code been written in C, this would not
have happened. Many of the errors that were introduced by the engineers
were due to spelling, wrong variable selection, or unhandled boundary
conditions. These errors would not have come out during C compilation.
They would have manifested themselves in various (or nefarious) bugs. These
introduced errors were found (80% approx.) during compilation of the code.
An additional 15% were found on first startup of the system by constraint
and unhandled exceptions. The remainder were found on integration.
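
(A concrete instance of the boundary-condition point - a made-up sketch, not
code from the project described above.  A C compiler of the day accepts this
without a peep, while the Ada equivalent raises CONSTRAINT_ERROR as soon as
it runs:)

    #include <stdio.h>

    #define NCHANNELS 8

    int main(void)
    {
        int reading[NCHANNELS];
        int i;

        /* Off-by-one: "<=" walks one slot past the end of the array,
           so the store to reading[8] silently clobbers a neighbor. */
        for (i = 0; i <= NCHANNELS; i++)
            reading[i] = 0;

        printf("cleared %d slots\n", i);
        return 0;
    }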

My suggestion is that you learn software engineering, object-oriented
design, and the use of software tools. Then learn Ada. You will find
that C will come in handy when you have to try to understand what
some engineer did in order to do an Ada redesign.

Eric Beser
EBESER@ADA20   - arpa
{mimsy}!aplcen!cp1!sarin!eric - usenet

henry@utzoo.UUCP (Henry Spencer) (08/23/87)

Sigh.  Was it really necessary to post 80+ lines of speculation and
misinformation?  There was no "A" language; "B" was named after BCPL.
Unix was developed by *two* people, not one:  Ken Thompson and Dennis
Ritchie.  The PDP-7 was not a microcomputer.  C evolved mostly after
Unix moved to the PDP-11 (which wasn't a microcomputer either -- the
word didn't even exist then).  The C Shell came later and was modeled
on C, not vice-versa.  C has spread far beyond Unix, with good reason.
Unix is *not* inherently poor at real-time applications (although this
is a common misconception, and it's true that custom-designed real-time
systems do better).  Many real-time systems are written in C nowadays,
since it works just fine without Unix underneath.  Ada was (officially,
at least) motivated almost entirely by the language diversity problem,
since it has little or nothing to do with the lack of a non-proprietary
real-time operating system.  The attempt to standardize things like the
Ada development environments came later, not first.  Ada was chosen as a
language name not because she was Babbage's mistress (she wasn't), but
because in certain ways she was the world's first programmer.  Babbage
invented (again, in certain ways) the computer, and did not invent the
punchcard-controlled loom.

Two things were correct:  Ada (which is not an acronym and should not be
written in capital letters) is important to government contractors because
DoD is jamming it down peoples' throats, and its future elsewhere is not
yet clear.
-- 
Apollo was the doorway to the stars. |  Henry Spencer @ U of Toronto Zoology
Next time, we should open it.        | {allegra,ihnp4,decvax,utai}!utzoo!henry

henry@utzoo.UUCP (Henry Spencer) (08/23/87)

> > The DoD doesn't much like C from a lifecycle point of view, but has trouble
> > denying its availability and current performance advantage...
> 
> What performance advantage?? The DEC Vax Ada compiler generates tighter
> code than the  Vax C compiler (no slouch)...

This says more about the relative investment in the compilers than about
the languages.  DEC has a history of being unenthusiastic about C; it shows.
-- 
Apollo was the doorway to the stars. |  Henry Spencer @ U of Toronto Zoology
Next time, we should open it.        | {allegra,ihnp4,decvax,utai}!utzoo!henry

wyatt@cfa.UUCP (08/23/87)

Re: the original posting of misinformation about the Countess Lovelace
(and C, Unix, and everything else, just about):

Jacquard invented the card-programmable loom, and is arguably the first
hacker, since one of the things he did was to program it to weave a
portrait of himself! (source: BIT by BIT, Stan Augarten)
-- 

Bill    UUCP:  {seismo|ihnp4}!harvard!cfa!wyatt
Wyatt   ARPA:  wyatt@cfa.harvard.edu
         (or)  wyatt%cfa@harvard.harvard.edu
      BITNET:  wyatt@cfa2
        SPAN:  17410::wyatt   (this will change in June)

mpl@sfsup.UUCP (M.P.Lindner) (08/24/87)

In article <6323@brl-smoke.ARPA>, gwyn@brl-smoke.ARPA (Doug Gwyn ) writes:
> In article <1146@watmum.waterloo.edu> smvorkoetter@watmum.waterloo.edu (Stefan M. Vorkoetter) writes:
> >The problem with Ada (as with PL/I) is that it is so big, it is hard to
> >ensure that one's compiler is reliable.
> 
> Size has little to do with this.  As the people from Metaware would
> probably tell you, what is necessary is a formal semantic specification
> against which the implementation may be gauged.

Oh, but size has *everything* to do with it.  Or, actually, complexity.
Try describing a test suite for a factorial program which handles input from
1 to 10, then try a test suite for Ada.  Betcha you can find every error in
the factorial program, but at least one is left in the Ada
compiler.
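
(The factorial side of that bet is easy to collect on - the entire input
domain fits in a dozen lines of test.  A sketch:)

    #include <stdio.h>

    /* Factorial for 1 <= n <= 10; the largest value, 3628800,
       fits comfortably in a long. */
    static long fact(int n)
    {
        long r = 1;
        while (n > 1)
            r *= n--;
        return r;
    }

    int main(void)
    {
        /* The whole input domain, checked against known values. */
        static const long expect[10] = { 1L, 2L, 6L, 24L, 120L, 720L,
                                         5040L, 40320L, 362880L, 3628800L };
        int n, ok = 1;

        for (n = 1; n <= 10; n++)
            if (fact(n) != expect[n - 1]) {
                printf("fact(%d) is wrong\n", n);
                ok = 0;
            }
        printf(ok ? "all 10 cases pass\n" : "test FAILED\n");
        return 0;
    }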

palmer@tybalt.caltech.edu (David Palmer) (08/25/87)

In article <36@sarin.UUCP> eric@sarin.UUCP (Eric Beser sys admin) writes:
>
>   Working in an industrial environment, and having used Ada, I do not see
>how the "size" of a compiler matters in producing code that is efficient,
>easily maintainable, and documentable....
>
>		... Toolsets are now usable. In fact, you can buy a 
>validated Ada compiler for the IBM PC-XT (Meridian) that does not require
>extra memory and is somewhat expensive ($700.00), but not the $3000.00 that
>the leading PC vendor charges.

Sounds really great.  When does LightspeedAda come out for the Macintosh?
I'd like to fit the whole system on a 500Mbyte RAMdisk and be able to
recompile my programs and start them running in 3 seconds from editor
to application.

No?  Then I guess I'll have to stick with Borland's TurboAda, not quite
as good, of course, but you can't have everything.  I consider any
compiler usable if it doesn't require an inordinate amount of time
to compile, where I define inordinate based on past experience
with other languages.  Of course, it also helps if I don't need a full
page display terminal to understand a short segment of code.
(Verbose is NOT a synonym for rich.)

		David Palmer
		palmer@tybalt.caltech.edu
		...rutgers!cit-vax!tybalt.caltech.edu!palmer
	The opinions expressed are those of an 8000 year old Atlantuan
	priestess named Mrla, and not necessarily those of her channel.

jru@etn-rad.UUCP (John Unekis) (08/25/87)

In article <650@cfa.cfa.harvard.EDU> wyatt@cfa.harvard.EDU (Bill Wyatt) writes:
>
>Re: the original posting of misinformation about the Countess Lovelace
>(and C, Unix, and everything else, just about):
>...

In response to the criticism that I have received for 
calling Ada Lovelace the mistress of Charles Babbage, I went
out and did a little research. As it turns out, Charles
Babbage had conceived an idea for a mechanical difference
engine and was able to convince the British government to 
fund its development. This engine was to use punched cards 
and was inspired by the loom of Joseph Jacquard, although 
Babbage's machine was essentially an adding machine. While
the difference engine was still unfinished Babbage came up 
with an idea for a new machine which he dubbed the 
analytical engine and began to divert his efforts to the new 
design. He ran out of funding and managed to alienate his 
friends in Parliament in his attempts to drum up more 
support. He lost the engineers who were working on the 
machine because he couldn't pay them, and in leaving they 
took all his machine tools in place of back pay. At this 
point he met Ada. She was actually Ada Augusta King, the 
Countess of Lovelace, daughter of Lord Byron. She was still
a teenager at the time, and was a serious student of 
mathematics. She wanted to be tutored in his methods, and 
since she was rich and he was broke, it seemed like a good 
match. They worked together for many years and she was 
credited with such ideas as the use of binary instead of 
decimal numbers for calculation. They were seen together 
socially, although her status as his lover is open to 
debate. Ada died young (36) of an unknown disease. She
attempted to leave her fortune to Babbage, but her family 
prevented it. After her death Babbage worked odd jobs to 
support himself, but he was unable to complete either of his 
calculating engines. He died in 1871, but his notes were rediscovered in 1937, 
and he has been credited with inventing the digital 
computer.

Consider the following-

Ada Lovelace was a woman who spent her time and money on a 
computer project that :

A) Was probably not possible with existing technology 
(Babbage had no end of trouble getting precisely machined 
gears).

B) Was completely redesigned while it was already slipping 
its development schedule.

C) Ran over budget, and had to apply to the government for
additional funding.

D) Was denied additional funds and had to lay off its 
engineers.

E) Never delivered a working computer system.


How could I have been so blind?  I take back everything I
said about her. Ada is the perfect name for a computer 
language that will be the standard for the U.S. military and
aerospace contracting industries.

-------------------------------------------------------------------
Disclaimer: My employer did not authorize my actions, I have
acted on my own initiative.

dca@kesmai.COM (David C. Albrecht) (08/25/87)

> > What performance advantage?? The DEC Vax Ada compiler generates tighter
> > code than the  Vax C compiler (no slouch)...

> This says more about the relative investment in the compilers than about
> the languages.  DEC has a history of being unenthusiastic about C; it shows.
> -- 

Well then, if DEC is unenthusiastic about C, the people responsible for unix
(B&B) must be somnambulistic, since the VAX C compiler beats the pants
off of pcc.

David Albrecht

gwyn@brl-smoke.ARPA (Doug Gwyn ) (08/26/87)

In article <36@sarin.UUCP> eric@sarin.UUCP (Eric Beser sys admin) writes:
>I get irked when I hear this. This may have been true a year ago, but no
>longer...
>Ada is not just a computer language;
>it is a software engineering methodology.

It does amount to that.  It also gives one little choice about
the matter; it's designed to enforce one particular methodology.
If one has different ideas (as many workers in software engineering
do), then it becomes a battle against Ada's built-in model.

>... The system ran as before, much more efficient due to the design
>change. Had the code been written in C, this would not have happened.

"I get irked when I hear this. This may have been true a year ago,
but no longer..."  Actually it need never have been true.  Read on.

>Many of the errors that were introduced by the engineers
>were due to spelling, wrong variable selection, or unhandled boundary
>conditions. These errors would not have come out during C compilation.
>They would have manifested themselves in various (or nefarious) bugs. These
>introduced errors were found (80% approx.) during compilation of the code.
>An additional 15% were found on first startup of the system by constraint
>and unhandled exceptions. The remainder were found on integration.

The language the code is written in is really a very small part of
structured software methodology; one can apply the techniques in
COBOL, PL/I, FORTRAN, Pascal, or C.  As a matter of fact, there are
C environments for sale that are just about as "safe" as Ada; most
typo and similar errors are caught by C compilation and "lint";
unhandled exceptions in C typically produce an immediate core image
for debugging, etc.  Very little difference, really.
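
(An example of the kind of slip meant here - a sketch, of course.  Both
blunders below are legal C and compile silently on most compilers of the
day; lint complains about both:)

    #include <stdio.h>

    int main(void)
    {
        int  count = 0;
        long total = 12345L;

        /* "=" where "==" was meant: legal, and always true here.
           lint flags an assignment used as a condition. */
        if (count = 1)
            printf("count is one\n");

        /* %d handed a long: fine by the compiler, but lint checks
           printf formats against their arguments. */
        printf("total = %d\n", total);
        return 0;
    }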

>My suggestion is that you learn software engineering, object-oriented
>design, and the use of software tools. Then learn Ada. You will find
>that C will come in handy when you have to try to understand what
>some engineer did in order to do an Ada redesign.

Ada is worth learning, simply because it will see widespread use.
Structured software development methodology is also worth learning
(Yourdon recommended).  The two are not synonymous.  Nor are C
programming and random hacking.  You probably don't see much of
the really well-implemented systems programmed in C, not because
they don't exist, but because they are likely to be proprietary.
The "freeware" you see on the net is not typically the product of
informed software engineering, but just someone's quick solution
to an isolated problem they ran into.  Don't judge the language's
capabilities based on that sort of evidence.

C's one really major problem from the structured viewpoint is
that it has only two effective levels of name space, module
internal and global.  This makes it necessary to strictly
allocate global names, which is an annoying but solvable problem.
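
(Concretely: "static" at file scope gives the module-internal level, and
every exported name lands in the single global space, so the usual
discipline is a hand-allocated per-module prefix.  A sketch of the
convention, with invented names:)

    /* queue.c - one module.  The "static" names are invisible to other
       modules; the two exported names share the program's one global
       name space, hence the queue_ prefix. */

    static int items[64];              /* module-internal */
    static int head = 0, tail = 0;     /* module-internal */

    void queue_put(int v)              /* global: prefix allocated by hand */
    {
        items[tail++ % 64] = v;
    }

    int queue_get(void)                /* global: ditto */
    {
        return items[head++ % 64];
    }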

jesup@steinmetz.steinmetz.UUCP (Randell Jesup) (08/26/87)

[ To give my comments perspective, I have been programming in ADA ]
[ professionally, in a government DARPA project.		  ]

In article <36@sarin.UUCP> eric@sarin.UUCP (Eric Beser sys admin) writes:
>In article <2231@cbmvax.UUCP>, daveh@cbmvax.UUCP (Dave Haynie) writes:
>
>> The biggest problems with Ada are the massive size of its compiler and the
>> verbosity of its language.  I guess if you really like Pascal or Modula-2,
>> you might adjust, but if you're used to C, it might take some getting used
>> to.  
>
>   Working in an industrial environment, and having used Ada, I do not see
>how the "size" of a compiler matters in producing code that is efficient,
>easily maintainable, and documentable. If you are used to C, you
>may have trouble adjusting to the strong typing of the language. Ada is
>a verbose language (if that is another way of saying rich), but it is that
>richness that makes the language so useful.

	The size of the compiler DOES matter, even in an industrial 
environment.  The Vax Ada compiler, supposedly the best out there (according
to a well-informed source on the original panel/whatever), is one of the
things that forced us to quadruple our machine's memory.  It was not the
only thing, but it was so bad we were doing batch compiles at night!
(Shades of the early 70's!)  Otherwise, we'd get 500 page faults/cpu sec.
	Concerning the efficiency of the generated code, our abstract data
types and the operations on them took well over 1/2 meg, not including the
program that would use them.  By comparison, the initial quick-hacked C
version done for the phase 1 final review (in 12 man-weeks!) was about
300-400K (with a fair amount of duplicate code, like parsers in a number of
modules).  This is obviously not a very scientific comparison, but it does
show something (at least that it caused much more disk use).

>> As for where it's used, mainly DOD I'd guess.  It certainly isn't used much,
>> if any, in commercial or industrial sectors.  C's the language for most of
>> these, though your Fortran experience could come in handy in some heavy
>> duty scientific fields (most machines have Fortran compilers that code better
>> than existing C compilers, but since there's more work being done on C, I
>> wouldn't be surprised if this is changing).  Ada is certainly general purpose
>> enough to be used elsewhere if you have access to a compiler for it, and it
>> generally has a lot of things built into it that you have to add to the C
>> language (like tasks, exceptions, etc.).
>
>  I get irked when I hear this. This may have been true a year ago, but no
>longer. Many customers (non-DOD or NATO) are requesting Ada because of the
>software engineering that it dictates. Ada is not just a computer language;
>it is a software engineering methodology. Many customers who originally
>requested C are now asking for Ada. The compilers today are becoming efficient.
>There are some imperfections, however, and all is not rosy. But there are
>compilers for the 68020 that produce well-optimized code. There are compilers
>for the 1750A military processor that produce adequate code, although
>not as well optimized. Toolsets are now usable. In fact, you can buy a
>validated Ada compiler for the IBM PC-XT (Meridian) that does not require
>extra memory and is somewhat expensive ($700.00), but not the $3000.00 that
>the leading PC vendor charges.

	But how compact are the programs produced by it? And how fast?

>Let me give an example from actual experience.
>
>  I am part of a large Ada project that is building a tool for use by our
>engineers (industrial, commercial, as well as DOD contracts). After coding
>this tool, we determined that some inefficiencies dictated a design change.
>This design change propagated through the entire system. The coding
>for this change took about a man-month, but the debug and reintegration
>phase took only two days. The system ran as before, much more efficient
>due to the design change. Had the code been written in C, this would not
>have happened. Many of the errors that were introduced by the engineers
>were due to spelling, wrong variable selection, or unhandled boundary
>conditions. These errors would not have come out during C compilation.

	Partially true.  Any person who does not use lint on all code in
a large (or even small) project deserves what they get.  Lint catches many
things, though not as many as Ada.  If the system has been designed using
good methodology, it should be fairly easy to ripple changes through
whether it's in Ada or C.  Also, many errors I have seen in Ada programs
are also subtle, and often due to the immensity of the language, and the
trouble figuring out how it should behave when using various constructs
together.  (See many of the past messages here for examples).

>They would have manifested themselves in various (or nefarious) bugs. These
>interjected errors were found (80% approx) during compilation of the code.
>An additional 15% were found on first startup of the system by constraint
>and unhandled exceptions. The remainder were found on integration.

	The biggest source of errors I have seen is Ada compilers that
don't warn you when you access an uninitialized variable, therefore
causing random failures (the worst kind!).  Even my micro-computer C compiler
warns me of these (without even running lint.)  In fact, I saw people try to
figure out whether integers (and variables in general) were initialized
for several hours (I HATE the LRM!)
	If the government is concerned with reliability of software, they'd
better get some compilers that can find this, or it's all been for naught.
(I know it's easy; is there something in the language spec that says not
to warn the user?!)
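
	To make this concrete, here's a contrived fragment (an illustration
only, not code from the project discussed above).  The compiler alone
accepts it silently; lint, or a compiler with use-before-set warnings,
flags both problems:

        #include <stdio.h>

        /* hypothetical demo of the error classes described above */
        double ratio(num, den)
        int num, den;
        {
                return (double)num / den;
        }

        int main()
        {
                int numer = 3, denom = 4;  /* lint: denom set but never used */
                int demon;                 /* never initialized */

                /* "demon" typed where "denom" was meant: the compiler is
                 * happy and the result is garbage; lint says "demon may
                 * be used before set" */
                printf("%f\n", ratio(numer, demon));
                return 0;
        }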

>My suggestion is that you learn software engineering, object oriented
>design, and the use of software tools. Then learn Ada. You will find
>that C will come in handy when you have to figure out what
>some engineer did in order to do an Ada redesign.

	Ditto about learning good design.  C is useful in many other situations
where Ada is not, and vice versa.

	I believe that for each problem (and set of constraints, such as time/
resources/etc), there is a most appropriate language, from FORTRAN to Ada to
C to Cobol (Yuch!) to assembler to Prolog to Lisp (etc).  The worst code 
you'll see is when someone tries to fit a XXXX language slipper (solution)
on a YYYY language foot (problem).  It may very well be that for the major
class of problems the DOD has, Ada is an appropriate solution.  But
certainly not for all problems the DOD has, and absolutely not for all 
problems anywhere.

>Eric Beser
>EBESER@ADA20   - arpa
>{mimsy}!aplcen!cp1!sarin!eric - usenet

	Randell Jesup
	jesup@steinmetz.UUCP
	jesup@ge-crd.arpa

spf@clyde.UUCP (08/26/87)

In article <3755@cit-vax.Caltech.Edu> palmer@tybalt.caltech.edu.UUCP (David Palmer) writes:
>
>I consider any
>compiler usable if it doesn't require an inordinate amount of time
>to compile, where I define inordinate based on past experience
>with other languages.  Of course, it also helps if I don't need a full
>page display terminal to understand a short segment of code.
>(Verbose is NOT a synonym for rich.)

You've just described Meridian's Ada.  I have it running on my AT&T
PC 6300 (640K RAM), and it compiles and links in roughly the same
time as Microsoft's C and Pascal systems.  Maybe Turbo* is faster,
but this is fast enough!  And if you know how to write modular code,
a 25-line CRT is quite sufficient to apprehend a meaningful code
segment.  If you don't, then remind me not to hire you to write
code for any product which requires maintenance and upgrades.

mpl@sfsup.UUCP (M.P.Lindner) (08/26/87)

[ the following are excerpts - see the original articles for context ]

In article <36@sarin.UUCP>, eric@sarin.UUCP writes:
> In article <2231@cbmvax.UUCP>, daveh@cbmvax.UUCP (Dave Haynie) writes:
> 
> > The biggest problems with Ada is the massive size of its compiler and the
> > verbosity of its language.  I guess if you really like Pascal or Modula2,
> > you might adjust, but if you're used to C, it might take some getting used
> > to.  
> 
> My suggestion is
> that you take an open mind to the Ada vs C issue, because it is not
> as Mr. Haynie suggests.

OK, although I must confess a love of "C".  Most of this love comes from
being able to write a program in minutes, then refine it.  C is concise,
clean, and unrestrictive.  I'm a Pascal convert, so I've used languages
which enforced typing.

>    I do not see
> how the "size" of a compiler matters in producing code that is efficient,
> that is easily maintainable, and documentable.

It doesn't.  The comment was that the size of the compiler makes it harder
to verify the *compiler*.

> If you are used to C, you
> may have trouble adjusting to the strong typing of the language. Ada is
> a verbose language (if rich can be stated in any other words). But it is that
> richness that makes the language so useful.

True, but I mistrust excess complexity in languages that try to be "all things
to all developers".

> Many customers (non DOD or NATO) are requesting Ada because of the
> software engineering that it dictates. Ada is not just a computer language,
> it is a software engineering methodology.
> The compilers today are becoming efficient.

OK, I'll buy that.  The methodology of the Ada environment is sound, if a
little restrictive.  I do take exception, however, to the statements that
Ada is an "object oriented" language.  Yes, it is, but its object orientation
was implemented no better than C++'s.  Example:  Try to make a generic merge
sort which can sort floats and integers.  Unless this has been fixed since I
learned Ada, it can't be done!  The reason I was given is that a generic type
can only assume assignment and equality.  If we add constraints, we must
specify either integral types or floating types.  Little inconsistencies
make life hell for programmers, as well as conflicting with the stated intent
of the methodology.

> After coding
> this tool, we determined that some inefficiencies dictated a design change.
> This design change proliferated itself through the entire system. The coding
> for this change took about a man-month, but the debug and reintegration
> phase took only two days. The system ran as before, much more efficient
> due to the design change. Had the code been written in C, this would not
> have happened.

Oh, and did you implement it in C, or is this the same kind of closed minded
opinionism you attacked earlier?  I claim the ease of reintegration is due
to methodology, and the same methodology is applicable to many languages,
not just Ada.  Not only that, but had the project been done in C, a "quickie"
implementation of the algorithms could have been implemented, which might
have shown the problem before it was proliferated throughout the entire system
(not to mention the fact that a design change should not "proliferate" through
an entire system if the design methodology is sound).

> Many of the errors that were interjected by the engineers
> were due to spelling, wrong variable selection, or unhandled boundary
> conditions. These errors would not have come out during C compilation.
> They would have manifested themselves in various (or nefarious) bugs. These
> interjected errors were found (80% approx) during compilation of the code.

Again, I must take exception, since a good methodology would include linting
code, which would indeed have uncovered these errors (and do it without
imposing extra restraints on the developers).

> My suggestion is that you learn software engineering, object oriented
> design, and the use of software tools. Then learn Ada.

I have studied software engineering, object oriented design, and the use
of software tools.  Then I learned Ada.  I still have nightmares.

Mike Lindner
attunix!mpl

kwh@sei.cmu.edu (Kurt Hoyt) (08/26/87)

In article <1573@sol.ARPA> crowl@cs.rochester.edu (Lawrence Crowl) writes:
>>(Carnegie-Mellon University is rumored to have required it for all
>>sophomores which resulted in flunking out half their sophomore class)
>
>Given the accuracy of the previous statements, I tend to doubt this one too.
>

I did some checking and the above incident NEVER HAPPENED. I don't know the
origin of the legend, but it is ENTIRELY FALSE (at least the part about it
happening at CMU). Rumors like this one can be damaging to the reputations
of both Ada and Carnegie-Mellon. Go easy on the use of rumors and
unsubstantiated claims, please.

--
"Just don't reverse the 	Kurt Hoyt
polarity of the neutron flow."	ARPA:   kwh@sei.cmu.edu
   -- The Doctor		BITNET: kwh%sei.cmu.edu@vma.cc.cmu.edu
				CSNET:  kwh%sei.cmu.edu@relay.cs.net

daveh@cbmvax.UUCP (Dave Haynie) (08/26/87)

in article <36@sarin.UUCP>, eric@sarin.UUCP (Eric Beser sys admin) says:
> Summary: not true ... not true
> Xref: cbmvax comp.lang.ada:573 comp.lang.c:3880 sci.space:2640 sci.space.shuttle:278
> 
>    Working in an industrial environment, and having used Ada, I do not see
> how the "size" of a compiler matters in producing code that is efficient,
> that is easily maintainable, and documentable. 

That's not what I said, and hopefully not what I implied.  Once you learn
the language, the size of the compiler has nothing at all to do with producing
code that's easily maintainable and documentable, for certain.   It may have
something to do with the efficiency of the code, but that's more an 
implementation detail.  They are certainly building Ada compilers these days
that generate code as efficient as C's.

However, the size of a compiler does have everything to do with the correctness
of the object code that the compiler will produce.  Especially once you start
adding in global optimizers and other items designed to produce much more
efficient code.  A larger compiler just has to be more complex than a smaller
compiler, and thus it is more prone to containing errors that were not
caught during validation, because a complete validation test is very hard
to design.  Not impossible, but difficult.  And you may not learn of the bug,
especially in the case of languages used in Space and other hostile environs,
until you hit a special case 100 million miles from Earth.

> If you are used to C, you
> may have trouble adjusting to the strong typing of the language. Ada is
> a verbose language (if rich can be stated in any other words). But it is that
> richness that makes the language so useful.

What I mean by verbose is that it's wordy.  Perhaps not as bad as M2 or Pascal,
but it's wordy.  Statement vs. Operation oriented.  So while my C function 
appears as a whole in my Emacs window, my Ada function fills three screens.
That's completely independent of richness.  And as richness goes, C does
pretty well in the kind of operator richness that Ada doesn't seem to consider
necessary.  All I'm saying is that that's what a C programmer will object
to (I know this; I programmed in Pascal for 4 years).  I use a C with ANSI
prototyping; it gives me the same level of type checking as I'd get in Ada or
Pascal, and the same level of data abstraction available in Pascal.  But I'm
not forced into it.
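
For instance (a made-up fragment, not anything from a real product; the
prototype makes the compiler check and convert arguments, where old-style
C passes them raw):

        #include <stdio.h>

        double halve(double x);         /* ANSI prototype */

        double halve(x)                 /* old-style definition */
        double x;
        {
                return x / 2.0;
        }

        int main(void)
        {
                /* with the prototype, the int 5 is quietly converted to
                 * 5.0; without it, an int goes where a double is read and
                 * the result is nonsense.  halve("five") would be a
                 * compile-time error here instead of a run-time mystery. */
                printf("%f\n", halve(5));
                return 0;
        }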

>> As for where it's used, mainly DOD I'd guess.  It certainly isn't used much,
>> if any, in commercial or industrial sectors.  C's the language for most of
>> these, though your Fortran experience could come in handy in some heavy
>> duty scientific fields (most machines have Fortran compilers that code better
>> than existing C compilers, but since there's more work being done on C, I
>> wouldn't be surprised if this is changing).  

Forgot the business sector's COBOL, and of course the prevalence of BASIC in
the homes.  I wonder who programs the most in what (ignoring whether they
do it for fun or profit).  I bet BASIC's the closest to COBOL.  Tells you that
acceptance of a language rarely has anything to do with the language's
overall quality; it's usually chosen for one or two reasons that don't always
apply to the other things it's used for.

>   I get irked when I hear this. This may have been true a year ago, but no
> longer. Many customers (non DOD or NATO) are requesting Ada because of the
> software engineering that it dictates. 

And many organizations forced to use Ada are getting around this by using
C to Ada translators.  Which I'm sure we both agree is a dumb idea, but
they're still being used.  And my point here is that I can engineer my software
properly in C, even though it makes no attempt to force me to do so.  And I
can engineer my software poorly in Ada, even if it tries to make me do it
correctly.  

>   I am part of a large Ada project that is building a tool for use by our
> engineers (industrial, commercial, as well as DOD contracts). After coding
> this tool, we determined that some inefficiencies dictated a design change.
> This design change proliferated itself through the entire system. The coding
> for this change took about a man-month, but the debug and reintegration
> phase took only two days. The system ran as before, much more efficient
> due to the design change. Had the code been written in C, this would not
> have happened. Many of the errors that were interjected by the engineers
> were due to spelling, wrong variable selection, or unhandled boundary
> conditions. These errors would not have come out during C compilation.

Now who's talking about things that are no longer true?  The Lattice compiler
I have on my Amiga, while by no means an up-to-date release, does very good
type checking and would have caught most of these problems.  Unfortunately
PCC doesn't do this yet.

> Eric Beser
> EBESER@ADA20   - arpa
> {mimsy}!aplcen!cp1!sarin!eric - usenet
-- 
Dave Haynie     Commodore-Amiga    Usenet: {ihnp4|caip|rutgers}!cbmvax!daveh
"The A2000 Guy"                    PLINK : D-DAVE H             BIX   : hazy
     "God, I wish I was sailing again"	-Jimmy Buffett, Dave Haynie

chuck@amdahl.amdahl.com (Charles Simmons) (08/26/87)

In article <6338@brl-smoke.ARPA> gwyn@brl.arpa (Doug Gwyn (VLD/VMB) <gwyn>) writes:
>The language the code is written in is really a very small part of
>structured software methodology; one can apply the techniques in
>COBOL, PL/I, FORTRAN, Pascal, or C.  As a matter of fact, there are
>C environments for sale that are just about as "safe" as Ada; most
>typo and similar errors are caught by C compilation and "lint";
>unhandled exceptions in C typically produce an immediate core image
>for debugging, etc.  Very little difference, really.

Um...  How do these C environments detect subscript range errors
in a piece of code like:

        char *strcpy(s, t)              /* returns its first argument */
        char *s, *t;
        { char *save = s; while (*s++ = *t++); return save; }

Although an unhandled exception in C "typically" produces a dump,
it does not "always" produce a dump.  Also, the dump may occur long
after the invalid code was executed, making it more difficult to figure
out what went wrong.
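
For example (a deliberately broken sketch of my own, not from any real
program):

        #include <stdio.h>
        #include <string.h>

        int main()
        {
                int important = 42;
                char buf[4];

                strcpy(buf, "much too long"); /* out-of-range store; no
                                               * exception is raised here */

                /* ...arbitrarily much later... */
                printf("%d\n", important);    /* may print garbage, or the
                                               * program may dump core in
                                               * code nowhere near strcpy */
                return 0;
        }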

Yes, lint does work quite well.  Unfortunately, I come from a PL/I
background where the compiler generated type-checking information
and the linker made sure everything matched.  With lint, I have
to perform an extra pass through my code.  Admittedly, it's not
a whole lot of work, but if type-checking were integrated with the
compiler, things would move along a little faster...

>C's one really major problem from the structured viewpoint is
>that it has only two effective levels of name space, module
>internal and global.  This makes it necessary to strictly
>allocate global names, which is an annoying but solvable problem.

Tee hee...  And here I was beginning to think the name space rules
were one of the advantages of C.  Actually, aren't there three levels
of name space?  "Static global", a variable that can only be seen
from within a file; "procedure internal", a variable that can only
be seen from within the surrounding procedure (or "begin end" block);
and "external global", a variable that exists everywhere.

-- Chuck

ain@s.cc.purdue.edu (Patrick White) (08/27/87)

   Excuse me for saying so, but this discussion seems to have drifted
from any space-related discussion.  Would you please consider removing
the sci.space and sci.space.shuttle newsgroups from the Newsgroups line?
Thank you.

-- Pat White
UUCP: k.cc.purdue.edu!ain
BITNET:	PATWHITE@PURCCVM
U.S.  Mail:  320 Brown St. apt. 406,    West Lafayette, IN

henry@utzoo.UUCP (Henry Spencer) (08/27/87)

> Well then if DEC is unenthusiastic about C the people responsible for unix
> (B&B) must be somnambulistic since the VAX C compiler beats the pants
> off of pcc.

If you read the documentation (radical notion...) you will find that pcc
was built for portability, not for dazzling code generation.  In that area,
pcc beats the pants off the VAX C compiler.  Also, are you comparing VAX C
to current versions of pcc, or to the decade-old quick-and-dirty first-cut
one from 32V that is still (last I looked) used in 4BSD?
-- 
"There's a lot more to do in space   |  Henry Spencer @ U of Toronto Zoology
than sending people to Mars." --Bova | {allegra,ihnp4,decvax,utai}!utzoo!henry

jb@rti.UUCP (Jeff Bartlett) (08/27/87)

In article <1937@sfsup.UUCP>, mpl@sfsup.UUCP (M.P.Lindner) writes:
> [ the following are excerpts - see the original articles for context ]
> 
> ...... I do take exception, however, to the statements that
> Ada is an "object oriented" language.  Yes, it is, but its object orientation
> was implemented no better than C++'s.  Example:  Try to make a generic merge
> sort which can sort floats and integers.  Unless this has been fixed since I
> learned Ada, it can't be done!  The reason I was given is that a generic type
> can only assume assignment and equality.  If we add constraints, we must
> specify either integral types or floating types.  Little inconsistencies
> make life hell for programmers, as well as conflicting with the stated intent
> of the methodology.
> 
> Mike Lindner
> attunix!mpl

What about ......

   generic
      type ITEM is private;
      with function GREATER(A,B : ITEM) return BOOLEAN is <>;
   package SORTING is

      type VECTOR is array(NATURAL range <>) of ITEM;

      procedure MERGE_SORT( A, B : VECTOR; RESULTS : out VECTOR );

   end SORTING;

This generic package can be instantiated for fixed, floating and record types.

Jeff Bartlett
Center for Digital Systems Research
Research Triangle Institute
jb@rti.rti.org

jym@mit-prep.UUCP (08/27/87)

> . . . but it becomes amazing when you learn that [Ada Lovelace] suffered
>from very poor health, enduring severe and nearly continuous [migraine]
> headaches.  I believe she died fairly young (mid-30s I think) because
> of her poor health.

In that case, that idiotic migraine-producing language invented by the
 COBOL-headed warheads at the DoD is well-named.
  <_Jym_>

P.S.:  C rules!  Anarchy now!  Venceremos!  Etc.!  Have a nice day!
-- 
jym@prep.ai.mit.edu			-- A STRANGE GAME.
--
.signature file under construction	-- THE ONLY WINNING MOVE IS
--
read at your own risk			-- NOT TO PLAY.
--

peter@sugar.UUCP (08/28/87)

The basic problem with Ada is that it's too big. The basic problem with 'C'
is that it's too loose for a horde of average programmers to use safely.

What's wrong with Modula-2?  As near as I can tell it has all the advantages
of Ada, and few of the disadvantages.  That's my main objection to Ada:
it's in use despite the existence of a superior alternative.

I, of course, will continue to use 'C'. I flatter myself that I am a better
than average programmer. And I'm not a horde.
-- 
-- Peter da Silva `-_-' ...!seismo!soma!uhnix1!sugar!peter
--                  U   <--- not a copyrighted cartoon :->

peter@sugar.UUCP (Peter da Silva) (08/28/87)

In article <138@kesmai.COM>, dca@kesmai.COM (David C. Albrecht) writes:
> Well then if DEC is unenthusiastic about C the people responsible for unix
> (B&B) must be somnambulistic since the VAX C compiler beats the pants
> off of pcc.

If the VAX 'C' compiler (presumably the code gen) beats the pants off PCC,
the runtime must be a DOG. VAX 'C' is the worst 'C' development environment
it has ever been my misfortune to work under, both because of the poor
support for VMS in VAX C, and because of the incredibly poor performance
of the resulting programs. I'll take BDS 'C' for the Z80 any day over
DEC's "product".

DEC has indeed been unenthusiastic about 'C', and anything else to do
with UNIX. The old NIH syndrome.
-- 
-- Peter da Silva `-_-' ...!seismo!soma!uhnix1!sugar!peter
--                  U   <--- not a copyrighted cartoon :->

ram%shukra@Sun.COM (Renu Raman, Sun Microsystems) (08/28/87)

In article <8495@utzoo.UUCP>, henry@utzoo.UUCP (Henry Spencer) writes:
> > Well then if DEC is unenthusiastic about C the people responsible for unix
> > (B&B) must be somnambulistic since the VAX C compiler beats the pants
> > off of pcc.
> 
> If you read the documentation (radical notion...) you will find that pcc
> was built for portability, not for dazzling code generation.  In that area,
> pcc beats the pants off the VAX C compiler.  Also, are you comparing VAX C
> to current versions of pcc, or to the decade-old quick-and-dirty first-cut
> one from 32V that is still (last I looked) used in 4BSD?
> -- 
> "There's a lot more to do in space   |  Henry Spencer @ U of Toronto Zoology

	A few days back, I went to a talk by Richard Stallman (of GNU fame).
	He said that gcc (the GNU C Compiler) generates better code than
	Greenhills C & Tartan C - now that is something amazing.
	(I doubt if VAX C can be better than Tartan, which does some
	 fancy optimizations, and VAX C is non-portable.)
	A "truly" retargetable compiler generating better code than
	Tartan/Greenhills definitely speaks for itself.  Has anybody done
	any comparison with the VAX C compiler?

	Sorry Ada guys - this is not relevant here - but you might excuse me
	for once :-)

---------------------
   Renu Raman				ARPA:ram@sun.com
   Sun Microsystems			UUCP:{ucbvax,seismo,hplabs}!sun!ram
   M/S 5-40, 2500 Garcia Avenue,
   Mt. View,  CA 94043

henry@utzoo.UUCP (Henry Spencer) (08/29/87)

> Um...  How do these [safe] C environments detect subscript range errors
> in a piece of code like:
> 
>         char *strcpy(s, t)              /* returns its first argument */
>         char *s, *t;
>         { char *save = s; while (*s++ = *t++); return save; }

Easy: a pointer becomes a non-trivial data structure that carries bounds
with it; those bounds are checked when it is used.  Remember that pointer
arithmetic is technically legal only within a single array.  Getting the
little details right must be a bit tricky in spots, but it does work.
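
A sketch of the idea (mine, and only a sketch; real checking
implementations differ in detail):

        #include <stdio.h>
        #include <stdlib.h>

        struct cptr {                   /* a "pointer" that knows its array */
                char *p;                /* current position                 */
                char *base;             /* first valid byte                 */
                char *limit;            /* one past the last valid byte     */
        };

        /* every dereference goes through a bounds check */
        char fetch(cp)
        struct cptr cp;
        {
                if (cp.p < cp.base || cp.p >= cp.limit) {
                        fprintf(stderr, "pointer out of range\n");
                        abort();
                }
                return *cp.p;
        }

        int main()
        {
                char a[3];
                struct cptr cp;

                cp.base = cp.p = a;
                cp.limit = a + 3;
                a[0] = 'x';
                cp.p += 5;              /* legal-looking arithmetic... */
                return fetch(cp);       /* ...caught at the dereference */
        }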

> Although an unhandled exception in C "typically" produces a dump,
> it does not "always" produce a dump.  Also, the dump may occur long
> after the invalid code was executed, making it more difficult to figure
> out what went wrong.

This property is shared by *all* programming languages that can generate
run-time exceptions; usually the actual exception shows up at some remove
from the bug that actually caused it.  Unless Ada is much more radical
than I remember (it's been a long time since I read the definition),
it is just as subject to this problem as C.
-- 
"There's a lot more to do in space   |  Henry Spencer @ U of Toronto Zoology
than sending people to Mars." --Bova | {allegra,ihnp4,decvax,utai}!utzoo!henry

gwyn@brl-smoke.ARPA (Doug Gwyn ) (08/30/87)

In article <8495@utzoo.UUCP> henry@utzoo.UUCP (Henry Spencer) writes:
>...to current versions of pcc, or to the decade-old quick-and-dirty first-cut
>one from 32V that is still (last I looked) used in 4BSD?

Actually, a reliable source told me that the basis for the PCC shipped with
4BSD was a bootleg copy from inside Bell Labs, not the 32V version.  Last I
looked at an old 4BSD PCC, it looked like it was circa USG 3.0; certainly
some of its bugs dated back to that time.  (Donn Seeley and others have
since put a lot of work into improving the 4BSD PCC.)  Supposedly Berkeley
having bootlegged software had something to do with AT&T being permitted to
distribute code such as "vi" that originated at Berkeley.

The above is somewhat more than a rumor but somewhat less than second-hand.

dhesi@bsu-cs.UUCP (Rahul Dhesi) (08/30/87)

In article <584@sugar.UUCP> peter@sugar.UUCP (Peter da Silva) writes:
>[VAX/VMS C] is the worst 'C' development environment
>it has ever been my misfortune to work under, both because of the poor
>support for VMS in VAX C, and because of the incredibly poor performance
>of the resulting programs.

I think the $39.95 MIX C compiler for MS-DOS is somewhat worse.

Even a very careful reading of the VMS C documentation and release
notes reveals absolutely no bugs or deficiencies in the VMS C
environment.  If anything, as the manual makes clear, VMS C adds
functionality to the original UNIX implementation.

Compare this with the large number of bugs and deficiencies that are
documented in the manual for almost every UNIX program.

Also note that the VMS C runtime environment automatically converts all
command-line arguments to lowercase, thus greatly simplifying argument
parsing.  And no lint-like program is provided, saving you the
temptation of using one and having to face the rude diagnostics it
would probably give you.

But perhaps the most outstanding advantage of VMS C environment is that
the cursor control routines require a terminal manufactured by DEC or
something equivalent.  This saves no end of trouble--no more time
wasted having to create termcap entries for strange terminals of
questionable quality.
-- 
Rahul Dhesi         UUCP:  {ihnp4,seismo}!{iuvax,pur-ee}!bsu-cs!dhesi

sns@genghis.UUCP (08/31/87)

In article <1069@bsu-cs.UUCP>, dhesi@bsu-cs.UUCP (Rahul Dhesi) writes:
> Even a very careful reading of the VMS C documentation and release
> notes reveals absolutely no bugs or deficiencies in the VMS C
> environment.  If anything, as the manual makes clear, VMS C adds
> functionality to the original UNIX implementation.

Bugs are revealed in using the compiler, not in reading the documentation.

> Compare this with the large number of bugs and deficiencies that are
> documented in the manual for almost every UNIX program.

Yes, please do.  They are documented; you can work around them, or (if you
have source) you can fix them.

> Also note that VMS C runtime environment automatically converts all
> command-line arguments to lowercase, thus greatly simplifying argument
> parsing.  And no lint-like program is provided, saving you the

Converting to all lowercase is an advantage?  Try implementing a full-function
ls (an esoteric command, I give you :) with all lowercase options and without
arbitrarily assigning letters to options.  I like single-letter flags, thank
you.

> temptation of using one and having to face the rude diagnostics it
> would probably give you.

I agree.  It is definitely a nuisance to have to worry about all the bugs
that lint could uncover.

> 
> But perhaps the most outstanding advantage of VMS C environment is that
> the cursor control routines require a terminal manufactured by DEC or
> something equivalent.  This saves no end of trouble--no more time
> wasted having to create termcap entries for strange terminals of
> questionable quality.

If you want to limit yourself to DEC terminals, then go ahead - hard code
the DEC control sequences into your programs.  Don't limit us to only those
terminals. /etc/termcap (or terminfo) is a feature, not a disadvantage.
-- 

Sam Southard, Jr.
{sns@genghis.caltech.edu|sns@genghis.uucp|{backbone}!cit-vax!genghis!sns}

mpl@sfsup.UUCP (08/31/87)

In article <1678@rti.UUCP>, jb@rti.UUCP writes:
] > [I wrote this - M. P. Lindner]
] > [ the following are excerpts - see the original articles for context ]
] > 
] > ...... I do take exception, however, to the statements that
] > Ada is an "object oriented" language.  Yes, it is, but its object orientation
] > was implemented no better than C++'s.  Example:  Try to make a generic merge
] > sort which can sort floats and integers.
] 
] What about ......
] 
]    generic
]       type ITEM is private;
]       with function GREATER(A,B : ITEM) return BOOLEAN is <>;
]    package SORTING is
] 
]       type VECTOR is array(NATURAL range <>) of ITEM;
] 
]       procedure MERGE_SORT( A, B : VECTOR; RESULTS : out VECTOR );
] 
]    end SORTING;
] 
] This generic package can be instantiated for fixed, floating and record types.
] Jeff Bartlett

But if I have to make a GREATER function I could have done it in C (see the
man page for qsort(3))!  Therefore I maintain my claim that Ada is *not* a
proper implementation of an object oriented language (like Smalltalk).
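
For comparison, the C side of the same obligation (a sketch in the draft
ANSI style; the caller supplies the ordering function to qsort(3), just as
GREATER is supplied to the generic above):

        #include <stdio.h>
        #include <stdlib.h>

        /* the C analogue of the Ada GREATER parameter */
        static int int_cmp(const void *a, const void *b)
        {
                int x = *(const int *)a, y = *(const int *)b;

                return (x > y) - (x < y);
        }

        int main(void)
        {
                int v[] = { 3, 1, 2 };

                qsort(v, 3, sizeof v[0], int_cmp);
                printf("%d %d %d\n", v[0], v[1], v[2]);
                return 0;
        }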

Mike Lindner
attunix!mpl

leonard@bucket.UUCP (09/01/87)

Uh, folks? This discussion started with someone having an interest in
programming for NASA.  Thus it was cross-posted to sci.space & sci.space.shuttle.

Unless your followups have something to do with the space program,
_please_ edit the newsgroups line.

	Thank you.


-- 
Leonard Erickson		...!tektronix!reed!percival!bucket!leonard
CIS: [70465,203]
"I used to be a hacker. Now I'm a 'microcomputer specialist'.
You know... I'd rather be a hacker."

barmar@think.COM (Barry Margolin) (09/02/87)

In article <1963@sfsup.UUCP> mpl@sfsup.UUCP (M.P.Lindner) writes:
>In article <1678@rti.UUCP>, jb@rti.UUCP writes:
>]    generic
>]       type ITEM is private;
>]       with function GREATER(A,B : ITEM) return BOOLEAN is <>;
>]    package SORTING is
>] 
>]       type VECTOR is array(NATURAL range <>) of ITEM;
>] 
>]       procedure MERGE_SORT( A, B : VECTOR; RESULTS : out VECTOR );
>] 
>]    end SORTING;

>But if I have to make a GREATER function I could have done it in C (see the
>man page for qsort(3))!  Therefore I maintain my claim that Ada is *not* a
>proper implementation of an object oriented language (like Smalltalk).

Wait a second.  How can you write a generic sort in any language
without requiring that there be an ordering predicate for the element
type?

This is not like qsort(3).  Qsort requires that the ordering predicate
be specified in the call.  The above Ada code merely specifies that
the generic may only be instantiated for types for which GREATER (A,B:
type) is defined.  Smalltalk and Flavors have the same requirement,
but they don't notice the missing method until runtime.

---
Barry Margolin
Thinking Machines Corp.

barmar@think.com
seismo!think!barmar